Information, in its general sense, is "knowledge communicated or received concerning a particular fact or circumstance".
Information cannot be predicted in advance, and receiving it resolves uncertainty.
The uncertainty of an event is measured by its probability of occurrence: the
lower the probability, the greater the uncertainty. The more uncertain an event
is, the more information is required to resolve that uncertainty. The amount of
information is measured in bits.
Example: the information in one "fair" coin flip is log₂(2/1) = 1 bit, and in two fair coin flips it is log₂(4/1) = 2 bits.
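The formula behind these examples is the self-information of an event with probability p, namely log₂(1/p). A minimal Python sketch (the function name `information_bits` is my own choice, not from the text):

```python
import math

def information_bits(probability):
    """Self-information of an event, in bits: log2(1/p)."""
    return math.log2(1 / probability)

# One fair coin flip: p = 1/2, so log2(2/1) = 1 bit
print(information_bits(1 / 2))

# Two fair coin flips: p = 1/4, so log2(4/1) = 2 bits
print(information_bits(1 / 4))
```

Note how a less probable outcome carries more bits, matching the inverse relationship between probability and uncertainty described above.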