shannon (unit)
The shannon (symbol: Sh), more commonly known as the bit, is a unit of information and of entropy defined by IEC 80000-13. One shannon is the information content of an event occurring when its probability is 1⁄2.[1] It is also the entropy of a system with two equally probable states. If a message is made of a sequence of a given number of bits, with all possible bit strings being equally likely, the message's information content expressed in shannons is equal to the number of bits in the sequence.[2] For this and historical reasons, the unit is usually called the bit. The introduction of the term shannon provides an explicit distinction between the amount of information that is expressed and the quantity of data that may be used to represent it. IEEE Std 260.1-2004 still defines the unit for this meaning as the bit, with no mention of the shannon.
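In terms of the binary logarithm (a sketch restating the definitions above in formula form), the information content in shannons of an event with probability p, and the entropy of a message of n bits whose 2^n possible strings are equally likely, are

\[
I(p) = \log_2 \frac{1}{p}\ \text{Sh}, \qquad I\!\left(\tfrac{1}{2}\right) = \log_2 2 = 1\ \text{Sh},
\]
\[
H = \log_2 2^{n} = n\ \text{Sh}.
\]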
The shannon can be converted to other information units according to 1 Sh ≈ 0.693 nat ≈ 0.301 Hart.
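These factors follow from a change of logarithm base: information measured in nats uses the natural logarithm and information measured in hartleys uses the base-10 logarithm, so (as a brief derivation under those standard definitions)

\[
\ln \frac{1}{p} = (\ln 2)\,\log_2 \frac{1}{p} \quad\Rightarrow\quad 1\ \text{Sh} = \ln 2\ \text{nat} \approx 0.693\ \text{nat},
\]
\[
\log_{10} \frac{1}{p} = (\log_{10} 2)\,\log_2 \frac{1}{p} \quad\Rightarrow\quad 1\ \text{Sh} = \log_{10} 2\ \text{Hart} \approx 0.301\ \text{Hart}.
\]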
The shannon is named after Claude Shannon, the founder of information theory.
References
- ^ "IEC 80000-13:2008". International Organization for Standardization. Retrieved 21 July 2013.
- ^ "shannon", A Dictionary of Units of Measurement