# hartley (unit)

The hartley (symbol Hart), earlier called a ban, or a dit (short for decimal digit), is a logarithmic unit which measures information or entropy, based on base 10 logarithms and powers of 10, rather than the powers of 2 and base 2 logarithms which define the bit, or shannon. One hartley is the information content of an event if the probability of that event occurring is 1/10.[1] It is therefore equal to the information contained in one decimal digit (or dit), assuming a priori equiprobability of each possible value.

As a bit corresponds to a binary digit, so a ban is a decimal digit. A deciban is one tenth of a ban; the name is formed from ban by the SI prefix deci-.

One hartley corresponds to log2(10) bit = ln(10) nat, or approximately 3.322 Sh[a] (2.303 nat). A deciban is about 0.332 Sh.
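These conversion factors follow directly from the change of logarithm base; a minimal Python sketch (the function names are illustrative, not from any standard library):

```python
import math

def hartleys_to_shannons(h):
    """Convert hartleys to shannons (bits): 1 Hart = log2(10) Sh."""
    return h * math.log2(10)

def hartleys_to_nats(h):
    """Convert hartleys to nats: 1 Hart = ln(10) nat."""
    return h * math.log(10)

print(hartleys_to_shannons(1))    # ~3.3219 Sh per hartley
print(hartleys_to_nats(1))        # ~2.3026 nat per hartley
print(hartleys_to_shannons(0.1))  # one deciban, ~0.3322 Sh
```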

Though not an SI unit, the hartley is part of the International System of Quantities, defined by International Standard IEC 80000-13 of the International Electrotechnical Commission. It is named after Ralph Hartley. It supersedes[citation needed] the ban, an earlier name for the same unit.

## History

The ban and the deciban were invented by Alan Turing with I. J. Good in 1940, to measure the amount of information that could be deduced by the codebreakers at Bletchley Park using the Banburismus procedure, towards determining each day's unknown setting of the German naval Enigma cipher machine. The name was inspired by the enormous sheets of card, printed in the town of Banbury about 30 miles away, that were used in the process.[2]

Jack Good argued that the sequential summation of decibans to build up a measure of the weight of evidence in favour of a hypothesis is essentially Bayesian inference.[2] Donald A. Gillies, however, argued the ban is, in effect, the same as Karl Popper's measure of the severity of a test.[3]

The term hartley is after Ralph Hartley, who suggested this unit in 1928.[4][5]

The ban pre-dates Shannon's use of bit as a unit of information by at least eight years, and remains in use in the early 21st century.[6] In the International System of Quantities it is replaced by the hartley.

## Usage as a unit of odds

The deciban is a particularly useful unit for log-odds, notably as a measure of information in Bayes factors, odds ratios (ratio of odds, so log is difference of log-odds), or weights of evidence. 10 decibans corresponds to odds of 10:1; 20 decibans to 100:1 odds, etc. According to I. J. Good, a change in a weight of evidence of 1 deciban (i.e., a change in the odds from evens to about 5:4) is about as finely as humans can reasonably be expected to quantify their degree of belief in a hypothesis.[7]
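The deciban-odds correspondence is the same rule as the decibel scale, applied to odds: n decibans corresponds to odds of 10^(n/10) to 1. A minimal Python sketch (function names are illustrative):

```python
import math

def decibans_to_odds(db):
    """Odds ratio corresponding to a weight of evidence in decibans."""
    return 10 ** (db / 10)

def odds_to_decibans(odds):
    """Weight of evidence in decibans for a given odds ratio."""
    return 10 * math.log10(odds)

print(decibans_to_odds(10))  # 10.0, i.e. odds of 10:1
print(decibans_to_odds(20))  # 100.0, i.e. odds of 100:1
print(decibans_to_odds(1))   # ~1.259, close to Good's 5:4 = 1.25
```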

Odds corresponding to integer numbers of decibans can often be well approximated by simple integer ratios; these are collated in the table below, which gives the exact value to two decimal places, a simple approximation (accurate to within about 5%), and, where the simple approximation is inaccurate, a more accurate one (to within 1%):

| decibans | exact value | approx. value | approx. ratio | accurate ratio | probability |
|---|---|---|---|---|---|
| 0 | 10^(0/10) | 1 | 1:1 | | 50% |
| 1 | 10^(1/10) | 1.26 | 5:4 | | 56% |
| 2 | 10^(2/10) | 1.58 | 3:2 | 8:5 | 61% |
| 3 | 10^(3/10) | 2.00 | 2:1 | | 67% |
| 4 | 10^(4/10) | 2.51 | 5:2 | | 71.5% |
| 5 | 10^(5/10) | 3.16 | 3:1 | 19:6, 16:5 | 76% |
| 6 | 10^(6/10) | 3.98 | 4:1 | | 80% |
| 7 | 10^(7/10) | 5.01 | 5:1 | | 83% |
| 8 | 10^(8/10) | 6.31 | 6:1 | 19:3, 25:4 | 86% |
| 9 | 10^(9/10) | 7.94 | 8:1 | | 89% |
| 10 | 10^(10/10) | 10 | 10:1 | | 91% |
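
The exact values and probabilities above can be reproduced in a few lines of Python: the odds are 10^(dB/10) to 1, and the corresponding probability is odds/(1 + odds).

```python
def table_row(db):
    """Exact odds and probability for an integer number of decibans."""
    odds = 10 ** (db / 10)
    prob = odds / (1 + odds)
    return odds, prob

for db in range(11):
    odds, prob = table_row(db)
    print(f"{db:2d} dB  odds = {odds:5.2f}:1  probability = {prob:5.1%}")
```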

## Notes

1. ^ This value, approximately 10/3, but slightly less, can be understood simply because ${\displaystyle 10^{3}=1,000\lesssim 1,024=2^{10}}$: 3 decimal digits are slightly less information than 10 binary digits, so 1 decimal digit is slightly less than 10/3 binary digits.

## References

1. ^ "IEC 80000-13:2008". International Organization for Standardization. Retrieved 21 July 2013.
2. ^ a b Good, I.J. (1979). "Studies in the History of Probability and Statistics. XXXVII A. M. Turing's statistical work in World War II". Biometrika. 66 (2): 393–396. doi:10.1093/biomet/66.2.393. MR 0548210.
3. ^ Gillies, Donald A. (1990). "The Turing-Good Weight of Evidence Function and Popper's Measure of the Severity of a Test". British Journal for the Philosophy of Science. 41 (1): 143–146. doi:10.1093/bjps/41.1.143. JSTOR 688010. MR 055678.
4. ^ Hartley, R.V.L. (July 1928). "Transmission of Information" (PDF). Bell System Technical Journal. VII (3): 535–563. Retrieved 2008-03-27.
5. ^ Reza, Fazlollah M. An Introduction to Information Theory. New York: Dover, 1994. ISBN 0-486-68210-2.
6. ^ "GCHQ boss: Crypto-genius Turing brought tech to British spooks". Retrieved 2013-07-08.
7. ^ Good, I.J. (1985). "Weight of Evidence: A Brief Survey" (PDF). Bayesian Statistics. 2: 253. Retrieved 2012-12-13.