Krichevsky–Trofimov estimator

In information theory, given an unknown stationary source π with alphabet A and a sample w from π, the Krichevsky–Trofimov (KT) estimator produces an estimate p_i(w) of the probability of each symbol i ∈ A. This estimator is optimal in the sense that it asymptotically minimizes the worst-case regret.

For a binary alphabet and a string w with m zeroes and n ones, the KT estimator p_i(w) is defined as:[1]

    p_0(w) = (m + 1/2) / (m + n + 1)
    p_1(w) = (n + 1/2) / (m + n + 1)

This corresponds to the posterior mean of a Beta–Bernoulli distribution with prior Beta(1/2, 1/2), the Jeffreys prior. For a general alphabet the estimate is made analogously using a Dirichlet–Categorical distribution.
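The following is a minimal sketch in Python of how such an estimator could be computed, assuming the add-1/2 form above (each symbol's count plus 1/2, divided by the total count plus half the alphabet size, which reduces to the binary formula when |A| = 2). The function names kt_estimate and kt_sequence_probability are illustrative, not part of any standard library.

    from fractions import Fraction

    def kt_estimate(counts, symbol):
        """KT (add-1/2) probability estimate for `symbol` given per-symbol counts.

        counts: dict mapping each symbol of the alphabet A to its count in w.
        Returns (count(symbol) + 1/2) / (total + |A|/2).
        """
        total = sum(counts.values())
        k = len(counts)  # alphabet size |A|
        return (counts[symbol] + Fraction(1, 2)) / (total + Fraction(k, 2))

    def kt_sequence_probability(w, alphabet=('0', '1')):
        """Probability the KT estimator assigns to the whole string w,
        accumulated one symbol at a time (the sequential coding distribution)."""
        counts = {a: 0 for a in alphabet}
        prob = Fraction(1)
        for symbol in w:
            prob *= kt_estimate(counts, symbol)
            counts[symbol] += 1
        return prob

    # Example: for w = "0011" (m = 2 zeroes, n = 2 ones) the next-symbol
    # estimates are p_0(w) = p_1(w) = (2 + 1/2) / (2 + 2 + 1) = 1/2.
    counts = {'0': 2, '1': 2}
    print(kt_estimate(counts, '0'))          # 1/2
    print(kt_sequence_probability("0011"))   # 3/128

Exact rational arithmetic (Fraction) is used here only to make the small-example outputs easy to check; a practical implementation would typically work with floating point or with log-probabilities.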

References

  1. Krichevsky, R. E.; Trofimov, V. K. (1981). "The Performance of Universal Encoding". IEEE Trans. Inf. Theory. IT-27 (2): 199–207. doi:10.1109/TIT.1981.1056331.