In information theory, given an unknown stationary source with alphabet ''A'' and a sample ''w'' drawn from that source, the Krichevsky–Trofimov (KT) estimator produces an estimate ''p''<sub>''i''</sub>(''w'') of the probability of each symbol ''i'' ∈ ''A''. This estimator is optimal in the sense that it asymptotically minimizes the worst-case regret.

For a binary alphabet and a string ''w'' with ''m'' zeroes and ''n'' ones, the KT estimator ''p''<sub>''i''</sub>(''w'') is defined as:

: \begin{align}
  p_0(w) &= \frac{m + \tfrac{1}{2}}{m + n + 1}, \\[4pt]
  p_1(w) &= \frac{n + \tfrac{1}{2}}{m + n + 1}.
  \end{align}

This corresponds to the posterior mean under a Beta–Bernoulli model with the Beta(1/2, 1/2) (Jeffreys) prior. For a general alphabet, the estimate is made analogously using a Dirichlet–categorical model.


See also

* Rule of succession
* Bayesian inference using conjugate priors for the categorical distribution

