Chentsov’s Theorem
In information geometry, Chentsov's theorem states that the Fisher information metric is, up to rescaling, the unique Riemannian metric on a statistical manifold that is invariant under sufficient statistics.
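The invariance can be made concrete numerically: for ''n'' i.i.d. draws from a normal location model N(θ, σ²) with σ known, the sample mean is sufficient, and the Fisher information about θ carried by the full sample equals that carried by the mean alone (both n/σ²). The Python sketch below is an illustration of this, not code from the references:

 # Sketch: no information is lost when a sample is reduced to a sufficient
 # statistic. Model: X_1, ..., X_n i.i.d. N(theta, sigma^2), sigma known.
 # The sample mean is sufficient, and X-bar ~ N(theta, sigma^2 / n);
 # both descriptions carry Fisher information n / sigma^2.
 import numpy as np

 rng = np.random.default_rng(0)
 theta, sigma, n = 1.5, 2.0, 10

 x = rng.normal(theta, sigma, size=(200_000, n))

 # Score (d/dtheta of the log-likelihood) of the full sample ...
 score_full = (x - theta).sum(axis=1) / sigma**2
 # ... and of the sufficient statistic alone.
 score_mean = (x.mean(axis=1) - theta) / (sigma**2 / n)

 print(score_full.var())  # ~ n / sigma^2 = 2.5
 print(score_mean.var())  # ~ n / sigma^2 = 2.5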
See also
* Fisher information
* Sufficient statistic
* Information geometry

References
* N. N. Čencov (1981), ''Statistical Decision Rules and Optimal Inference'', Translations of Mathematical Monographs, v. 53, American Mathematical Society, http://www.ams.org/books/mmono/053/
* Shun'ichi Amari, Hiroshi Nagaoka (2000), ''Methods of Information Geometry'', Translations of Mathematical Monographs, v. 191, American Mathematical Society, http://www.ams.org/books/mmono/191/ (Theorem 2.6)



Information Geometry
Information geometry is an interdisciplinary field that applies the techniques of differential geometry to study probability theory and statistics. It studies statistical manifolds, which are Riemannian manifolds whose points correspond to probability distributions. Historically, information geometry can be traced back to the work of C. R. Rao, who was the first to treat the Fisher matrix as a Riemannian metric. The modern theory is largely due to Shun'ichi Amari, whose work has been greatly influential on the development of the field. Classically, information geometry considered a parametrized statistical model as a Riemannian manifold. For such models, there is a natural choice of Riemannian metric, known as the Fisher information metric. In the special case that the statistical model is an exponential family, it is possible to endow the statistical manifold with a Hessian metric (i.e., a Riemannian metric given by the potential of a convex function).
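The Hessian-metric remark can be illustrated with the simplest exponential family. Writing the Bernoulli model with natural parameter θ and log-partition function ψ(θ) = log(1 + e^θ), the Fisher metric is exactly ψ''(θ); the Python sketch below (a toy check under these assumptions, with hypothetical helper names) verifies this against the variance of the sufficient statistic.

 # Sketch: for an exponential family p(x; theta) = h(x) exp(theta*x - psi(theta)),
 # the Fisher metric is the Hessian of the convex potential psi.
 # Bernoulli case: x in {0, 1}, psi(theta) = log(1 + exp(theta)).
 import numpy as np

 def psi(theta):
     return np.log1p(np.exp(theta))

 def fisher_via_hessian(theta, h=1e-4):
     # second-order central difference of the potential
     return (psi(theta + h) - 2 * psi(theta) + psi(theta - h)) / h**2

 def fisher_via_variance(theta):
     # Fisher information = Var(X) = p(1 - p), with p = sigmoid(theta)
     p = 1.0 / (1.0 + np.exp(-theta))
     return p * (1.0 - p)

 print(fisher_via_hessian(0.7))   # ~ 0.2217
 print(fisher_via_variance(0.7))  # 0.2217...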


Fisher Information Metric
In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, ''i.e.'', a smooth manifold whose points are probability measures defined on a common probability space. It can be used to calculate the informational difference between measurements. The metric is interesting in several respects. By Chentsov's theorem, the Fisher information metric on statistical models is the only Riemannian metric (up to rescaling) that is invariant under sufficient statistics. It can also be understood as the infinitesimal form of the relative entropy (''i.e.'', the Kullback–Leibler divergence); specifically, it is the Hessian of the divergence. Alternatively, it can be understood as the metric induced by the flat-space Euclidean metric, after appropriate changes of variable. When extended to complex projective Hilbert space, it becomes the Fubini–Study metric; when written in terms of mixed states, it is the quantum Bures metric.
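The statement that the metric is the Hessian of the Kullback–Leibler divergence can be checked directly in one dimension. For a normal location family with known σ, KL(p_{θ0} || p_θ) = (θ − θ0)²/(2σ²), and its second derivative at θ = θ0 is 1/σ², the Fisher information. A minimal sketch, assuming this model:

 # Sketch: Fisher information as the Hessian of the KL divergence.
 # For N(theta0, sigma^2) vs N(theta, sigma^2):
 #   KL(theta0 || theta) = (theta - theta0)^2 / (2 * sigma^2),
 # so its second derivative at theta = theta0 is 1 / sigma^2.
 def kl_normal(theta0, theta, sigma):
     return (theta - theta0) ** 2 / (2 * sigma**2)

 theta0, sigma, h = 0.3, 1.7, 1e-4
 hessian = (kl_normal(theta0, theta0 + h, sigma)
            - 2 * kl_normal(theta0, theta0, sigma)
            + kl_normal(theta0, theta0 - h, sigma)) / h**2

 print(hessian)       # ~ 0.3460
 print(1 / sigma**2)  # 0.3460... (the Fisher information)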


Riemannian Metric
In differential geometry, a Riemannian manifold or Riemannian space, so called after the German mathematician Bernhard Riemann, is a real, smooth manifold ''M'' equipped with a positive-definite inner product ''g''''p'' on the tangent space ''T''''p''''M'' at each point ''p''. The family ''g''''p'' of inner products is called a Riemannian metric (or Riemannian metric tensor). Riemannian geometry is the study of Riemannian manifolds. A common convention is to take ''g'' to be smooth, which means that for any smooth coordinate chart (''U'', ''x'') on ''M'', the ''n''² functions

:g\left(\frac{\partial}{\partial x^i}, \frac{\partial}{\partial x^j}\right) : U \to \mathbb{R}

are smooth functions. These functions are commonly designated as g_{ij}. With further restrictions on the g_{ij}, one could also consider Lipschitz Riemannian metrics or measurable Riemannian metrics, among many other possibilities. A Riemannian metric (tensor) makes it possible to define several geometric notions on a Riemannian manifold, such as angle at an intersection, length of a curve, area of a surface and higher-dimensional analogues (volume, etc.), extrinsic curvature of submanifolds, and intrinsic curvature of the manifold itself.
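As a concrete instance of the component functions g_{ij}, the sketch below (illustrative, with hypothetical names) computes the metric that the unit sphere in R³ inherits from the Euclidean metric: in the usual spherical chart, g = JᵀJ with J the Jacobian of the embedding, and one recovers diag(1, sin²θ).

 # Sketch: the components g_ij of a Riemannian metric in a chart.
 # The unit sphere inherits g = J^T J from the Euclidean metric of R^3,
 # where J is the Jacobian of the embedding; analytically
 # g = diag(1, sin(theta)^2) in spherical coordinates (theta, phi).
 import numpy as np

 def embed(q):
     theta, phi = q
     return np.array([np.sin(theta) * np.cos(phi),
                      np.sin(theta) * np.sin(phi),
                      np.cos(theta)])

 def metric_components(q, h=1e-6):
     # numerical Jacobian: one column per coordinate direction
     J = np.column_stack([(embed(q + h * e) - embed(q - h * e)) / (2 * h)
                          for e in np.eye(2)])
     return J.T @ J  # g_ij = inner product of coordinate tangent vectors

 q = np.array([0.8, 1.2])
 print(metric_components(q))  # ~ [[1, 0], [0, 0.5146]]
 print(np.sin(0.8) ** 2)      # 0.5146...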


Statistical Manifold
In mathematics, a statistical manifold is a Riemannian manifold each of whose points is a probability distribution. Statistical manifolds provide a setting for the field of information geometry. The Fisher information metric provides a metric on these manifolds. Following this definition, the log-likelihood function is a differentiable map and the score is an inclusion. For example, the family of all normal distributions can be thought of as a 2-dimensional parametric space parametrized by the expected value ''μ'' and the variance ''σ''² > 0. Equipped with the Riemannian metric given by the Fisher information matrix, it is a statistical manifold with a geometry modeled on hyperbolic space, as illustrated below. One way of picturing the manifold is to infer the parametric equations via the Fisher information rather than starting from the likelihood function. A simple example of a statistical manifold, taken from physics, is the canonical ensemble: it is a one-dimensional manifold, with the temperature serving as the coordinate on the manifold.
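The hyperbolic geometry of the normal family can be seen in coordinates: in the (μ, σ) chart the Fisher information matrix is diag(1/σ², 2/σ²), which up to the constant factor 2 in the σ-direction is the Poincaré upper half-plane metric. The following Monte-Carlo sketch (my illustration, under these assumptions) estimates the matrix as the covariance of the score:

 # Sketch: Fisher information matrix of N(mu, sigma^2) in the (mu, sigma)
 # chart, estimated as the covariance of the score; the exact value is
 # diag(1 / sigma^2, 2 / sigma^2), a scaled hyperbolic half-plane metric.
 import numpy as np

 rng = np.random.default_rng(1)
 mu, sigma = 0.5, 1.3
 x = rng.normal(mu, sigma, size=1_000_000)

 # gradient of log N(x; mu, sigma) with respect to (mu, sigma)
 score_mu = (x - mu) / sigma**2
 score_sigma = (x - mu) ** 2 / sigma**3 - 1 / sigma

 print(np.cov(np.stack([score_mu, score_sigma])))  # ~ diag(0.5917, 1.1834)
 print(1 / sigma**2, 2 / sigma**2)                 # 0.5917..., 1.1834...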




Sufficient Statistic
In statistics, a statistic is ''sufficient'' with respect to a statistical model and its associated unknown parameter if "no other statistic that can be calculated from the same sample provides any additional information as to the value of the parameter". In particular, a statistic is sufficient for a family of probability distributions if the sample from which it is calculated gives no more information than the statistic as to which of those probability distributions is the sampling distribution. A related concept is that of linear sufficiency, which is weaker than ''sufficiency'' but can be applied in some cases where there is no sufficient statistic, although it is restricted to linear estimators. The Kolmogorov structure function deals with individual finite data; the related notion there is the algorithmic sufficient statistic. The concept is due to Sir Ronald Fisher in 1920. Stephen Stigler noted in 1973 that the concept of sufficiency had fallen out of favor in descriptive statistics because of its strong dependence on assumptions about the distributional form, but remained very important in theoretical work.
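Fisher's factorization criterion makes the quoted definition concrete: a statistic T is sufficient exactly when the likelihood depends on the sample only through T. A minimal sketch for i.i.d. Bernoulli draws, where T is the number of successes (hypothetical helper names):

 # Sketch: the number of successes T(x) = sum(x) is sufficient for the
 # Bernoulli parameter p, since L(p; x) = p^T * (1 - p)^(n - T) depends on
 # the sample only through T (Fisher's factorization criterion).
 import numpy as np

 def likelihood(p, x):
     t, n = sum(x), len(x)
     return p**t * (1 - p) ** (n - t)

 x1 = [1, 0, 1, 1, 0]  # two different samples ...
 x2 = [0, 1, 1, 0, 1]  # ... with the same value T = 3

 ps = np.linspace(0.05, 0.95, 7)
 # Identical likelihood functions: the two samples are indistinguishable
 # as evidence about p.
 print(np.allclose(likelihood(ps, x1), likelihood(ps, x2)))  # True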


Fisher Information
In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable ''X'' carries about an unknown parameter ''θ'' of a distribution that models ''X''. Formally, it is the variance of the score, or the expected value of the observed information. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families). The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth). The Fisher information is also used in the calculation of the Jeffreys prior, which is used in Bayesian statistics. The Fisher information matrix is used to calculate the covariance matrices associated with maximum-likelihood estimates.
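The two characterizations in the opening sentence (variance of the score and expected observed information) agree, and both are easy to check by simulation. A sketch for the Poisson(λ) model, where the Fisher information is 1/λ (an illustration of mine, not a library API):

 # Sketch: Fisher information computed two ways for X ~ Poisson(lam).
 # log p(x; lam) = x * log(lam) - lam - log(x!), so
 #   score:          x / lam - 1     -> Var(score) = 1 / lam
 #   observed info:  x / lam**2      -> E[observed info] = 1 / lam
 import numpy as np

 rng = np.random.default_rng(2)
 lam = 4.0
 x = rng.poisson(lam, size=1_000_000)

 print((x / lam - 1).var())   # ~ 0.25  (variance of the score)
 print((x / lam**2).mean())   # ~ 0.25  (expected observed information)
 print(1 / lam)               # 0.25    (exact Fisher information)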







Differential Geometry
Differential geometry is a mathematical discipline that studies the geometry of smooth shapes and smooth spaces, otherwise known as smooth manifolds. It uses the techniques of differential calculus, integral calculus, linear algebra and multilinear algebra. The field has its origins in the study of spherical geometry as far back as antiquity. It also relates to astronomy, the geodesy of the Earth, and later the study of hyperbolic geometry by Lobachevsky. The simplest examples of smooth spaces are the plane and space curves and surfaces in the three-dimensional Euclidean space, and the study of these shapes formed the basis for the development of modern differential geometry during the 18th and 19th centuries. Since the late 19th century, differential geometry has grown into a field concerned more generally with geometric structures on differentiable manifolds. A geometric structure is one which defines some notion of size, distance, shape, volume, or other rigidifying structure.