In probability theory, an empirical measure is a random measure arising from a particular realization of a (usually finite) sequence of random variables. The precise definition is found below. Empirical measures are relevant to mathematical statistics.

The motivation for studying empirical measures is that it is often impossible to know the true underlying probability measure P. We collect observations X_1, X_2, \dots , X_n and compute relative frequencies. We can estimate P, or a related distribution function F, by means of the empirical measure or the empirical distribution function, respectively. These are uniformly good estimates under certain conditions. Theorems in the area of empirical processes provide rates of this convergence.
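The relative-frequency idea can be sketched in a few lines of Python. This is an illustration, not part of the article: the Uniform(0, 1) distribution stands in for the unknown P, and the event A = [0.2, 0.5) is an arbitrary choice.

```python
import random

random.seed(0)

# Draw n observations; here the "unknown" P is Uniform(0, 1),
# an arbitrary choice for this sketch.
n = 10_000
samples = [random.random() for _ in range(n)]

# Relative frequency of the event A = [0.2, 0.5):
# the fraction of observations falling in A estimates P(A).
p_hat = sum(0.2 <= x < 0.5 for x in samples) / n

# Under Uniform(0, 1) the true value is P(A) = 0.3,
# so p_hat should be close to 0.3 for large n.
print(p_hat)
```

For large n the printed estimate should be close to 0.3, in line with the convergence results discussed below.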


Definition

Let X_1, X_2, \dots be a sequence of independent identically distributed random variables with values in the state space S with probability distribution P.

Definition
:The ''empirical measure'' P_n is defined for measurable subsets of S and given by
::P_n(A) = \frac{1}{n}\sum_{i=1}^n I_A(X_i)=\frac{1}{n}\sum_{i=1}^n \delta_{X_i}(A)
::where I_A is the indicator function and \delta_X is the Dirac measure.

Properties
*For a fixed measurable set A, nP_n(A) is a binomial random variable with mean nP(A) and variance nP(A)(1 − P(A)).
**In particular, P_n(A) is an unbiased estimator of P(A).
*For a fixed partition A_i of S, the random variables Y_i = nP_n(A_i) form a multinomial distribution with ''event probabilities'' P(A_i).
**The covariance matrix of this multinomial distribution is Cov(Y_i, Y_j) = nP(A_i)(\delta_{ij} − P(A_j)).

Definition
:\bigl(P_n(c)\bigr)_{c\in\mathcal{C}} is the ''empirical measure'' indexed by \mathcal{C}, a collection of measurable subsets of S.

To generalize this notion further, observe that the empirical measure P_n maps
measurable functions f:S\to \mathbb{R} to their ''empirical mean'',
:f\mapsto P_n f=\int_S f \, dP_n=\frac{1}{n}\sum_{i=1}^n f(X_i)

In particular, the empirical measure of A is simply the empirical mean of the indicator function: P_n(A) = P_n I_A.

For a fixed measurable function f, P_n f is a random variable with mean \mathbb{E}f and variance \frac{1}{n}\mathbb{E}(f - \mathbb{E}f)^2.

By the strong
law of large numbers, P_n(A) converges to P(A) almost surely for fixed A. Similarly P_n f converges to \mathbb{E}f almost surely for a fixed measurable function f. The problem of uniform convergence of P_n to P was open until
Vapnik and Chervonenkis solved it in 1968. If the class \mathcal{C} (or \mathcal{F}) is Glivenko–Cantelli with respect to P, then P_n converges to P uniformly over c\in\mathcal{C} (or f\in \mathcal{F}). In other words, with probability 1 we have
:\|P_n-P\|_\mathcal{C}=\sup_{c\in\mathcal{C}}|P_n(c)-P(c)|\to 0,
:\|P_n-P\|_\mathcal{F}=\sup_{f\in\mathcal{F}}|P_n f-\mathbb{E}f|\to 0.
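The two maps above — sets to their empirical measure, functions to their empirical mean — can be sketched directly in Python. This is an illustrative sketch, not from the article; Uniform(0, 1) again stands in for P, and the set A and function f are arbitrary choices.

```python
import random

random.seed(1)

# i.i.d. sample from Uniform(0, 1) -- an illustrative stand-in for P.
n = 5_000
xs = [random.random() for _ in range(n)]

def empirical_measure(indicator):
    """P_n(A) = (1/n) * sum_i I_A(X_i), with the set A given as a predicate."""
    return sum(indicator(x) for x in xs) / n

def empirical_mean(f):
    """P_n f = (1/n) * sum_i f(X_i) for a measurable function f."""
    return sum(f(x) for x in xs) / n

# The empirical measure of A is the empirical mean of its indicator:
# P_n(A) = P_n I_A, so the two computations agree exactly.
A = lambda x: x <= 0.5
print(empirical_measure(A) == empirical_mean(lambda x: 1 if A(x) else 0))

# By the strong law of large numbers, P_n f -> E f almost surely;
# for f(x) = x under Uniform(0, 1), E f = 1/2.
print(empirical_mean(lambda x: x))
```

The first line printed is True by construction; the second should approach 1/2 as n grows, consistent with the almost-sure convergence stated above.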


Empirical distribution function

The ''empirical distribution function'' provides an example of empirical measures. For real-valued iid random variables X_1,\dots,X_n it is given by
:F_n(x)=P_n((-\infty,x])=P_n I_{(-\infty,x]}.

In this case, empirical measures are indexed by the class \mathcal{C}=\{(-\infty,x] : x\in\mathbb{R}\}. It has been shown that \mathcal{C} is a uniform Glivenko–Cantelli class; in particular,
:\sup_F \|F_n-F\|_\infty\to 0
with probability 1.
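A small sketch (not from the article) of the empirical distribution function and the sup-norm distance it converges in. The Uniform(0, 1) sample is an arbitrary choice, convenient because its true CDF is F(x) = x on [0, 1].

```python
import random

random.seed(2)

# i.i.d. Uniform(0, 1) sample; the true CDF is F(x) = x on [0, 1].
n = 2_000
xs = sorted(random.random() for _ in range(n))

def F_n(x):
    """Empirical distribution function F_n(x) = P_n((-inf, x])."""
    return sum(1 for s in xs if s <= x) / n

# sup_x |F_n(x) - F(x)|: for a sorted sample, F_n is a step function
# jumping at each order statistic, so the supremum is attained by
# checking F_n just before and at each jump point.
sup_dist = max(
    max(abs((i + 1) / n - x), abs(i / n - x))
    for i, x in enumerate(xs)
)

# The Glivenko-Cantelli theorem says sup_dist -> 0 almost surely.
print(sup_dist)
```

For n = 2000 the printed sup distance is typically of order 1/sqrt(n), i.e. a few hundredths, and it shrinks as n grows.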


See also

* Empirical risk minimization
* Poisson random measure


Further reading

* Wolfowitz, J. (1954). "Generalization of the theorem of Glivenko–Cantelli". ''Annals of Mathematical Statistics''. 25 (1): 131–138. doi:10.1214/aoms/1177728852.