Le Cam's Theorem
In probability theory, Le Cam's theorem, named after Lucien Le Cam (1924 – 2000), states the following. Suppose:
* X_1, X_2, X_3, \ldots are independent random variables, each with a Bernoulli distribution (i.e., equal to either 0 or 1), not necessarily identically distributed.
* \Pr(X_i = 1) = p_i \text{ for } i = 1, 2, 3, \ldots.
* \lambda_n = p_1 + \cdots + p_n.
* S_n = X_1 + \cdots + X_n (i.e., S_n follows a Poisson binomial distribution).
Then
:\sum_{k=0}^\infty \left| \Pr(S_n=k) - \frac{\lambda_n^k e^{-\lambda_n}}{k!} \right| < 2 \left( \sum_{i=1}^n p_i^2 \right).
In other words, the sum has approximately a Poisson distribution, and the above inequality bounds the approximation error in terms of the total variation distance.
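As a numerical illustration, here is a minimal sketch in Python (the probabilities p_i are arbitrary example values, not from the statement above): it computes the exact distribution of S_n by convolving the Bernoulli factors, compares it with the Poisson probabilities, and checks the bound.

 # Sketch: check Le Cam's bound for a small, arbitrary set of probabilities.
 from math import exp, factorial
 
 p = [0.1, 0.05, 0.2, 0.02, 0.15]          # Pr(X_i = 1); arbitrary values
 lam = sum(p)                              # lambda_n = p_1 + ... + p_n
 
 # Exact PMF of S_n (Poisson binomial) via iterative convolution.
 pmf = [1.0]                               # distribution of the empty sum
 for pi in p:
     nxt = [0.0] * (len(pmf) + 1)
     for k, mass in enumerate(pmf):
         nxt[k] += mass * (1 - pi)         # contribution when X_i = 0
         nxt[k + 1] += mass * pi           # contribution when X_i = 1
     pmf = nxt
 
 def poisson(k):
     return lam ** k * exp(-lam) / factorial(k)
 
 # Sum over k of |Pr(S_n = k) - lambda^k e^(-lambda) / k!|, including the
 # Poisson tail beyond k = n, where Pr(S_n = k) = 0.
 lhs = sum(abs(pmf[k] - poisson(k)) for k in range(len(pmf)))
 lhs += 1 - sum(poisson(k) for k in range(len(pmf)))
 rhs = 2 * sum(pi ** 2 for pi in p)
 print(f"{lhs:.6f} < {rhs:.6f}: {lhs < rhs}")   # the inequality holds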


Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability ...


Lucien Le Cam
Lucien Marie Le Cam (November 18, 1924 – April 25, 2000) was a mathematician and statistician. Biography Le Cam was born November 18, 1924 in Croze, France. His parents were farmers who were unable to afford higher education for him; his father died when he was 13. After graduating from a Catholic school in 1942, he began studying at a seminary in Limoges, but immediately quit upon learning that he would not be allowed to study chemistry there. Instead he continued his studies at a lycée, which did not teach chemistry but did teach mathematics. In May 1944 he joined an underground group, and then went into hiding, returning to his school the following November but soon afterwards moving to Paris, where he began studying at the University of Paris. He graduated in 1945 with the degree ''Licence ès Sciences''. Le Cam then worked for a hydroelectric utility for five years, while attending a weekly statistics seminar at the University of Paris. In 1950, he was invited to become ...



Statistical Independence
Independence is a fundamental notion in probability theory, as in statistics and the theory of stochastic processes. Two events are independent, statistically independent, or stochastically independent if, informally speaking, the occurrence of one does not affect the probability of occurrence of the other or, equivalently, does not affect the odds. Similarly, two random variables are independent if the realization of one does not affect the probability distribution of the other. When dealing with collections of more than two events, two notions of independence need to be distinguished. The events are called pairwise independent if any two events in the collection are independent of each other, while mutual independence (or collective independence) of events means, informally speaking, that each event is independent of any combination of other events in the collection. A similar notion exists for collections of random variables. Mutual independence implies pairwise independence ...
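As an illustration of the distinction, here is a minimal sketch in Python of the standard two-coin example (not taken from the excerpt above): three events that are pairwise independent but not mutually independent.

 # Sketch: two fair coin flips; A, B, C are pairwise but not mutually independent.
 from itertools import product
 from fractions import Fraction
 
 omega = list(product("HT", repeat=2))     # sample space of two fair flips
 prob = Fraction(1, len(omega))            # uniform measure: each outcome has prob 1/4
 
 A = {w for w in omega if w[0] == "H"}     # first flip is heads
 B = {w for w in omega if w[1] == "H"}     # second flip is heads
 C = {w for w in omega if w[0] == w[1]}    # the two flips agree
 
 def P(E):
     return prob * len(E)
 
 # Pairwise independence: P(E and F) == P(E) * P(F) for every pair.
 for E, F in [(A, B), (A, C), (B, C)]:
     assert P(E & F) == P(E) * P(F)
 
 # Mutual independence fails: P(A and B and C) != P(A) * P(B) * P(C).
 print(P(A & B & C), "vs", P(A) * P(B) * P(C))   # 1/4 vs 1/8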



Random Variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes (e.g., the possible upper sides of a flipped coin such as heads H and tails T) in a sample space (e.g., the set \{H, T\}) to a measurable space, often the real numbers (e.g., the set \{-1, 1\} in which 1 corresponds to H and -1 corresponds to T). Informally, randomness typically represents some fundamental element of chance, such as in the roll of a die; it may also represent uncertainty, such as measurement error. However, the interpretation of probability is philosophically complicated, and even in specific cases is not always straightforward. The purely mathematical analysis of random variables is independent of such interpretational difficulties, and can be based upon a rigorous axiomatic setup. In the formal mathematical language of measure theory, a random var ...
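A minimal sketch in Python of the coin-flip random variable described above, i.e. a function from the sample space \{H, T\} to the real values \{-1, 1\}:

 # Sketch: a random variable as a mapping from outcomes to real numbers.
 import random
 
 def X(outcome):
     # Map heads to 1 and tails to -1.
     return 1 if outcome == "H" else -1
 
 sample_space = ["H", "T"]
 outcome = random.choice(sample_space)     # a random outcome in the sample space
 print(outcome, "->", X(outcome))          # the realized value of X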



Bernoulli Distribution
In probability theory and statistics, the Bernoulli distribution, named after Swiss mathematician Jacob Bernoulli (James Victor Uspensky: ''Introduction to Mathematical Probability'', McGraw-Hill, New York 1937, page 45), is the discrete probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability q = 1-p. Less formally, it can be thought of as a model for the set of possible outcomes of any single experiment that asks a yes–no question. Such questions lead to outcomes that are boolean-valued: a single bit whose value is success/yes/true/one with probability ''p'' and failure/no/false/zero with probability ''q''. It can be used to represent a (possibly biased) coin toss where 1 and 0 would represent "heads" and "tails", respectively, and ''p'' would be the probability of the coin landing on heads (or vice versa where 1 would represent tails and ''p'' would be the probability of tails). In particular, unfair coins ...
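A short sketch in Python of the Bernoulli mass function and of sampling from it (p = 0.3 is an arbitrary choice):

 # Sketch: Bernoulli(p) mass function and sampling.
 import random
 
 def bernoulli_pmf(k, p):
     # Pr(X = k): p for k = 1 (success), q = 1 - p for k = 0 (failure).
     return p if k == 1 else 1 - p
 
 def bernoulli_sample(p):
     # Draw 1 with probability p, else 0.
     return 1 if random.random() < p else 0
 
 p = 0.3
 print(bernoulli_pmf(1, p), bernoulli_pmf(0, p))      # 0.3 0.7
 draws = [bernoulli_sample(p) for _ in range(10_000)]
 print(sum(draws) / len(draws))                       # empirical mean near p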


Poisson Binomial Distribution
In probability theory and statistics, the Poisson binomial distribution is the discrete probability distribution of a sum of independent Bernoulli trials that are not necessarily identically distributed. The concept is named after Siméon Denis Poisson. In other words, it is the probability distribution of the number of successes in a collection of ''n'' independent yes/no experiments with success probabilities p_1, p_2, \dots , p_n. The ordinary binomial distribution is a special case of the Poisson binomial distribution, when all success probabilities are the same, that is p_1 = p_2 = \cdots = p_n. Definitions Probability Mass Function The probability of having ''k'' successful trials out of a total of ''n'' can be written as the sum :\Pr(K=k) = \sum\limits_{A \in F_k} \prod\limits_{i \in A} p_i \prod\limits_{j \in A^c} (1-p_j) where F_k is the set of all subsets of ''k'' integers that can be selected from \{1, 2, 3, \ldots, n\}. For example, if ''n'' = 3, then F_2 = \left\{\{1,2\}, \{1,3\}, \{2,3\}\right\}. A^c is the complement of A, i.e. A^c ...
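A sketch in Python of the mass function above, summing over every k-element subset A of \{1, \ldots, n\} (the probabilities are arbitrary example values):

 # Sketch: Poisson binomial PMF by direct enumeration of the subsets in F_k.
 from itertools import combinations
 from math import prod
 
 def poisson_binomial_pmf(k, p):
     n = len(p)
     total = 0.0
     for A in combinations(range(n), k):             # each subset A in F_k
         Ac = set(range(n)) - set(A)                 # the complement A^c
         total += prod(p[i] for i in A) * prod(1 - p[j] for j in Ac)
     return total
 
 p = [0.2, 0.5, 0.7]                                 # n = 3 success probabilities
 print([round(poisson_binomial_pmf(k, p), 4) for k in range(4)])
 print(sum(poisson_binomial_pmf(k, p) for k in range(4)))   # masses sum to 1.0

Enumeration takes exponential time in n; for larger n the same PMF is usually computed by the convolution used for S_n in the sketch near the top of this page.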



Poisson Distribution
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event. It is named after French mathematician Siméon Denis Poisson. The Poisson distribution can also be used for the number of events in other specified interval types such as distance, area, or volume. For instance, a call center receives an average of 180 calls per hour, 24 hours a day. The calls are independent; receiving one does not change the probability of when the next one will arrive. The number of calls received during any minute has a Poisson probability distribution with mean 3: the most likely numbers are 2 and 3 but 1 and 4 are also likely and there is a small probability of it being as low as zero and a very smal ...
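A sketch in Python of the call-center example above: at 180 calls per hour the mean per minute is 3, and the Poisson mass function gives the stated probabilities.

 # Sketch: calls per minute ~ Poisson(3) for the call-center example.
 from math import exp, factorial
 
 def poisson_pmf(k, mean):
     return exp(-mean) * mean ** k / factorial(k)
 
 mean = 180 / 60                            # 3 calls per minute on average
 for k in range(7):
     print(f"P({k} calls in a minute) = {poisson_pmf(k, mean):.4f}")
 # 2 and 3 are the most likely counts (about 0.2240 each); zero calls
 # has probability about 0.0498.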


Total Variation Distance Of Probability Measures
In probability theory, the total variation distance is a distance measure for probability distributions. It is an example of a statistical distance metric, and is sometimes called the statistical distance, statistical difference or variational distance. Definition Consider a measurable space (\Omega, \mathcal{F}) and probability measures P and Q defined on (\Omega, \mathcal{F}). The total variation distance between P and Q is defined as: :\delta(P,Q)=\sup_{A \in \mathcal{F}}\left|P(A)-Q(A)\right|. Informally, this is the largest possible difference between the probabilities that the two probability distributions can assign to the same event. Properties Relation to other distances The total variation distance is related to the Kullback–Leibler divergence by Pinsker’s inequality: :\delta(P,Q) \le \sqrt{\frac{1}{2} D_{\mathrm{KL}}(P \parallel Q)}. One also has the following inequality, due to Bretagnolle and Huber (see, also, Tsybakov), which has the advantage of providing a non-vacuous bound even when D_{\mathrm{KL}}(P \parallel Q)>2: :\delta(P,Q) \le \sqrt{1 - e^{-D_{\mathrm{KL}}(P \parallel Q)}} ...
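A sketch in Python for a finite space, where the supremum over events reduces to half the L1 distance between the mass functions; the two distributions are arbitrary examples, and Pinsker's inequality is checked numerically.

 # Sketch: total variation distance on a finite space, plus Pinsker's bound.
 from math import log, sqrt
 
 P = {"a": 0.5, "b": 0.3, "c": 0.2}        # arbitrary example distributions
 Q = {"a": 0.4, "b": 0.4, "c": 0.2}
 
 # delta(P, Q) = sup_A |P(A) - Q(A)| = (1/2) sum_x |P(x) - Q(x)| on finite spaces.
 tv = 0.5 * sum(abs(P[x] - Q[x]) for x in P)
 
 # Kullback-Leibler divergence D_KL(P || Q), used in Pinsker's inequality.
 kl = sum(P[x] * log(P[x] / Q[x]) for x in P)
 
 print(f"delta(P, Q)              = {tv:.4f}")
 print(f"Pinsker bound sqrt(KL/2) = {sqrt(kl / 2):.4f}")   # tv <= this value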




Poisson Limit Theorem
In probability theory, the law of rare events or Poisson limit theorem states that the Poisson distribution may be used as an approximation to the binomial distribution, under certain conditions. The theorem was named after Siméon Denis Poisson (1781–1840). A generalization of this theorem is Le Cam's theorem. Theorem Let p_n be a sequence of real numbers in [0,1] such that the sequence n p_n converges to a finite limit \lambda. Then: :\lim_{n\to\infty} \binom{n}{k} p_n^k (1-p_n)^{n-k} = e^{-\lambda}\frac{\lambda^k}{k!} Proofs : \begin{align} \lim_{n\to\infty} \binom{n}{k} p_n^k (1-p_n)^{n-k} &\simeq \lim_{n\to\infty} \frac{n(n-1)\cdots(n-k+1)}{k!} \left(\frac{\lambda}{n}\right)^k \left(1- \frac{\lambda}{n}\right)^{n-k} \\ &= \lim_{n\to\infty} \frac{n^k}{k!} \frac{\lambda^k}{n^k} \left(1- \frac{\lambda}{n}\right)^{n-k} \\ &= \lim_{n\to\infty} \frac{\lambda^k}{k!} \left(1-\frac{\lambda}{n}\right)^{n-k} \end{align}. Since : \lim_{n\to\infty} \left(1-\frac{\lambda}{n}\right)^{n} = e^{-\lambda} and : \lim_{n\to\infty} \left(1- \frac{\lambda}{n}\right)^{-k}=1, this leaves :\binom{n}{k} p^k (1-p)^{n-k} \simeq \frac{\lambda^k e^{-\lambda}}{k!}. Alternative proof Using Stirling's approximation, we can write: : \begin{align} \binom{n}{k} p^k (1-p)^{n-k} &= \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k} \\ &\simeq \frac{\sqrt{2\pi n}\left(\frac{n}{e}\right)^n}{k!\sqrt{2\pi (n-k)}\left(\frac{n-k}{e}\right)^{n-k}} p^k (1-p)^{n-k} \\ &= \sqrt{\frac{n}{n-k}}\frac{n^n e^{-k}}{k!(n-k)^{n-k}} p^k (1-p)^{n-k}. \end{align} Letting n \to \infty an ...
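A sketch in Python of the convergence: with \lambda = 2 fixed and p_n = \lambda/n, the binomial probability at k = 3 approaches the Poisson value as n grows (the values of \lambda and k are arbitrary choices).

 # Sketch: Binomial(n, lambda/n) at k approaches the Poisson(lambda) mass.
 from math import comb, exp, factorial
 
 lam, k = 2.0, 3
 for n in [10, 100, 1000, 10000]:
     p_n = lam / n
     binom = comb(n, k) * p_n ** k * (1 - p_n) ** (n - k)
     print(f"n = {n:>5}: C(n,k) p^k (1-p)^(n-k) = {binom:.6f}")
 print(f"limit: e^-lam lam^k / k!         = {exp(-lam) * lam ** k / factorial(k):.6f}")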



Probability Theorems
Probability is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility of the event and 1 indicates certainty ("Kendall's Advanced Theory of Statistics, Volume 1: Distribution Theory", Alan Stuart and Keith Ord, 6th Ed, 2009; William Feller, ''An Introduction to Probability Theory and Its Applications'', Vol 1, 3rd Ed, 1968, Wiley). The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes ("heads" and "tails") are both equally probable; the probability of "heads" equals the probability of "tails"; and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2 (which could also be written as 0.5 or 50%). These conce ...



Probabilistic Inequalities
Probability is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility of the event and 1 indicates certainty ("Kendall's Advanced Theory of Statistics, Volume 1: Distribution Theory", Alan Stuart and Keith Ord, 6th Ed, 2009; William Feller, ''An Introduction to Probability Theory and Its Applications'', Vol 1, 3rd Ed, 1968, Wiley). The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes ("heads" and "tails") are both equally probable; the probability of "heads" equals the probability of "tails"; and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2 (which could also be written as 0.5 or 50%). These conc ...



Statistical Inequalities
Statistics (from German: ''Statistik'', "description of a state, a country") is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data. In applying statistics to a scientific, industrial, or social problem, it is conventional to begin with a statistical population or a statistical model to be studied. Populations can be diverse groups of people or objects such as "all people living in a country" or "every atom composing a crystal". Statistics deals with every aspect of data, including the planning of data collection in terms of the design of surveys and experiments.Dodge, Y. (2006) ''The Oxford Dictionary of Statistical Terms'', Oxford University Press. When census data cannot be collected, statisticians collect data by developing specific experiment designs and survey samples. Representative sampling assures that inferences and conclusions can reasonably extend from the sample to the population as a whole. An expe ...