Raikov's Theorem
Raikov’s theorem, named for the Russian mathematician Dmitrii Abramovich Raikov, is a result in probability theory. It is well known that if each of two independent random variables ξ1 and ξ2 has a Poisson distribution, then their sum ξ = ξ1 + ξ2 has a Poisson distribution as well. It turns out that the converse is also valid. Statement of the theorem: suppose that a random variable ξ has a Poisson distribution and admits a decomposition as a sum ξ = ξ1 + ξ2 of two independent random variables. Then the distribution of each summand is a shifted Poisson distribution. Comment: Raikov's theorem is similar to Cramér’s decomposition theorem. The latter result states that if a sum of two independent random variables has a normal distribution, then each summand is normally distributed as well. It was also proved by Yu. V. Linnik that a convolution of a normal distribution and a Poisson distribution possesses a similar property. An exten ...
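The forward direction mentioned above (a sum of independent Poisson variables is Poisson) can be checked numerically. The following Python sketch is an illustration, not a proof, and certainly not a substitute for the converse statement that Raikov's theorem actually establishes: it verifies by discrete convolution that Poisson(1.5) plus an independent Poisson(2.5) has the pmf of Poisson(4.0).

```python
# Numerical check that convolving Poisson(a) with Poisson(b) gives Poisson(a + b).
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

a, b = 1.5, 2.5
for n in range(10):
    # P(X1 + X2 = n): sum over the ways to split n between the two summands.
    conv = sum(poisson_pmf(k, a) * poisson_pmf(n - k, b) for k in range(n + 1))
    direct = poisson_pmf(n, a + b)
    assert abs(conv - direct) < 1e-12
print("convolution of Poisson(1.5) and Poisson(2.5) matches Poisson(4.0)")
```

The agreement follows from the binomial theorem applied to the convolution sum; the converse direction, that such a decomposition forces each summand to be (shifted) Poisson, is the non-trivial content of the theorem.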


Dmitrii Abramovich Raikov
Dmitrii Abramovich Raikov (born 11 November 1905 in Odessa; died 1980 in Moscow) was a Russian mathematician who studied functional analysis. Raikov studied in Odessa and Moscow, graduating in 1929. He was secretary of the Komsomol at Moscow State University and was active in the 1929–1930 campaign against the mathematician Dmitri Fyodorovich Egorov. At that time he and his fellow campaigners also rejected non-applied research, but this soon changed. In 1933, he was dismissed from the Communist Party on charges of Trotskyism and exiled to Voronezh, but was rehabilitated two years later and returned to Moscow. From 1938 to 1948, he was at the Mathematical Institute of the USSR Academy of Sciences, and during the Second World War he served in the militia. He completed his habilitation (Russian doctorate) in 1941 under Aleksandr Yakovlevich Khinchin at Lomonosov University and became a professor in 1950. He taught at the Pedagogical Institute in Kostroma and from 1952 in Sh ...


Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability ...
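The axiomatic setup described above (a sample space, a probability measure with values between 0 and 1, events as subsets) can be made concrete with a toy example. The following Python sketch, assuming a fair six-sided die, builds a finite probability space and computes event probabilities as sums over outcomes:

```python
# A toy probability space for one roll of a fair die: a sample space,
# a probability measure on it, and events as subsets of the sample space.
sample_space = {1, 2, 3, 4, 5, 6}
P = {outcome: 1 / 6 for outcome in sample_space}  # the probability measure

def prob(event):
    """Probability of an event, i.e. a subset of the sample space."""
    return sum(P[o] for o in event)

even = {2, 4, 6}
print(prob(even))          # probability of rolling an even number, 1/2
print(prob(sample_space))  # total mass is 1, as the axioms require
```

The two printed values illustrate the axioms directly: probabilities of events lie between 0 and 1, and the whole sample space has probability 1.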



Independence (Probability Theory)
Independence is a fundamental notion in probability theory, as in statistics and the theory of stochastic processes. Two events are independent, statistically independent, or stochastically independent if, informally speaking, the occurrence of one does not affect the probability of occurrence of the other or, equivalently, does not affect the odds. Similarly, two random variables are independent if the realization of one does not affect the probability distribution of the other. When dealing with collections of more than two events, two notions of independence need to be distinguished. The events are called pairwise independent if any two events in the collection are independent of each other, while mutual independence (or collective independence) of events means, informally speaking, that each event is independent of any combination of other events in the collection. A similar notion exists for collections of random variables. Mutual independence implies pairwise independence ...
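The distinction drawn above between pairwise and mutual independence can be seen in a classic small example. The following Python sketch (an illustration with two fair coin flips; the event names A, B, C are ours) exhibits three events that are pairwise independent but not mutually independent:

```python
# Three events on two fair coin flips that are pairwise independent
# but not mutually independent.
from itertools import product
from fractions import Fraction

outcomes = list(product("HT", repeat=2))  # four equally likely outcomes

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] == "H"      # first flip is heads
B = lambda o: o[1] == "H"      # second flip is heads
C = lambda o: o[0] == o[1]     # the two flips agree

both = lambda e1, e2: (lambda o: e1(o) and e2(o))

# Pairwise independent: P(X and Y) = P(X) * P(Y) for each pair.
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(both(X, Y)) == prob(X) * prob(Y)

# But not mutually independent: P(A and B and C) = 1/4, not 1/8.
triple = lambda o: A(o) and B(o) and C(o)
assert prob(triple) != prob(A) * prob(B) * prob(C)
print(prob(triple), "vs", prob(A) * prob(B) * prob(C))
```

Knowing any one of A, B, C tells you nothing about any other one alone, yet any two of them together determine the third, which is exactly the failure of mutual independence.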



Random Variables
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes (e.g., the possible upper sides of a flipped coin such as heads H and tails T) in a sample space (e.g., the set {H, T}) to a measurable space, often the real numbers (e.g., the set {−1, 1}, in which 1 corresponds to H and −1 corresponds to T). Informally, randomness typically represents some fundamental element of chance, such as in the roll of a die; it may also represent uncertainty, such as measurement error. However, the interpretation of probability is philosophically complicated, and even in specific cases is not always straightforward. The purely mathematical analysis of random variables is independent of such interpretational difficulties, and can be based upon a rigorous axiomatic setup. In the formal mathematical language of measure theory, a random vari ...
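The coin example above can be written out as code. This Python sketch represents the random variable as an explicit function from the sample space {H, T} to {−1, 1} and computes its induced distribution by pushing the fair-coin measure forward through the map:

```python
# A minimal concrete random variable: X maps coin outcomes to real values,
# X(H) = 1 and X(T) = -1, and induces a distribution on those values.
from fractions import Fraction

sample_space = ["H", "T"]
P = {"H": Fraction(1, 2), "T": Fraction(1, 2)}  # fair coin measure
X = {"H": 1, "T": -1}                           # the random variable as a function

# The distribution of X: push the measure P forward through X.
dist = {}
for omega in sample_space:
    dist[X[omega]] = dist.get(X[omega], 0) + P[omega]

print(dist)  # {1: Fraction(1, 2), -1: Fraction(1, 2)}
```

The pushforward construction shown here is exactly what the measure-theoretic definition formalizes: the distribution of a random variable is the image of the underlying probability measure under the map.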



Poisson Distribution
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event. It is named after the French mathematician Siméon Denis Poisson. The Poisson distribution can also be used for the number of events in other specified interval types such as distance, area, or volume. For instance, a call center receives an average of 180 calls per hour, 24 hours a day. The calls are independent; receiving one does not change the probability of when the next one will arrive. The number of calls received during any minute has a Poisson probability distribution with mean 3: the most likely numbers are 2 and 3, but 1 and 4 are also likely, and there is a small probability of it being as low as zero and a very smal ...
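The call-center figures quoted above can be checked directly from the Poisson pmf. This Python sketch tabulates the distribution with mean 3 and confirms that the probabilities at k = 2 and k = 3 are equal and maximal:

```python
# The Poisson(3) pmf from the call-center example: P(X = k) = e^{-3} 3^k / k!.
from math import exp, factorial

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam = 3.0
pmf = {k: poisson_pmf(k, lam) for k in range(10)}

# P(X = 2) = P(X = 3) = 9 e^{-3} / 2, since pmf(3)/pmf(2) = lam/3 = 1.
assert abs(pmf[2] - pmf[3]) < 1e-12
for k, p in pmf.items():
    print(f"P(X = {k:d}) = {p:.4f}")
```

The tie between 2 and 3 is a general feature: when the mean is an integer m, the pmf takes its maximum at both m − 1 and m.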


Yuri Linnik
Yuri Vladimirovich Linnik (Russian: Ю́рий Влади́мирович Ли́нник; January 8, 1915 – June 30, 1972) was a Soviet mathematician active in number theory, probability theory and mathematical statistics. Linnik was born in Bila Tserkva, in present-day Ukraine. He went to St Petersburg University, where his supervisor was Vladimir Tartakovski, and later worked at that university and the Steklov Institute. He was a member of the Russian Academy of Sciences, as was his father, Vladimir Pavlovich Linnik. He was awarded both State and Lenin Prizes. He died in Leningrad. Work in number theory: * Linnik's theorem in analytic number theory * The dispersion method (which allowed him to solve the Titchmarsh problem) * The large sieve (which turned out to be extremely influential) * An elementary proof of the Hilbert–Waring theorem; see also Schnirelmann density * The Linnik ergodic method, which allowed him to study the distribution properties of the rep ...




Characterization Of Probability Distributions
In mathematics in general, a characterization theorem says that a particular object – a function, a space, etc. – is the only one that possesses properties specified in the theorem. A characterization of a probability distribution accordingly states that it is the only probability distribution that satisfies specified conditions. More precisely, the model of characterization of a probability distribution can be described as follows. On the probability space we define the space $\mathcal{X} = \{X\}$ of random variables with values in the measurable metric space $(U, d_U)$ and the space $\mathcal{Y} = \{Y\}$ of random variables with values in the measurable metric space $(V, d_V)$. By characterizations of probability distributions we understand general problems of description of some set $\mathcal{C}$ in the space $\mathcal{X}$ by extracting the sets $\mathcal{C}_X \subseteq \mathcal{X}$ and $\mathcal{C}_Y \subseteq \mathcal{Y}$ which describe the properties of random variables $X \in \mathcal{C}_X$ and their images $Y \in \mathcal{C}_Y$ ...



Probability Theorems
Probability is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility of the event and 1 indicates certainty (Stuart and Ord, ''Kendall's Advanced Theory of Statistics, Volume 1: Distribution Theory'', 6th ed., 2009; William Feller, ''An Introduction to Probability Theory and Its Applications'', Vol. 1, 3rd ed., Wiley, 1968). The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes ("heads" and "tails") are both equally probable; the probability of "heads" equals the probability of "tails"; and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2 (which could also be written as 0.5 or 50%). These conce ...