The concept of a random sequence is essential in probability theory and statistics. The concept generally relies on the notion of a sequence of random variables, and many statistical discussions begin with the words "let ''X''1, ..., ''Xn'' be independent random variables...". Yet as D. H. Lehmer stated in 1951: "A random sequence is a vague notion... in which each term is unpredictable to the uninitiated and whose digits pass a certain number of tests traditional with statisticians". Axiomatic probability theory ''deliberately'' avoids a definition of a random sequence. Traditional probability theory does not state whether a specific sequence is random, but generally proceeds to discuss the properties of random variables and stochastic sequences assuming some definition of randomness. The Bourbaki school considered the statement "let us consider a random sequence" an abuse of language.


Early history

Émile Borel was one of the first mathematicians to formally address randomness, in 1909. In 1919 Richard von Mises gave the first definition of algorithmic randomness, inspired by the law of large numbers, although he used the term ''collective'' rather than random sequence. Using the concept of the impossibility of a gambling system, von Mises defined an infinite sequence of zeros and ones as random if it is not biased, i.e. it has the ''frequency stability property'' (the frequency of zeros goes to 1/2) and every sub-sequence we can select from it by a "proper" method of selection is also not biased. The sub-sequence selection criterion imposed by von Mises is important, because although 0101010101... is not biased, by selecting the odd positions we get 000000..., which is not random. Von Mises never totally formalized his definition of a proper selection rule for sub-sequences, but in 1940 Alonzo Church defined it as any recursive function which, having read the first ''N'' elements of the sequence, decides whether it wants to select element number ''N'' + 1. Church was a pioneer in the field of computable functions, and the definition he made relied on the Church–Turing thesis for computability. This definition is often called ''Mises–Church randomness''.
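To make the role of selection rules concrete, here is a minimal Python sketch; the function names and the particular rule are illustrative choices of this sketch, not part of the formal definitions. A Church-style rule is modelled as a function that sees only the prefix read so far and decides whether to pick the next element; applying the rule "pick every other position" to 0101... extracts the clearly non-random sub-sequence 000....

```python
# Illustrative sketch: how a selection rule exposes 0101... as non-random.
# Names and the specific rule below are for illustration only.

def zero_frequency(bits):
    """Fraction of zeros in a finite prefix of a sequence."""
    return bits.count(0) / len(bits)

def select(bits, rule):
    """Apply a Church-style selection rule: after reading bits[0..n-1],
    rule(prefix) decides whether the next element bits[n] is selected."""
    chosen = []
    for n in range(len(bits)):
        if rule(bits[:n]):          # decision depends only on what was read so far
            chosen.append(bits[n])
    return chosen

def pick_every_other(read_so_far):
    """Admissible rule: select the 1st, 3rd, 5th, ... element.
    It inspects only the prefix already read (here, just its length)."""
    return len(read_so_far) % 2 == 0

prefix = [0, 1] * 5000              # first 10,000 terms of 0101...
print(zero_frequency(prefix))       # 0.5 -- the full sequence looks balanced

sub = select(prefix, pick_every_other)
print(zero_frequency(sub))          # 1.0 -- the selected sub-sequence is all zeros
```

The point of the example is that frequency stability of the whole sequence is not enough: a single admissible selection rule suffices to reveal the bias hidden in 0101....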


Modern approaches

During the 20th century various technical approaches to defining random sequences were developed, and now three distinct paradigms can be identified. In the mid 1960s, A. N. Kolmogorov and D. W. Loveland independently proposed a more permissive selection rule. In their view Church's recursive function definition was too restrictive in that it read the elements in order. Instead they proposed a rule based on a partially computable process which, having read ''any'' ''N'' elements of the sequence, decides whether it wants to select another element which has not been read yet. This definition is often called ''Kolmogorov–Loveland stochasticity''. But this method was considered too weak by Alexander Shen, who showed that there is a Kolmogorov–Loveland stochastic sequence which does not conform to the general notion of randomness.

In 1966 Per Martin-Löf introduced a new notion which is now generally considered the most satisfactory notion of algorithmic randomness. His original definition involved measure theory, but it was later shown that it can be expressed in terms of Kolmogorov complexity. Kolmogorov's definition of a random string was that it is random if it has no description shorter than itself via a universal Turing machine.

Three basic paradigms for dealing with random sequences have now emerged:

* The ''frequency / measure-theoretic'' approach. This approach started with the work of Richard von Mises and Alonzo Church. In the 1960s Per Martin-Löf noticed that the sets coding such frequency-based stochastic properties are a special kind of measure zero sets, and that a more general and smooth definition can be obtained by considering all effectively measure zero sets.
* The ''complexity / compressibility'' approach. This paradigm was championed by A. N. Kolmogorov along with contributions from Leonid Levin and Gregory Chaitin. For finite sequences, Kolmogorov defines randomness of a binary string of length ''n'' as the entropy (or Kolmogorov complexity) normalized by the length ''n''. In other words, if the Kolmogorov complexity of the string is close to ''n'', it is very random; if the complexity is far below ''n'', it is not so random. The dual concept of randomness is compressibility: the more random a sequence is, the less compressible it is, and vice versa (see the compression sketch after this list).
* The ''predictability'' approach. This paradigm is due to Claus P. Schnorr and uses a slightly different definition of constructive martingales than the martingales used in traditional probability theory. Schnorr showed how the existence of a selective betting strategy implied the existence of a selection rule for a biased sub-sequence. If one only requires a recursive martingale to succeed on a sequence, instead of constructively succeed on a sequence, one gets the concept of recursive randomness. Yongge Wang showed that the recursive randomness concept is different from Schnorr's randomness concept (see the betting sketch after this list).

In most cases, theorems relating the three paradigms (often equivalence) have been proven. (Wolfgang Merkle, "Kolmogorov–Loveland Stochasticity", in ''Automata, Languages and Programming: 29th International Colloquium, ICALP 2002'', Peter Widmayer et al., p. 391.)
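As a rough illustration of the complexity / compressibility paradigm, the sketch below uses a general-purpose compressor (zlib) as a computable stand-in for Kolmogorov complexity, which is itself uncomputable; the numbers in the comments are only indicative. A patterned string compresses far below its length ''n'', while typical random bytes do not.

```python
# Rough illustration of the complexity / compressibility intuition.
# zlib is only a stand-in: true Kolmogorov complexity is uncomputable,
# but a patterned string still compresses far below its length,
# while typical coin-flip-like data does not.
import os
import zlib

def compressed_ratio(data: bytes) -> float:
    """Compressed length divided by original length (lower = more compressible)."""
    return len(zlib.compress(data, 9)) / len(data)

n = 100_000
patterned = b"01" * (n // 2)         # the string 0101... of length n
random_ish = os.urandom(n)           # typical "incompressible-looking" data

print(compressed_ratio(patterned))   # tiny (e.g. well below 0.01): far below 1, so "not random"
print(compressed_ratio(random_ish))  # close to (or slightly above) 1: no short description found
```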
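As a toy illustration of the predictability (betting) view, the following sketch pits a simple fixed-fraction betting strategy against two sequences under fair double-or-nothing odds. The strategy and parameter choices are this sketch's own assumptions, not Schnorr's formal constructive martingales: capital grows without bound on the predictable sequence 0101..., while on fair coin flips it typically stays bounded or collapses.

```python
# Minimal sketch of the betting (martingale) view of randomness.
# The strategy below ("the next bit differs from the previous one") is a
# toy example; Schnorr's definition uses constructive martingales in general.
import random

def run_strategy(bits, fraction=0.5):
    """Bet a fixed fraction of capital that each bit differs from the previous one.
    Correct guesses pay fair (double-or-nothing) odds on the stake."""
    capital = 1.0
    prev = bits[0]
    for b in bits[1:]:
        guess = 1 - prev                     # predict alternation
        stake = fraction * capital
        capital += stake if b == guess else -stake
        prev = b
    return capital

alternating = [0, 1] * 500                   # 0101... : perfectly predictable
coin_flips = [random.randint(0, 1) for _ in range(1000)]

print(run_strategy(alternating))             # grows without bound (1.5**999 here)
print(run_strategy(coin_flips))              # typically collapses toward 0
```

A sequence counts as random in this paradigm when no admissible betting strategy of the appropriate kind can make its capital grow without bound on that sequence.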


See also

* Randomness
* History of randomness
* Random number generator
* Seven states of randomness
* Statistical randomness


References

* Sérgio B. Volchan, ''What Is a Random Sequence?'', ''The American Mathematical Monthly'', Vol. 109, 2002, pp. 46–63.




External links

* Video on frequency stability: why humans can't "guess" randomly
* Randomness tests by Terry Ritter