Catalog Of Articles In Probability Theory
This page lists articles related to probability theory. In particular, it lists many articles corresponding to specific probability distributions. Such articles are marked here by a code of the form (X:Y), which refers to the number of random variables involved and the type of the distribution. For example, (2:DC) indicates a distribution with two random variables, discrete or continuous. Other codes are just abbreviations for topics. The list of codes can be found in the table of contents.

Core probability: selected topics

Probability theory

Basic notions (bsc)
* Random variable
* Continuous probability distribution / (1:C)
* Cumulative distribution function / (1:DCR)
* Discrete probability distribution / (1:D)
* Independent and identically-distributed random variables / (FS:BDCR)
* Joint probability distribution / (F:DC)
* Marginal distribution / (2F:DC)
* Probability density function / (1:C)
* Probability distribution / (1:DCRG)
* Pro ...



Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability ...


Boy Or Girl Paradox
The Boy or Girl paradox surrounds a set of questions in probability theory, which are also known as The Two Child Problem, Mr. Smith's Children and the Mrs. Smith Problem. The initial formulation of the question dates back to at least 1959, when Martin Gardner featured it in his October 1959 "Mathematical Games" column in ''Scientific American''. He titled it The Two Children Problem, and phrased the paradox as follows:
* Mr. Jones has two children. The older child is a girl. What is the probability that both children are girls?
* Mr. Smith has two children. At least one of them is a boy. What is the probability that both children are boys?
Gardner initially gave the answers 1/2 and 1/3, respectively, but later acknowledged that the second question was ambiguous. Its answer could be 1/2, depending on the procedure by which the information "at least one of them is a boy" was obtained. The ambiguity, depending on the exact wording and possible assumptions, was confirmed by Maya Bar-Hillel ...
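
A short enumeration makes Gardner's two answers concrete. The sketch below is our own illustration, not from the article (the helper conditional and the sample-space encoding are ours); it treats the four (older, younger) sex combinations as equally likely:

```python
from itertools import product
from fractions import Fraction

# The four equally likely (older, younger) combinations: BB, BG, GB, GG.
families = list(product("BG", repeat=2))

def conditional(condition, event):
    """P(event | condition) over the uniform sample space."""
    kept = [f for f in families if condition(f)]
    return Fraction(sum(event(f) for f in kept), len(kept))

# Mr. Jones: the older child is a girl -> P(both girls) = 1/2.
print(conditional(lambda f: f[0] == "G", lambda f: f == ("G", "G")))

# Mr. Smith: at least one child is a boy -> P(both boys) = 1/3.
print(conditional(lambda f: "B" in f, lambda f: f == ("B", "B")))
```

The ambiguity Gardner conceded concerns how the condition in the second question was learned; the enumeration above assumes the family was selected because it satisfies that condition.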


Coefficient Of Variation
In probability theory and statistics, the coefficient of variation (CV), also known as relative standard deviation (RSD), is a standardized measure of dispersion of a probability distribution or frequency distribution. It is often expressed as a percentage, and is defined as the ratio of the standard deviation \sigma to the mean \mu (or its absolute value, |\mu|). The CV or RSD is widely used in analytical chemistry to express the precision and repeatability of an assay. It is also commonly used in fields such as engineering or physics when doing quality assurance studies and ANOVA gauge R&R, by economists and investors in economic models, and in neuroscience.

Definition

The coefficient of variation (CV) is defined as the ratio of the standard deviation \sigma to the mean \mu:
c_v = \frac{\sigma}{\mu}.
It shows the extent of variability in relation to the mean of the population. The coefficient of variation should be computed only for data measured on scales that have a meaningful zero ...
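
As a quick illustration of the definition, here is a minimal sketch using Python's standard library (the function name and sample data are ours); it uses the population standard deviation and divides by the absolute value of the mean, per the definition above:

```python
import statistics

def coefficient_of_variation(data):
    """c_v = sigma / |mu|, using the population standard deviation."""
    mean = statistics.fmean(data)
    if mean == 0:
        raise ValueError("CV is undefined when the mean is zero")
    return statistics.pstdev(data) / abs(mean)

measurements = [9.8, 10.1, 10.0, 9.9, 10.2]
print(f"CV = {coefficient_of_variation(measurements):.2%}")  # about 1.4%
```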


Central Moment
In probability theory and statistics, a central moment is a moment of a probability distribution of a random variable about the random variable's mean; that is, it is the expected value of a specified integer power of the deviation of the random variable from the mean. The various moments form one set of values by which the properties of a probability distribution can be usefully characterized. Central moments are used in preference to ordinary moments, computed in terms of deviations from the mean instead of from zero, because the higher-order central moments relate only to the spread and shape of the distribution, rather than also to its location. Sets of central moments can be defined for both univariate and multivariate distributions.

Univariate moments

The ''n''th moment about the mean (or ''n''th central moment) of a real-valued random variable ''X'' is the quantity \mu_n := \operatorname{E}[(X - \operatorname{E}[X])^n] ...
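
The univariate definition translates directly into a sample estimate. A minimal sketch (function name and data are ours), replacing the expectation with an average over observed data:

```python
import statistics

def central_moment(data, n):
    """n-th sample central moment: average of (x - mean)^n over the data."""
    mu = statistics.fmean(data)
    return statistics.fmean((x - mu) ** n for x in data)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(central_moment(data, 1))  # 0.0 by construction
print(central_moment(data, 2))  # population variance, 4.0 here
```

The first central moment is identically zero, and the second is the (population) variance.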




Carleman's Condition
In mathematics, particularly in analysis, Carleman's condition gives a sufficient condition for the determinacy of the moment problem. That is, if a measure \mu satisfies Carleman's condition, there is no other measure \nu having the same moments as \mu. The condition was discovered by Torsten Carleman in 1922.

Hamburger moment problem

For the Hamburger moment problem (the moment problem on the whole real line), the theorem states the following: Let \mu be a measure on \R such that all the moments m_n = \int_{-\infty}^{+\infty} x^n \, d\mu(x), \quad n = 0, 1, 2, \ldots are finite. If \sum_{n=1}^\infty m_{2n}^{-1/(2n)} = +\infty, then the moment problem for (m_n) is ''determinate''; that is, \mu is the only measure on \R with (m_n) as its sequence of moments.

Stieltjes moment problem

For the Stieltjes moment problem, which is named after Thomas Joannes Stieltjes and seeks necessary and sufficient conditions for a sequence (''m''0, ''m''1, ''m''2, ...) to be of the form m_n = \int ...
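
To see the Hamburger criterion in action, here is a small numerical sketch for the standard normal distribution, whose odd moments vanish and whose even moments are m_{2n} = (2n - 1)!! (a standard fact; the script itself is our own illustration):

```python
import math

def carleman_partial_sum(terms):
    """Partial sum of m_{2n}^(-1/(2n)) for the standard normal,
    whose even moments are m_{2n} = (2n - 1)!!."""
    total = 0.0
    m = 1  # running double factorial (2n - 1)!!
    for n in range(1, terms + 1):
        m *= 2 * n - 1
        # compute m^(-1/(2n)) via logarithms to avoid float overflow
        total += math.exp(-math.log(m) / (2 * n))
    return total

for terms in (10, 100, 1000):
    print(terms, round(carleman_partial_sum(terms), 2))
```

The partial sums keep growing (roughly like the square root of the number of terms), consistent with divergence, so Carleman's condition confirms that the normal distribution is determined by its moments.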


Canonical Correlation
In statistics, canonical-correlation analysis (CCA), also called canonical variates analysis, is a way of inferring information from cross-covariance matrices. If we have two vectors ''X'' = (''X''1, ..., ''X''''n'') and ''Y'' = (''Y''1, ..., ''Y''''m'') of random variables, and there are correlations among the variables, then canonical-correlation analysis will find linear combinations of ''X'' and ''Y'' which have maximum correlation with each other. T. R. Knapp notes that "virtually all of the commonly encountered parametric tests of significance can be treated as special cases of canonical-correlation analysis, which is the general procedure for investigating the relationships between two sets of variables." The method was first introduced by Harold Hotelling in 1936, although in the context of angles between flats the mathematical concept was published by Jordan in 1875. Definition Given two column vectors X = (x_1, \dots, x_n)^T and ...
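
A minimal sketch of canonical-correlation analysis in practice, assuming scikit-learn is available (the synthetic two-view data are our own construction: both views share a latent signal, so the first canonical correlation comes out high):

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))  # shared signal behind both views

# Each view is several noisy copies of the latent signal.
X = np.hstack([latent + 0.5 * rng.normal(size=(200, 1)) for _ in range(3)])
Y = np.hstack([latent + 0.5 * rng.normal(size=(200, 1)) for _ in range(2)])

cca = CCA(n_components=1)
X_c, Y_c = cca.fit_transform(X, Y)

# Correlation of the first pair of canonical variates (close to 1 here).
print(np.corrcoef(X_c[:, 0], Y_c[:, 0])[0, 1])
```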



Expected Value
In probability theory, the expected value (also called expectation, expectancy, mathematical expectation, mean, average, or first moment) is a generalization of the weighted average. Informally, the expected value is the arithmetic mean of a large number of independently selected outcomes of a random variable. The expected value of a random variable with a finite number of outcomes is a weighted average of all possible outcomes. In the case of a continuum of possible outcomes, the expectation is defined by integration. In the axiomatic foundation for probability provided by measure theory, the expectation is given by Lebesgue integration. The expected value of a random variable ''X'' is often denoted by E(''X''), E[''X''], or E''X'', with E also often stylized as \mathrm{E} or \mathbb{E}.

History

The idea of the expected value originated in the middle of the 17th century from the study of the so-called problem of points, which seeks to divide the stakes ''in a fair way'' between two players, who have to end th ...
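
For a finite set of outcomes the definition is just a probability-weighted sum, as in this minimal sketch (the function name and the die example are ours):

```python
from fractions import Fraction

def expected_value(distribution):
    """Weighted average: sum of outcome * probability over all outcomes."""
    return sum(x * p for x, p in distribution.items())

# Fair six-sided die: each face has probability 1/6.
die = {face: Fraction(1, 6) for face in range(1, 7)}
print(expected_value(die))  # 7/2, i.e. 3.5
```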


Two Envelopes Problem
The two envelopes problem, also known as the exchange paradox, is a paradox in probability theory. It is of special interest in decision theory, and for the Bayesian interpretation of probability theory. It is a variant of an older problem known as the necktie paradox. The problem is typically introduced by formulating a hypothetical challenge like the basic setup described below. Since the situation is symmetric, it seems obvious that there is no point in switching envelopes. On the other hand, a simple calculation using expected values suggests the opposite conclusion, that it is always beneficial to swap envelopes, since the person stands to gain twice as much money if they switch, while the only risk is halving what they currently have.

Introduction

Problem

Basic setup: A person is given two indistinguishable envelopes, each of which contains a sum of money. One envelope contains twice as much as the other. The person may pick one envelope and keep whatever amount it contains. They ...
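
A quick Monte Carlo sketch (entirely our own setup: the smaller amount is drawn uniformly, purely for illustration) shows the symmetry argument numerically; switching leaves the average payoff unchanged:

```python
import random

def trial(switch):
    """Envelopes hold x and 2x; return the amount the person keeps."""
    x = random.uniform(1, 100)  # smaller amount, drawn for illustration
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    return envelopes[1] if switch else envelopes[0]

random.seed(0)
n = 100_000
keep = sum(trial(False) for _ in range(n)) / n
swap = sum(trial(True) for _ in range(n)) / n
print(f"keep: {keep:.2f}  swap: {swap:.2f}")  # averages nearly equal
```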



Simpson's Paradox
Simpson's paradox is a phenomenon in probability and statistics in which a trend appears in several groups of data but disappears or reverses when the groups are combined. This result is often encountered in social-science and medical-science statistics, and is particularly problematic when frequency data are unduly given causal interpretations (Judea Pearl, ''Causality: Models, Reasoning, and Inference'', Cambridge University Press, 2000; 2nd edition 2009). The paradox can be resolved when confounding variables and causal relations are appropriately addressed in the statistical modeling. Simpson's paradox has been used to illustrate the kind of misleading results that the misuse of statistics can generate. Edward H. Simpson first described this phenomenon in a technical paper in 1951, but the statisticians Karl Pearson (in 1899) and Udny Yule (in 1903) had mentioned similar effects earlier. The name ''Simpson's paradox'' was introduced by Colin R. Blyth in 1972. It is also r ...
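
A tiny numeric illustration with made-up counts (not from the article) shows how the reversal can happen: option B wins in each subgroup, yet option A wins in the pooled data, because the two options face very different mixes of cases:

```python
from fractions import Fraction

# Hypothetical (successes, trials) counts for two options in two subgroups.
data = {
    "A": {"group 1": (80, 100), "group 2": (2, 10)},
    "B": {"group 1": (9, 10), "group 2": (30, 100)},
}

for option, groups in data.items():
    for group, (s, t) in groups.items():
        print(option, group, float(Fraction(s, t)))  # B leads in each group
    s_all = sum(s for s, _ in groups.values())
    t_all = sum(t for _, t in groups.values())
    print(option, "combined", float(Fraction(s_all, t_all)))  # A leads overall
```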



Necktie Paradox
The necktie paradox is a puzzle and paradox, arising under a subjective interpretation of probability theory, that describes a bet which seems advantageous to both parties involved. The two-envelope paradox is a variation of the necktie paradox.

Statement of paradox

Two persons, each given a necktie, start arguing over who has the cheaper one. The person with the more expensive necktie must give it to the other person. The first person reasons as follows: winning and losing are equally likely. If I lose, then I will lose the value of my necktie. But if I win, then I will win more than the value of my necktie. Therefore, the wager is to my advantage. The second person can consider the wager in exactly the same way; thus, paradoxically, it seems both persons have the advantage in the bet.

Resolution

The paradox can be resolved by giving more careful consideration to what is lost in one scenario ("the value of my necktie") and what is won in the other ("more than the ...
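
The fairness of the wager can be checked by simulation. In this sketch (our own setup, with the two necktie values drawn independently and uniformly, purely for illustration), the first person's average net gain comes out to zero:

```python
import random

random.seed(1)
n = 100_000
net = 0.0
for _ in range(n):
    mine = random.uniform(10, 50)   # value of my necktie
    yours = random.uniform(10, 50)  # value of the other necktie
    if mine > yours:
        net -= mine    # I had the pricier tie and must give it away
    elif yours > mine:
        net += yours   # I win the pricier tie
print(net / n)  # close to 0: neither party actually has an edge
```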