Slepian's Lemma
In probability theory, Slepian's lemma (1962), named after David Slepian, is a Gaussian comparison inequality. It states that for Gaussian random variables X = (X_1,\dots,X_n) and Y = (Y_1,\dots,Y_n) in \mathbb{R}^n satisfying \operatorname{E}[X] = \operatorname{E}[Y] = 0,
:\operatorname{E}[X_i^2] = \operatorname{E}[Y_i^2], \quad i=1,\dots,n, \quad \text{and} \quad \operatorname{E}[X_i X_j] \le \operatorname{E}[Y_i Y_j] \text{ for } i \neq j,
the following inequality holds for all real numbers u_1,\ldots,u_n:
:\Pr\left[\bigcap_{i=1}^n \{X_i \le u_i\}\right] \le \Pr\left[\bigcap_{i=1}^n \{Y_i \le u_i\}\right],
or equivalently,
:\Pr\left[\bigcup_{i=1}^n \{X_i > u_i\}\right] \ge \Pr\left[\bigcup_{i=1}^n \{Y_i > u_i\}\right].
While this intuitive-seeming result is true for Gaussian processes, it is not in general true for other random variables, not even those with expectation 0. As a corollary, if (X_t)_{t \ge 0} is a centered stationary Gaussian process such that \operatorname{E}[X_0 X_t] \ge 0 for all t, it holds for any real number c and any T, S > 0 that
:\Pr\left[\sup_{t \in [0,T+S]} X_t \le c\right] \ge \Pr\left[\sup_{t \in [0,T]} X_t \le c\right] \Pr\left[\sup_{t \in [0,S]} X_t \le c\right]. ...
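Below is a minimal Monte Carlo sketch (Python, assuming NumPy) illustrating the inequality for n = 2: both vectors are centered with unit variances, X has the smaller cross-covariance, and the empirical probability that both coordinates of X fall below the thresholds should come out no larger than the corresponding probability for Y. The covariances 0.2 and 0.8 and the thresholds are illustrative choices, not part of the lemma.

```python
# Monte Carlo sketch of Slepian's lemma for n = 2 (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

# Both vectors are centered with equal variances E[X_i^2] = E[Y_i^2] = 1,
# and the cross-covariance of X is smaller: E[X_1 X_2] = 0.2 <= 0.8 = E[Y_1 Y_2].
cov_x = np.array([[1.0, 0.2], [0.2, 1.0]])
cov_y = np.array([[1.0, 0.8], [0.8, 1.0]])

n_samples = 1_000_000
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov_x, size=n_samples)
Y = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov_y, size=n_samples)

u = np.array([0.5, -0.3])  # arbitrary thresholds u_1, u_2

# Empirical P(X_1 <= u_1, X_2 <= u_2), and the same for Y.
p_x = np.mean(np.all(X <= u, axis=1))
p_y = np.mean(np.all(Y <= u, axis=1))

print(f"P(X <= u) ~ {p_x:.4f}  <=  P(Y <= u) ~ {p_y:.4f}")  # expected to satisfy <=
```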



Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probab ...
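As a toy illustration of these notions, the sketch below builds a finite probability space (a fair six-sided die, a hypothetical example chosen here for concreteness) and evaluates its probability measure on a few events.

```python
# A tiny discrete probability space: a fair six-sided die (illustrative only).
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}
measure = {outcome: Fraction(1, 6) for outcome in sample_space}  # weights sum to 1

def prob(event):
    """Probability measure of an event (any subset of the sample space)."""
    assert event <= sample_space
    return sum(measure[outcome] for outcome in event)

even = {2, 4, 6}           # the event "the roll is even"
print(prob(even))          # 1/2
print(prob(sample_space))  # 1 (normalization)
print(prob(set()))         # 0
```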


David Slepian
David S. Slepian (June 30, 1923 – November 29, 2007) was an American mathematician. He is best known for his work with algebraic coding theory, probability theory, and distributed source coding. He was colleagues with Claude Shannon and Richard Hamming at Bell Labs. Life and work Born in Pittsburgh, Pennsylvania, he gained a B.Sc. at the University of Michigan before joining the US Army in World War II as a sonic deception officer in the Ghost Army. He received his Ph.D. from Harvard University in 1949, writing his dissertation in physics. After post-doctoral work at the University of Cambridge and the Sorbonne, he worked at the Mathematics Research Center at Bell Telephone Laboratories, where he pioneered work in algebraic coding theory on group codes, first published in the paper ''A Class of Binary Signaling Alphabets''. Here, he also worked along with other information theory giants such as Claude Shannon and Richard Hamming. He also proved the possibili ...



Stationary Process
In mathematics and statistics, a stationary process (or a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time. Consequently, parameters such as mean and variance also do not change over time. If you draw a line through the middle of a stationary process, it should be flat; it may have 'seasonal' cycles, but overall it trends neither up nor down. Since stationarity is an assumption underlying many statistical procedures used in time series analysis, non-stationary data are often transformed to become stationary. The most common cause of violation of stationarity is a trend in the mean, which can be due either to the presence of a unit root or to a deterministic trend. In the former case of a unit root, stochastic shocks have permanent effects, and the process is not mean-reverting. In the latter case of a deterministic trend, the process is call ...
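A small simulation can make the definition concrete. The sketch below (assuming NumPy; the coefficient 0.7 and the sample size are illustrative) draws a stationary AR(1) process started from its stationary distribution, so summary statistics over different halves of the sample stay close, consistent with a joint distribution that does not change under time shifts.

```python
# Sketch: a stationary AR(1) process X_t = phi * X_{t-1} + eps_t with |phi| < 1,
# started from its stationary distribution so the mean/variance do not drift.
import numpy as np

rng = np.random.default_rng(1)
phi, sigma, T = 0.7, 1.0, 10_000

x = np.empty(T)
x[0] = rng.normal(0.0, sigma / np.sqrt(1 - phi**2))  # stationary marginal N(0, sigma^2 / (1 - phi^2))
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)

# Means and variances of the two halves should be close for a stationary sample.
first, second = x[: T // 2], x[T // 2 :]
print(first.mean(), second.mean())
print(first.var(), second.var())
```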



Reliability Theory
Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period of time. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time. The reliability function is theoretically defined as the probability of success at time t, which is denoted R(t). This probability is estimated from detailed (physics of failure) analysis, previous data sets or through reliability testing and reliability modelling. Availability, testability, maintainability and maintenance are often defined as a part of "reliability engineering" in reliability programs. Reliability often plays the key role in the cost-effectiveness of systems. Reliability engineering deals with the prediction, prevention and manageme ...
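As a sketch of how R(t) can be estimated, the snippet below (illustrative only, assuming NumPy) draws failure times from an exponential distribution with an assumed constant failure rate and compares the empirical survival fraction with the corresponding closed form R(t) = exp(-rate * t); in practice the estimate would come from test or field data rather than a simulation.

```python
# Sketch of a reliability function R(t): probability a unit still works at time t.
# Failure times are drawn from an exponential distribution purely for illustration.
import numpy as np

rng = np.random.default_rng(2)
rate = 0.1                                    # assumed failure rate (failures per hour)
failure_times = rng.exponential(1 / rate, size=5_000)

def reliability(t, failures):
    """Empirical R(t): fraction of units whose failure time exceeds t."""
    return np.mean(failures > t)

for t in (1.0, 5.0, 10.0, 20.0):
    # empirical estimate vs. the exponential model's R(t) = exp(-rate * t)
    print(t, reliability(t, failure_times), np.exp(-rate * t))
```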



Extreme Value Theory
Extreme value theory or extreme value analysis (EVA) is a branch of statistics dealing with the extreme deviations from the median of probability distributions. It seeks to assess, from a given ordered sample of a given random variable, the probability of events that are more extreme than any previously observed. Extreme value analysis is widely used in many disciplines, such as structural engineering, finance, earth sciences, traffic prediction, and geological engineering. For example, EVA might be used in the field of hydrology to estimate the probability of an unusually large flooding event, such as the 100-year flood. Similarly, for the design of a breakwater, a coastal engineer would seek to estimate the 50-year wave and design the structure accordingly. Data analysis Two main approaches exist for practical extreme value analysis. The first method relies on deriving block maxima (minima) series as a preliminary step. In many situations it is customary and convenient to e ...
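The block-maxima step can be sketched in a few lines. The snippet below (assuming NumPy and SciPy; the synthetic daily series stands in for real observations) takes annual maxima of a daily record, fits a generalized extreme value (GEV) distribution with scipy.stats.genextreme, and reads off an approximate 100-year return level. None of the numbers refer to an actual dataset.

```python
# Block-maxima sketch: yearly maxima of a daily series, fitted with a GEV distribution.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
daily = rng.gumbel(loc=10.0, scale=2.0, size=(50, 365))  # 50 synthetic "years" of daily values

annual_maxima = daily.max(axis=1)              # block maxima, one per year

c, loc, scale = genextreme.fit(annual_maxima)  # fit GEV to the block maxima
# Level exceeded on average once per 100 years under the fitted model.
level_100yr = genextreme.isf(1 / 100, c, loc=loc, scale=scale)
print(c, loc, scale, level_100yr)
```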