In statistics, the concept of a concomitant, also called the induced order statistic, arises when one sorts the members of a random sample according to corresponding values of another random sample.

Let (''X''<sub>''i''</sub>, ''Y''<sub>''i''</sub>), ''i'' = 1, ..., ''n'', be a random sample from a bivariate distribution. If the sample is ordered by the ''X''<sub>''i''</sub>, then the ''Y''-variate associated with ''X''<sub>''r'':''n''</sub> will be denoted by ''Y''<sub>[''r'':''n'']</sub> and termed the concomitant of the ''r''th order statistic.
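As a concrete illustration (a short sketch, not part of the original text), the concomitants of a finite bivariate sample are obtained simply by sorting the pairs on their first coordinate and reading off the second coordinates; the helper name and the use of NumPy below are assumptions made for the example.

import numpy as np

def concomitants(x, y):
    # Return the Y-values reordered by the ranks of the X-values, so that the
    # r-th entry of the result is the concomitant Y_[r:n] of X_(r:n).
    # (Illustrative helper; the name does not come from the original text.)
    order = np.argsort(np.asarray(x))
    return np.asarray(y)[order]

# Example with a small correlated sample
rng = np.random.default_rng(0)
x = rng.normal(size=5)
y = 0.8 * x + rng.normal(scale=0.6, size=5)
print(concomitants(x, y))   # Y_[1:5], ..., Y_[5:5]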
Suppose the parent bivariate distribution has cumulative distribution function ''F''(''x'', ''y'') and probability density function ''f''(''x'', ''y''). Then the probability density function of the ''r''th concomitant Y_{[r:n]}, for 1 \le r \le n, is

f_{Y_{[r:n]}}(y) = \int_{-\infty}^{\infty} f_{Y \mid X}(y \mid x) \, f_{X_{r:n}}(x) \, \mathrm{d}x .

If all (X_i, Y_i) are assumed to be i.i.d., then for 1 \le r_1 < \cdots < r_k \le n the joint density of \left(Y_{[r_1:n]}, \ldots, Y_{[r_k:n]}\right) is given by

f_{Y_{[r_1:n]}, \ldots, Y_{[r_k:n]}}(y_1, \ldots, y_k) = \int_{-\infty}^{\infty} \int_{-\infty}^{x_k} \cdots \int_{-\infty}^{x_2} \prod_{i=1}^{k} f_{Y \mid X}(y_i \mid x_i) \, f_{X_{r_1:n}, \ldots, X_{r_k:n}}(x_1, \ldots, x_k) \, \mathrm{d}x_1 \cdots \mathrm{d}x_k .

That is, in general, the joint concomitants of order statistics \left(Y_{[r_1:n]}, \ldots, Y_{[r_k:n]}\right) are dependent, but they are conditionally independent given X_{r_1:n} = x_1, \ldots, X_{r_k:n} = x_k for all ''k'', where x_1 \le \cdots \le x_k.
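As a numerical sanity check (a sketch only, under the assumption of a standard bivariate normal parent with correlation rho, which is not specified in the text), the marginal density formula above can be compared with simulation. The sketch uses the classical density of the order statistic X_{r:n} for i.i.d. continuous X_i, namely n!/((r-1)!(n-r)!) F(x)^{r-1} (1-F(x))^{n-r} f(x), together with the fact that for this parent Y given X = x is normal with mean rho*x and variance 1 - rho^2.

import numpy as np
from math import comb
from scipy import stats
from scipy.integrate import quad

n, r, rho = 10, 8, 0.7   # sample size, rank and correlation: illustrative choices only

def f_x_order(x):
    # Density of the r-th order statistic of n i.i.d. N(0,1) variables.
    F, f = stats.norm.cdf(x), stats.norm.pdf(x)
    return r * comb(n, r) * F ** (r - 1) * (1 - F) ** (n - r) * f

def f_concomitant(y):
    # f_{Y_[r:n]}(y) = integral of f_{Y|X}(y|x) * f_{X_{r:n}}(x) dx
    integrand = lambda x: stats.norm.pdf(y, loc=rho * x, scale=np.sqrt(1 - rho ** 2)) * f_x_order(x)
    return quad(integrand, -np.inf, np.inf)[0]

# Monte Carlo: simulate many samples, extract Y_[r:n] and compare with the formula.
rng = np.random.default_rng(1)
reps = 20000
x = rng.normal(size=(reps, n))
y = rho * x + np.sqrt(1 - rho ** 2) * rng.normal(size=(reps, n))
pos = np.argsort(x, axis=1)[:, r - 1]      # index of X_(r:n) within each sample
y_conc = y[np.arange(reps), pos]           # simulated concomitants Y_[r:n]

emp = np.mean(np.abs(y_conc - 0.5) < 0.05) / 0.1   # crude density estimate at y = 0.5
print("empirical density near y = 0.5:", emp)
print("density from the formula at y = 0.5:", f_concomitant(0.5))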
The conditional distribution of the joint concomitants can be derived from the joint density above by comparing it with the formula for the marginal distribution of the order statistics, and hence

f_{Y_{[r_1:n]}, \ldots, Y_{[r_k:n]} \mid X_{r_1:n}, \ldots, X_{r_k:n}}(y_1, \ldots, y_k \mid x_1, \ldots, x_k) = \prod_{i=1}^{k} f_{Y \mid X}(y_i \mid x_i) .
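Spelled out (a brief added sketch of the intermediate step), the comparison rests on the fact that the joint density of the selected order statistics together with their concomitants factors as

f_{X_{r_1:n}, \ldots, X_{r_k:n}, Y_{[r_1:n]}, \ldots, Y_{[r_k:n]}}(x_1, \ldots, x_k, y_1, \ldots, y_k) = \prod_{i=1}^{k} f_{Y \mid X}(y_i \mid x_i) \, f_{X_{r_1:n}, \ldots, X_{r_k:n}}(x_1, \ldots, x_k), \qquad x_1 \le \cdots \le x_k ,

and dividing by the marginal density f_{X_{r_1:n}, \ldots, X_{r_k:n}}(x_1, \ldots, x_k) leaves exactly the product of conditional densities displayed above.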

