Estimating equations

In statistics, the method of estimating equations is a way of specifying how the parameters of a statistical model should be estimated. It can be thought of as a generalisation of many classical methods, including the method of moments, least squares, and maximum likelihood, as well as of some more recent methods such as M-estimators. The basis of the method is to have, or to find, a set of simultaneous equations involving both the sample data and the unknown model parameters, which are solved to define the estimates of the parameters. Various components of the equations are defined in terms of the set of observed data on which the estimates are to be based. Important examples of estimating equations are the likelihood equations.
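
As a concrete illustration (a minimal sketch, not part of the original text), the following Python code solves a generic estimating equation g(theta; data) = 0 numerically by root-finding; the function exponential_score is a hypothetical example implementing the likelihood (score) equation for an exponential sample, whose root is the maximum likelihood estimate.

import numpy as np
from scipy.optimize import brentq

def solve_estimating_equation(g, data, lo, hi):
    # Solve g(theta, data) = 0 for theta by bracketed root-finding.
    return brentq(lambda theta: g(theta, data), lo, hi)

def exponential_score(lam, x):
    # Score equation for an exponential sample:
    # d/d(lambda) log-likelihood = n/lambda - sum(x) = 0
    return len(x) / lam - np.sum(x)

x = np.array([0.5, 1.2, 0.3, 2.1, 0.9])  # illustrative data
lam_hat = solve_estimating_equation(exponential_score, x, 1e-6, 100.0)
print(lam_hat)  # equals 1 / mean(x), the MLE for this model

The same solver applies unchanged to any other estimating equation, which is the practical appeal of framing estimation this way.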


Examples

Consider the problem of estimating the rate parameter, \lambda, of the exponential distribution, which has the probability density function

:f(x;\lambda) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0, \\ 0, & x < 0. \end{cases}

Suppose that a sample of data is available from which either the sample mean, \bar{x}, or the sample median, ''m'', can be calculated. Then an estimating equation based on the mean is

:\bar{x} = \lambda^{-1},

while the estimating equation based on the median is

:m = \lambda^{-1} \ln 2,

since the population median of the exponential distribution is (\ln 2)/\lambda. Each of these equations is derived by equating a sample value (a sample statistic) to its theoretical (population) counterpart. In each case the sample statistic is a consistent estimator of the population value, and this provides an intuitive justification for this type of approach to estimation.
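
A minimal sketch of these two estimators in Python (the sample data and variable names are illustrative, not from the original); each equation is linear in \lambda^{-1}, so both can be solved in closed form:

import numpy as np

x = np.array([0.4, 1.7, 0.2, 0.9, 3.1, 0.6])  # illustrative sample

# Mean-based estimating equation: xbar = 1/lambda  =>  lambda_hat = 1/xbar
lambda_hat_mean = 1.0 / np.mean(x)

# Median-based estimating equation: m = ln(2)/lambda  =>  lambda_hat = ln(2)/m
lambda_hat_median = np.log(2) / np.median(x)

print(lambda_hat_mean, lambda_hat_median)

The two estimates generally differ in finite samples; both converge to the true \lambda as the sample size grows, because the sample mean and sample median are consistent for their population counterparts.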


See also

* Generalized estimating equations
* Method of moments (statistics)
* Generalized method of moments
* Maximum likelihood

