
In statistics, quasi-likelihood methods are used to estimate parameters in a statistical model when exact likelihood methods, for example maximum likelihood estimation, are computationally infeasible. Because the function being maximized is not a true log-likelihood, quasi-likelihood estimators generally lose asymptotic efficiency compared with, for example, maximum likelihood estimators. Under broadly applicable conditions, quasi-likelihood estimators are consistent and asymptotically normal, and their asymptotic covariance matrix can be obtained using the so-called sandwich estimator. Examples of quasi-likelihood methods include generalized estimating equations and pairwise likelihood approaches.
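To make the ideas concrete, here is a minimal NumPy sketch (not from the original article) of quasi-likelihood estimation for a quasi-Poisson model with log link, where the estimating equations are solved by iteratively reweighted least squares and the asymptotic covariance is computed with the sandwich estimator. The function names and the simulated data are illustrative assumptions, not part of the source.

```python
import numpy as np

def fit_quasi_poisson(X, y, n_iter=50, tol=1e-8):
    """Solve the quasi-score equations for a log link with
    variance function V(mu) = mu, via iteratively reweighted
    least squares (the same algorithm used for Poisson GLMs)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        # working response and weights for the log link
        z = eta + (y - mu) / mu
        w = mu
        beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

def sandwich_cov(X, y, beta):
    """Sandwich covariance A^{-1} B A^{-1}: 'bread' A is the
    expected information under V(mu) = mu, 'meat' B is the sum of
    outer products of the score contributions (y_i - mu_i) x_i."""
    mu = np.exp(X @ beta)
    A = X.T @ (mu[:, None] * X)
    r = y - mu
    B = X.T @ ((r ** 2)[:, None] * X)
    A_inv = np.linalg.inv(A)
    return A_inv @ B @ A_inv
```

Because the scale parameter cancels from the quasi-score equations, the point estimates coincide with the Poisson maximum likelihood estimates; only the covariance matrix changes, which is why the sandwich form remains valid under misspecified variance.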


History

The term quasi-likelihood function was introduced by Robert Wedderburn in 1974 to describe a function that has similar properties to the log-likelihood function but is not the log-likelihood corresponding to any actual probability distribution. He proposed fitting certain quasi-likelihood models using a straightforward extension of the algorithms used to fit generalized linear models.


Application to overdispersion modelling

Quasi-likelihood estimation is one way of allowing for overdispersion, that is, greater variability in the data than would be expected from the statistical model used. It is most often used with models for count data or grouped binary data, i.e. data that would otherwise be modelled using the Poisson or binomial distribution. Instead of specifying a probability distribution for the data, only a relationship between the mean and the variance is specified in the form of a variance function giving the variance as a function of the mean. Generally, this function is allowed to include a multiplicative factor known as the overdispersion parameter or scale parameter that is estimated from the data. Most commonly, the variance function is of a form such that fixing the overdispersion parameter at unity results in the variance–mean relationship of an actual probability distribution such as the binomial or Poisson. (For formulae, see the binomial data example and count data example under generalized linear models.)
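A common moment estimate of the overdispersion parameter divides the Pearson chi-squared statistic by the residual degrees of freedom. The following short sketch (an illustration, not from the original article) assumes the quasi-Poisson variance function Var(Y) = φ·μ with fitted means already in hand:

```python
import numpy as np

def pearson_dispersion(y, mu, p):
    """Estimate the overdispersion (scale) parameter phi as the
    Pearson statistic divided by the residual degrees of freedom,
    assuming the quasi-Poisson variance function Var(Y) = phi * mu.
    y: observed counts; mu: fitted means; p: number of fitted
    regression parameters."""
    pearson = (y - mu) ** 2 / mu
    return pearson.sum() / (len(y) - p)
```

For data that are genuinely Poisson the estimate is close to 1, while overdispersed counts (e.g. negative binomial) yield an estimate above 1, inflating the reported standard errors accordingly.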


Comparison to alternatives

Random-effects models, and more generally mixed models (hierarchical models), provide an alternative method of fitting data exhibiting overdispersion using fully specified probability models. However, these methods often become complex and computationally intensive to fit to binary or count data. Quasi-likelihood methods have the advantage of relative computational simplicity, speed and robustness, as they can make use of the more straightforward algorithms developed to fit generalized linear models.


See also

* Quasi-maximum likelihood estimate
* Extremum estimator

