Heckman correction

The Heckman correction is a statistical technique to correct bias from non-randomly selected samples or otherwise incidentally truncated dependent variables, a pervasive issue in quantitative social sciences when using observational data. Conceptually, this is achieved by explicitly modelling the individual sampling probability of each observation (the so-called selection equation) together with the conditional expectation of the dependent variable (the so-called outcome equation). The resulting likelihood function is mathematically similar to the tobit model for censored dependent variables, a connection first drawn by James Heckman in 1974. Heckman also developed a two-step control function approach to estimate this model, which avoids the computational burden of having to estimate both equations jointly, albeit at the cost of inefficiency. Heckman received the Nobel Memorial Prize in Economic Sciences in 2000 for his work in this field.


Method

Statistical analyses based on non-randomly selected samples can lead to erroneous conclusions. The Heckman correction, a two-step statistical approach, offers a means of correcting for non-randomly selected samples. Heckman discussed the bias from using non-randomly selected samples to estimate behavioral relationships as a specification error, and he suggested a two-stage estimation method to correct it. The correction uses a control function idea and is easy to implement. Heckman's correction involves a normality assumption, provides a test for sample selection bias, and yields a formula for the bias-corrected model.

Suppose that a researcher wants to estimate the determinants of wage offers, but has access to wage observations only for those who work. Since people who work are selected non-randomly from the population, estimating the determinants of wages from the subpopulation who work may introduce bias. The Heckman correction takes place in two stages.

In the first stage, the researcher formulates a model, based on economic theory, for the probability of working. The canonical specification for this relationship is a probit regression of the form

: \operatorname{P}(D = 1 \mid Z) = \Phi(Z\gamma),

where ''D'' indicates employment (''D'' = 1 if the respondent is employed and ''D'' = 0 otherwise), ''Z'' is a vector of explanatory variables, \gamma is a vector of unknown parameters, and \Phi is the cumulative distribution function of the standard normal distribution. Estimation of the model yields results that can be used to predict this employment probability for each individual.
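To make the first stage concrete, the following is a minimal sketch in Python (using numpy and statsmodels; the simulated variables, coefficient values, and names are assumptions of this illustration, not part of the original exposition). It generates a toy wage-offer data set with correlated selection and outcome errors, fits the probit selection equation, and recovers the predicted employment probabilities along with the linear index Z\gamma that feeds the inverse Mills ratio later on.

```python
# A minimal, illustrative sketch of the first-stage probit (not a reference implementation).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000

# Toy data: z1 shifts employment only, x1 shifts wages; the selection error eps and the
# wage error u are correlated, which is exactly what creates the selection problem.
z1 = rng.normal(size=n)
x1 = rng.normal(size=n)
eps, u = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=n).T

Z = sm.add_constant(np.column_stack([x1, z1]))                  # selection-equation covariates
X = sm.add_constant(x1)                                         # wage-equation covariates
d = (Z @ np.array([0.2, 0.5, 1.0]) + eps > 0).astype(int)       # D = 1 if employed
wage = np.where(d == 1, X @ np.array([1.0, 2.0]) + u, np.nan)   # observed only if employed

# First stage: probit of employment on Z.
probit = sm.Probit(d, Z).fit(disp=0)
gamma_hat = probit.params
employment_prob = probit.predict(Z)   # Phi(Z gamma-hat), predicted employment probability
linear_index = Z @ gamma_hat          # Z gamma-hat, the input to the inverse Mills ratio below
```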
In the second stage, the researcher corrects for self-selection by incorporating a transformation of these predicted individual probabilities as an additional explanatory variable. The wage equation may be specified as

: w^* = X\beta + u,

where w^* denotes an underlying wage offer, which is not observed if the respondent does not work. The conditional expectation of wages given that the person works is then

: E[w \mid X, D=1] = X\beta + E[u \mid X, D=1].

Under the assumption that the error terms are jointly normal, we have

: E[w \mid X, D=1] = X\beta + \rho\sigma_u \lambda(Z\gamma),

where ''ρ'' is the correlation between the unobserved determinants of the propensity to work \varepsilon and the unobserved determinants of wage offers ''u'', \sigma_u is the standard deviation of u, and \lambda is the inverse Mills ratio evaluated at Z\gamma. This equation demonstrates Heckman's insight that sample selection can be viewed as a form of omitted-variables bias, as conditional on both ''X'' and on \lambda it is as if the sample is randomly selected. The wage equation can be estimated by replacing \gamma with probit estimates from the first stage, constructing the \lambda term, and including it as an additional explanatory variable in linear regression estimation of the wage equation. Since \sigma_u > 0, the coefficient on \lambda can only be zero if \rho = 0, so testing the null that the coefficient on \lambda is zero is equivalent to testing for sample selectivity.

Heckman's achievements have generated a large number of empirical applications in economics as well as in other social sciences. The original method has subsequently been generalized, by Heckman and by others.
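Putting the two stages together, the following hedged sketch (again in Python; heckman_two_step and the simulated data are hypothetical names introduced only for this illustration) fits the probit, constructs \lambda = \phi(Z\hat\gamma)/\Phi(Z\hat\gamma), runs the augmented wage regression on the selected sample, and reads off the t-statistic on \lambda as the selectivity test described above.

```python
# Illustrative two-step Heckman correction; a sketch under the assumptions stated above.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(d, Z, y, X):
    """d: 0/1 selection indicator; Z: selection covariates; y: outcome, NaN where d == 0;
    X: outcome covariates. Returns the second-stage OLS fit (last regressor is lambda)."""
    # Step 1: probit selection equation, then lambda = phi(Z g) / Phi(Z g).
    gamma_hat = sm.Probit(d, Z).fit(disp=0).params
    index = Z @ gamma_hat
    mills = norm.pdf(index) / norm.cdf(index)

    # Step 2: OLS on the selected sample with lambda added as an extra regressor.
    sel = d == 1
    X_aug = np.column_stack([X[sel], mills[sel]])
    return sm.OLS(y[sel], X_aug).fit()

# Toy data with correlated errors (rho = 0.5), built as in the first-stage sketch above.
rng = np.random.default_rng(0)
n = 5_000
z1, x1 = rng.normal(size=n), rng.normal(size=n)
eps, u = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=n).T
Z = sm.add_constant(np.column_stack([x1, z1]))
X = sm.add_constant(x1)
d = (Z @ np.array([0.2, 0.5, 1.0]) + eps > 0).astype(int)
y = np.where(d == 1, X @ np.array([1.0, 2.0]) + u, np.nan)

res = heckman_two_step(d, Z, y, X)
print(res.params)        # [beta_const, beta_x1, coefficient on lambda = rho * sigma_u]
print(res.tvalues[-1])   # t-statistic on lambda, the test for sample selectivity
```

The coefficient reported on \lambda estimates \rho\sigma_u; for inference on the remaining coefficients, the second-stage standard errors need the corrections discussed in the next section.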


Statistical inference

The Heckman correction is a two-step M-estimator where the covariance matrix generated by OLS estimation of the second stage is inconsistent. Correct standard errors and other statistics can be generated from an asymptotic approximation or by resampling, such as through a bootstrap.
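As one concrete way to obtain resampling-based standard errors, the whole two-step procedure can be bootstrapped: redraw observations with replacement, re-estimate both stages on each draw, and take the spread of the replicated coefficients. A minimal Python sketch follows; the helper two_step_coefs, the 500 replications, and the data names are assumptions of this illustration.

```python
# Bootstrap standard errors for the two-step estimator (illustrative sketch only).
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def two_step_coefs(d, Z, y, X):
    """Both stages in one pass; returns the second-stage coefficients (lambda's is last)."""
    gamma_hat = sm.Probit(d, Z).fit(disp=0).params
    index = Z @ gamma_hat
    mills = norm.pdf(index) / norm.cdf(index)
    sel = d == 1
    return sm.OLS(y[sel], np.column_stack([X[sel], mills[sel]])).fit().params

def bootstrap_se(d, Z, y, X, n_boot=500, seed=0):
    """Resample whole observations with replacement, redo BOTH stages on each draw,
    and report the standard deviation of the replicated coefficients."""
    rng = np.random.default_rng(seed)
    n = len(d)
    draws = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        draws.append(two_step_coefs(d[idx], Z[idx], y[idx], X[idx]))
    return np.std(draws, axis=0, ddof=1)

# Usage, with d, Z, y, X constructed as in the earlier sketches:
# se = bootstrap_se(d, Z, y, X)
```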


Disadvantages

* The two-step estimator discussed above is a limited information maximum likelihood (LIML) estimator. In asymptotic theory and in finite samples as demonstrated by Monte Carlo simulations, the full information maximum likelihood (FIML) estimator exhibits better statistical properties. However, the FIML estimator is more computationally difficult to implement (a sketch of the corresponding full likelihood appears after this list).
* The canonical model assumes the errors are jointly normal. If that assumption fails, the estimator is generally inconsistent and can provide misleading inference in small samples. Semiparametric and other robust alternatives can be used in such cases.
* The model obtains formal identification from the normality assumption when the same covariates appear in the selection equation and the equation of interest, but identification will be tenuous unless there are many observations in the tails where there is substantial nonlinearity in the inverse Mills ratio. Generally, an exclusion restriction is required to generate credible estimates: there must be at least one variable which appears with a non-zero coefficient in the selection equation but does not appear in the equation of interest, essentially an instrument. If no such variable is available, it may be difficult to correct for sample selectivity.
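For reference, under the joint-normality assumption the full-information log-likelihood combines \ln\Phi(-Z\gamma) for non-participants with \ln\big[\sigma^{-1}\phi\big((w - X\beta)/\sigma\big)\big] + \ln\Phi\big((Z\gamma + \rho(w - X\beta)/\sigma)/\sqrt{1-\rho^2}\big) for participants. A hedged numerical sketch in Python with scipy.optimize follows; the parameter transformations, starting values, and the fiml_fit name are choices of this illustration, not a reference implementation.

```python
# Illustrative FIML estimation of the selection model via scipy.optimize (a sketch).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def fiml_fit(d, Z, y, X):
    """Jointly estimate (beta, gamma, sigma, rho) by maximizing the full log-likelihood.
    sigma is parameterized as exp(s) and rho as tanh(r) to keep them in range."""
    kx, kz = X.shape[1], Z.shape[1]
    sel = d == 1

    def neg_loglik(theta):
        beta, gamma = theta[:kx], theta[kx:kx + kz]
        sigma, rho = np.exp(theta[-2]), np.tanh(theta[-1])
        zg = Z @ gamma
        # Non-participants contribute Pr(D = 0) = Phi(-Z gamma).
        ll0 = norm.logcdf(-zg[~sel])
        # Participants contribute the wage density times Pr(D = 1 | wage residual).
        resid = (y[sel] - X[sel] @ beta) / sigma
        ll1 = (norm.logpdf(resid) - np.log(sigma)
               + norm.logcdf((zg[sel] + rho * resid) / np.sqrt(1.0 - rho ** 2)))
        return -(ll0.sum() + ll1.sum())

    # Starting from the two-step estimates is better in practice; zeros are used for brevity.
    theta0 = np.zeros(kx + kz + 2)
    out = minimize(neg_loglik, theta0, method="BFGS")
    beta, gamma = out.x[:kx], out.x[kx:kx + kz]
    return beta, gamma, np.exp(out.x[-2]), np.tanh(out.x[-1])

# Usage, with d, Z, y, X constructed as in the earlier sketches:
# beta_hat, gamma_hat, sigma_hat, rho_hat = fiml_fit(d, Z, y, X)
```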


Implementations in statistics packages

* R: Heckman-type procedures are available as part of the sampleSelection package.
* Stata: the command heckman provides the Heckman selection model.


See also

* Propensity score matching
* Roy model




External links


* Nobel prize Heckman facts.