The Berndt–Hall–Hall–Hausman (BHHH) algorithm is a numerical optimization algorithm similar to the Newton–Raphson algorithm, but it replaces the observed negative Hessian matrix with the outer product of the gradient. This approximation is based on the information matrix equality and is therefore only valid while maximizing a likelihood function. The BHHH algorithm is named after the four originators: Ernst R. Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.
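For reference, the information matrix equality invoked here states that, at the true parameter value and under correct model specification, the expected outer product of the score equals the negative expected Hessian of the log-likelihood:

:\mathrm{E}\!\left[\frac{\partial \ln L}{\partial \beta}\frac{\partial \ln L}{\partial \beta}'\right] = -\mathrm{E}\!\left[\frac{\partial^2 \ln L}{\partial \beta \, \partial \beta'}\right]

This is why the outer product of gradients is a valid stand-in for the negative Hessian near the maximum of a correctly specified likelihood, but not for general objective functions.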


Usage

If a nonlinear model is fitted to the data, one often needs to estimate coefficients through optimization. A number of optimization algorithms have the following general structure. Suppose that the function to be optimized is ''Q''(''β''). Then the algorithms are iterative, defining a sequence of approximations, ''βk'', given by

:\beta_{k+1} = \beta_k - \lambda_k A_k \frac{\partial Q}{\partial \beta}(\beta_k) ,

where \beta_k is the parameter estimate at step k, and \lambda_k is a parameter (called step size) which partly determines the particular algorithm. For the BHHH algorithm ''λk'' is determined by calculations within a given iterative step, involving a line search until a point ''βk''+1 is found satisfying certain criteria. In addition, for the BHHH algorithm, ''Q'' has the form

:Q = \sum_{i=1}^{N} Q_i

and ''A'' is calculated using

:A_k = \left[ \sum_{i=1}^{N} \frac{\partial \ln Q_i}{\partial \beta}(\beta_k) \frac{\partial \ln Q_i}{\partial \beta}(\beta_k)' \right]^{-1} .

In other cases, e.g. Newton–Raphson, A_k can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.


See also

* Davidon–Fletcher–Powell (DFP) algorithm
* Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm


Further reading

* V. Martin, S. Hurn, and D. Harris, ''Econometric Modelling with Time Series'', Chapter 3 'Numerical Estimation Methods'. Cambridge University Press, 2015.