
Least trimmed squares (LTS), or least trimmed sum of squares, is a robust statistical method that fits a function to a set of data whilst not being unduly affected by the presence of outliers. It is one of a number of methods for robust regression.


Description of method

Instead of the standard least squares method, which minimises the sum of squared residuals over all ''n'' points, the LTS method attempts to minimise the sum of squared residuals over a subset, ''k'', of those points. The unused ''n'' − ''k'' points do not influence the fit.

In a standard least squares problem, the estimated parameter values β are defined to be those values that minimise the objective function ''S''(β) of squared residuals:

:S(\beta) = \sum_{i=1}^n r_i(\beta)^2,

where the residuals are defined as the differences between the values of the dependent variables (observations) and the model values:

:r_i(\beta) = y_i - f(x_i, \beta),

and where ''n'' is the overall number of data points.

For a least trimmed squares analysis, this objective function is replaced by one constructed in the following way. For a fixed value of β, let r_{(j)}(\beta) denote the absolute values of the residuals ordered in increasing order. In this notation, the standard sum of squares function is

:S(\beta) = \sum_{j=1}^n r_{(j)}(\beta)^2,

while the objective function for LTS is

:S_k(\beta) = \sum_{j=1}^k r_{(j)}(\beta)^2.
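As an illustration, the LTS objective for a linear model can be computed directly from this definition: square the residuals, sort them, and sum only the ''k'' smallest. The Python sketch below shows this (the function name lts_objective and the toy data are illustrative, not part of any standard library), together with how a single gross outlier inflates the ordinary sum of squares but is trimmed away by the LTS objective.

import numpy as np

def lts_objective(beta, X, y, k):
    """Sum of the k smallest squared residuals for the linear model y ~ X @ beta.

    Direct transcription of the LTS objective S_k(beta): square the residuals,
    sort them in increasing order, and sum only the smallest k.
    """
    residuals = y - X @ beta            # r_i(beta) = y_i - f(x_i, beta)
    squared = np.sort(residuals ** 2)   # squared residuals in increasing order
    return squared[:k].sum()            # the n - k largest are trimmed away

# Toy example: a straight line through five well-behaved points plus one outlier.
X = np.column_stack([np.ones(6), np.arange(6.0)])
y = np.array([0.1, 1.0, 2.1, 2.9, 4.0, 50.0])   # last observation is an outlier
beta = np.array([0.0, 1.0])                     # intercept 0, slope 1
print(lts_objective(beta, X, y, k=5))           # about 0.03: the outlier is trimmed
print(((y - X @ beta) ** 2).sum())              # about 2025: the outlier dominates OLS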


Computational considerations

Because this method is binary, in that points are either included or excluded, no closed-form solution exists. As a result, methods for finding the LTS solution sift through subsets of the data, attempting to find the subset of ''k'' points whose fit yields the lowest sum of squared residuals. Exact methods exist for small ''n''; however, as ''n'' rises, the number of possible subsets grows combinatorially, so practical algorithms instead seek approximate (but generally sufficient) solutions.
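To make the search strategy concrete, the Python sketch below implements one common approximation scheme: fit ordinary least squares to a small random subset, then apply "concentration" steps that refit on the ''k'' points with the smallest squared residuals until the trimmed objective stops improving, keeping the best result over many random starts. This is an illustrative sketch of that general idea only, not a definitive or exact LTS algorithm; the names lts_fit_approx, n_starts and n_csteps are made up for the example.

import numpy as np

def lts_fit_approx(X, y, k, n_starts=50, n_csteps=20, rng=None):
    """Approximate LTS fit for a linear model via random restarts plus
    concentration steps: refit on the current k best-fitting points until
    the trimmed sum of squares stops improving. Illustrative sketch only.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        # Start from an ordinary least-squares fit on a small random subset.
        idx = rng.choice(n, size=p, replace=False)
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        for _ in range(n_csteps):
            # Concentration step: keep the k points with the smallest squared
            # residuals under the current beta, then refit on just those points.
            r2 = (y - X @ beta) ** 2
            keep = np.argsort(r2)[:k]
            new_beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
            if np.allclose(new_beta, beta):
                break
            beta = new_beta
        obj = np.sort((y - X @ beta) ** 2)[:k].sum()
        if obj < best_obj:
            best_beta, best_obj = beta, obj
    return best_beta, best_obj

Each concentration step can only decrease the trimmed sum of squares, so the inner loop terminates, but it may stop at a local optimum; the many random starts are an attempt to compensate for that.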

