Symmetric Mean Absolute Percentage Error

Symmetric mean absolute percentage error (SMAPE or sMAPE) is an accuracy measure based on percentage (or relative) errors. It is usually defined as follows:

: \text{SMAPE} = \frac{100\%}{n} \sum_{t=1}^n \frac{|F_t - A_t|}{(|A_t| + |F_t|)/2}

where ''A_t'' is the actual value and ''F_t'' is the forecast value. The absolute difference between ''A_t'' and ''F_t'' is divided by half the sum of the absolute values of the actual value ''A_t'' and the forecast value ''F_t''. The value of this calculation is summed over every fitted point ''t'' and divided by the number of fitted points ''n''.

The earliest reference to a similar formula appears to be Armstrong (1985, p. 348), where it is called "adjusted MAPE" and is defined without the absolute values in the denominator. It was later discussed, modified and re-proposed by Flores (1986). Armstrong's original definition is as follows:

: \text{SMAPE} = \frac{1}{n} \sum_{t=1}^n \frac{|F_t - A_t|}{(A_t + F_t)/2}

The problem with this definition is that it can be negative (if A_t + F_t < 0) or even undefined (if A_t + F_t = 0). Therefore the currently accepted version of SMAPE assumes absolute values in the denominator.

In contrast to the mean absolute percentage error, SMAPE has both a lower bound and an upper bound: the formula above yields a result between 0% and 200%. However, a percentage error between 0% and 100% is much easier to interpret. That is why the formula below, without the factor 0.5 in the denominator, is often used in practice:

: \text{SMAPE} = \frac{100\%}{n} \sum_{t=1}^n \frac{|F_t - A_t|}{|A_t| + |F_t|}

In the above formula, if A_t = F_t = 0, the ''t''-th term in the summation is taken as 0, since the percentage error between the two is clearly 0 even though the value of \frac{0}{0} is undefined.

One supposed problem with SMAPE is that it is not symmetric, since over- and under-forecasts are not treated equally. This is illustrated by the following example, applying the second SMAPE formula:
* Over-forecasting: ''A_t'' = 100 and ''F_t'' = 110 give SMAPE = 4.76%
* Under-forecasting: ''A_t'' = 100 and ''F_t'' = 90 give SMAPE = 5.26%.
However, one should only expect this type of symmetry for measures which are entirely difference-based and not relative (such as mean squared error and mean absolute deviation).
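The first two formulas, and the asymmetry example above, can be sketched in Python (an illustrative implementation, not taken from the cited sources; the function name and the `half_denominator` flag are chosen here for clarity):

```python
def smape(actual, forecast, half_denominator=True):
    """Symmetric mean absolute percentage error, in percent.

    half_denominator=True uses the 0-200% formula, where each term's
    denominator is (|A_t| + |F_t|)/2; False uses the 0-100% variant
    with denominator |A_t| + |F_t|.
    By convention, terms with A_t = F_t = 0 contribute 0.
    """
    n = len(actual)
    total = 0.0
    for a, f in zip(actual, forecast):
        denom = abs(a) + abs(f)
        if denom == 0:
            continue  # 0/0 term counted as 0 by convention
        if half_denominator:
            denom /= 2
        total += abs(f - a) / denom
    return 100.0 * total / n

# Asymmetry example from the text (second formula):
over = smape([100], [110], half_denominator=False)   # ≈ 4.76
under = smape([100], [90], half_denominator=False)    # ≈ 5.26
```

Note that for a single point the first formula simply doubles the second, so the same example gives roughly 9.52% and 10.53% on the 0–200% scale.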
There is a third version of SMAPE, which makes it possible to measure the direction of the bias in the data by generating a positive or a negative error at line-item level. It is also better protected against outliers and the bias effect mentioned in the previous paragraph than the other two formulas. The formula is:

: \text{SMAPE} = \frac{\sum_{t=1}^n |F_t - A_t|}{\sum_{t=1}^n (A_t + F_t)}

A limitation of SMAPE is that if the actual value or forecast value is 0, the error jumps to its upper limit (200% for the first formula and 100% for the second formula).

Provided the data are strictly positive, a better measure of relative accuracy can be obtained based on the log of the accuracy ratio, log(''F_t'' / ''A_t''). This measure is easier to analyse statistically, and has valuable symmetry and unbiasedness properties. When used in constructing forecasting models, the resulting prediction corresponds to the geometric mean (Tofallis, 2015).
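The third formula and the log accuracy ratio can be sketched as follows (an illustrative sketch, assuming strictly positive data for the log ratio; function names are chosen here, not drawn from the sources):

```python
import math

def smape_third(actual, forecast):
    """Third SMAPE version: total absolute error divided by the
    total of (A_t + F_t); the signed per-item errors F_t - A_t
    carry the direction of the bias."""
    num = sum(abs(f - a) for a, f in zip(actual, forecast))
    den = sum(a + f for a, f in zip(actual, forecast))
    return num / den

def log_accuracy_ratio(actual, forecast):
    """log(F_t / A_t) for each point; requires strictly positive data.
    Swapping forecast and actual only flips the sign, which is the
    symmetry property mentioned in the text."""
    return [math.log(f / a) for a, f in zip(actual, forecast)]
```

Exponentiating the mean of the log ratios recovers the geometric mean of the accuracy ratios, the quantity to which predictions correspond when this measure is used in model fitting.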


See also

* Relative change and difference
* Mean absolute error
* Mean absolute percentage error
* Mean squared error
* Root mean squared error


References

* Armstrong, J. S. (1985) ''Long-range Forecasting: From Crystal Ball to Computer'', 2nd ed. Wiley.
* Flores, B. E. (1986) "A pragmatic view of accuracy measurement in forecasting", ''Omega'', 14(2), 93–98.
* Tofallis, C. (2015) "A Better Measure of Relative Prediction Accuracy for Model Selection and Model Estimation", ''Journal of the Operational Research Society'', 66(8), 1352–1362 (archived preprint available).


External links


Rob J. Hyndman: Errors on Percentage Errors