Moving horizon estimation (MHE) is an optimization approach that uses a series of measurements observed over time, containing noise (random variations) and other inaccuracies, and produces estimates of unknown variables or parameters. Unlike deterministic approaches, MHE requires an iterative approach that relies on linear programming or nonlinear programming solvers to find a solution. MHE reduces to the Kalman filter under certain simplifying conditions. A critical evaluation of the extended Kalman filter and MHE found that MHE improved performance at the cost of increased computational expense. Because of this expense, MHE has generally been applied to systems with greater computational resources and moderate to slow system dynamics, although several methods for accelerating it have been reported in the literature.
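In the research literature the finite-horizon problem solved at each step is commonly written as a constrained least-squares program. The notation below is illustrative rather than taken from this article: f and h denote the process and measurement models, w_k the process noise, \bar{x}_{t-T} the prior estimate at the start of the window, and P, Q, R weighting matrices:

\min_{x_{t-T},\, w_{t-T}, \ldots, w_{t-1}} \;\; \left\| x_{t-T} - \bar{x}_{t-T} \right\|_{P^{-1}}^{2} \;+\; \sum_{k=t-T}^{t} \left\| y_k - h(x_k) \right\|_{R^{-1}}^{2} \;+\; \sum_{k=t-T}^{t-1} \left\| w_k \right\|_{Q^{-1}}^{2}

subject to x_{k+1} = f(x_k, u_k) + w_k and any low/high limits on states and parameters. For a linear model with quadratic costs, no inequality constraints, and a suitably chosen first ("arrival cost") term, the minimizer coincides with the Kalman filter estimate, which is the sense in which MHE reduces to the Kalman filter.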


Overview

The application of MHE is generally to estimate measured or unmeasured states of dynamical systems. Initial conditions and parameters within a model are adjusted by MHE to align measured and predicted values. MHE is based on a finite-horizon optimization of a process model and measurements. At time t the current process state is sampled and a minimizing strategy is computed (via a numerical minimization algorithm) over a relatively short time horizon in the past, [t-T, t]. Specifically, an online or on-the-fly calculation is used to explore state trajectories that find (via the solution of Euler–Lagrange equations) an objective-minimizing strategy up to time t. Only the last step of the estimation strategy is used; then the process state is sampled again and the calculations are repeated starting from the time-shifted states, yielding a new state path and predicted parameters. The estimation horizon keeps being shifted forward, and for this reason the technique is called moving horizon estimation. Although this approach is not optimal, in practice it has given very good results when compared with the Kalman filter and other estimation strategies.
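The fragment below is a minimal, self-contained sketch of this receding-horizon loop for an assumed scalar linear process x_{k+1} = a x_k + w_k with measurement y_k = x_k + v_k; the model, noise levels, horizon length T and the use of scipy.optimize.minimize as the numerical minimizer are illustrative choices, not prescribed by the article. At every sampling instant the newest measurement enters the window, the oldest one drops out, the horizon cost is re-minimized, and only the most recent estimated state is reported.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
a, T, n_steps = 0.95, 10, 50        # model coefficient, horizon length, run length
q, r = 0.05, 0.3                    # process / measurement noise standard deviations

def horizon_cost(x_win, y_win, x_prior):
    """Prior (arrival) term + measurement terms + process-noise terms over the window."""
    arrival = (x_win[0] - x_prior) ** 2 / q**2
    meas = np.sum((y_win - x_win) ** 2) / r**2
    proc = np.sum((x_win[1:] - a * x_win[:-1]) ** 2) / q**2
    return arrival + meas + proc

x_true, x_prior = 1.0, 1.0
window, estimates = [], []
for k in range(n_steps):
    x_true = a * x_true + q * rng.standard_normal()     # simulate the plant
    y = x_true + r * rng.standard_normal()              # take a noisy measurement
    window.append(y)
    if len(window) > T:
        window.pop(0)                                    # shift the horizon forward
    y_win = np.array(window)
    res = minimize(horizon_cost, x0=y_win, args=(y_win, x_prior))
    x_win = res.x
    estimates.append(x_win[-1])                          # only the newest state is used
    x_prior = x_win[1] if len(x_win) > 1 else x_win[0]   # prior for the next window

print("final estimate vs. true state:", estimates[-1], x_true)
```

In a practical application the unconstrained scipy call would be replaced by a constrained linear or nonlinear programming solver so that low/high limits on states and parameters can be enforced, which is also where most of the computational expense mentioned above arises.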


Principles of MHE

Moving horizon estimation (MHE) is a multivariable estimation algorithm that uses:
* an internal dynamic model of the process,
* a history of past measurements, and
* an optimization cost function J over the estimation horizon
to calculate the optimum states and parameters.

The estimation cost function is given by:

J = \sum_{i=1}^{N} w_{x_i} (x_i - y_i)^2 + \sum_{i=1}^{N} w_{\hat{x}_i} (x_i - \hat{x}_i)^2 + \sum_{i=1}^{N} w_{p_i} \, \Delta p_i^2

minimized without violating state or parameter constraints (low/high limits), with:

* x_i = ''i''-th model predicted variable (e.g. predicted temperature)
* y_i = ''i''-th measured variable (e.g. measured temperature)
* \hat{x}_i = ''i''-th prior model prediction
* p_i = ''i''-th estimated parameter (e.g. heat transfer coefficient), with \Delta p_i its change between successive estimates
* w_{x_i} = weighting coefficient reflecting the relative importance of the measured values y_i
* w_{\hat{x}_i} = weighting coefficient reflecting the relative importance of the prior model predictions \hat{x}_i
* w_{p_i} = weighting coefficient penalizing large changes in p_i

Moving horizon estimation uses a sliding time window. At each sampling time the window moves one step forward. The states in the window are estimated by analyzing the measured output sequence, and the estimated state that drops out of the window is used as prior knowledge for the next estimation.
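As an illustration of how the three terms of J are assembled, the short sketch below maps the symbols above onto arrays over an estimation horizon of N samples; the array names and example numbers are invented for the example, and only the structure of the sum follows the expression given here.

```python
import numpy as np

def mhe_cost(x, y, x_hat, dp, w_x, w_xhat, w_p):
    """J = sum w_x*(x-y)^2 + sum w_xhat*(x-x_hat)^2 + sum w_p*dp^2,
    evaluated over an estimation horizon of N samples (length-N arrays)."""
    return (np.sum(w_x * (x - y) ** 2)
            + np.sum(w_xhat * (x - x_hat) ** 2)
            + np.sum(w_p * dp ** 2))

# Illustrative horizon of N = 4 samples (values invented for the example):
x     = np.array([20.1, 20.4, 20.9, 21.3])   # model-predicted temperatures
y     = np.array([20.0, 20.5, 21.0, 21.2])   # measured temperatures
x_hat = np.array([20.2, 20.3, 20.8, 21.4])   # prior model predictions
dp    = np.array([0.00, 0.01, -0.02, 0.00])  # changes in the estimated parameter
w_x, w_xhat, w_p = 1.0, 0.5, 10.0            # relative weights

print(mhe_cost(x, y, x_hat, dp, w_x, w_xhat, w_p))
# An NLP solver would adjust x (and the parameters behind dp) to minimize this
# value subject to the low/high limits on states and parameters.
```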


Applications

* MATLAB, Python, and Simulink source code for MHE (Python, MATLAB, and Simulink CSTR example)
* Monitoring of industrial process fouling
* Oil and gas industry
* Polymer manufacture
* Unmanned aerial systems


See also

* Alpha beta filter
* Data assimilation
* Ensemble Kalman filter
* Extended Kalman filter
* Invariant extended Kalman filter
* Fast Kalman filter
* Filtering problem (stochastic processes)
* Kernel adaptive filter
* Non-linear filter
* Particle filter
* Predictor corrector
* Recursive least squares
* Schmidt–Kalman filter
* Sliding mode control
* Wiener filter


References




External links



* MHE Tutorial in Simulink and MATLAB
* MHE lecture material
* Online Course: MHE in Simulink, MATLAB and Python