Robust Principal Component Analysis (RPCA) is a modification of the widely used statistical procedure of
principal component analysis (PCA) which works well with respect to ''grossly'' corrupted observations. A number of different approaches exist for Robust PCA, including an idealized version of Robust PCA, which aims to recover a low-rank matrix L0 from highly corrupted measurements M = L0 + S0, where S0 is a sparse corruption matrix.
This decomposition into low-rank and sparse matrices can be achieved by techniques such as the Principal Component Pursuit (PCP) method, Stable PCP, Quantized PCP, Block-based PCP, and Local PCP. The resulting optimization problems are then solved with methods such as the Augmented Lagrange Multiplier method (ALM), the Alternating Direction Method (ADM), Fast Alternating Minimization (FAM), Iteratively Reweighted Least Squares (IRLS), or alternating projections (AP).
Algorithms
Non-convex method
The 2014 guaranteed algorithm for the robust PCA problem (with the input matrix being M = L0 + S0) is an alternating minimization type algorithm. The computational complexity is O(r^2 mn log(1/ε)), where the input is the superposition of a low-rank matrix (of rank r) and a sparse matrix, both of dimension m × n, and ε is the desired accuracy of the recovered solution, i.e., ||L̂ − L0|| ≤ ε, where L0 is the true low-rank component and L̂ is the estimated or recovered low-rank component. Intuitively, this algorithm performs projections of the residual onto the set of low-rank matrices (via the SVD operation) and onto the set of sparse matrices (via entry-wise hard thresholding) in an alternating manner: a low-rank projection of the difference between the input matrix and the sparse matrix obtained at a given iteration is followed by a sparse projection of the difference between the input matrix and the low-rank matrix obtained in the previous step, and the two steps are iterated until convergence.
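The alternating scheme above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' reference implementation: the fixed hard threshold `thresh` and the iteration count are simplifying assumptions (the guaranteed algorithm uses a carefully decreasing threshold schedule).

```python
import numpy as np

def rpca_altproj(M, rank, thresh, n_iter=50):
    """Sketch of alternating projections for RPCA: M ~ L + S.

    rank   -- target rank of the low-rank component L
    thresh -- entry-wise hard threshold for the sparse component S
    """
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank projection of the residual M - S via truncated SVD.
        U, sig, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * sig[:rank]) @ Vt[:rank]
        # Sparse projection of the residual M - L via entry-wise hard thresholding.
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)
    return L, S
```

On a synthetic low-rank-plus-sparse input with gross corruptions well above the threshold, a few dozen iterations typically recover the low-rank component to good accuracy.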
This alternating projections algorithm was later improved by an accelerated version, coined AccAltProj. The acceleration is achieved by applying a tangent space projection before projecting the residual onto the set of low-rank matrices. This trick reduces the computational complexity, with a much smaller constant in front, while maintaining the theoretically guaranteed linear convergence.
Another fast version of the accelerated alternating projections algorithm is IRCUR. It uses the structure of the CUR decomposition within the alternating projections framework, operating only on a small subset of rows and columns of the residual, which dramatically reduces the computational complexity of RPCA.
Convex relaxation
This method consists of relaxing the rank constraint in the optimization problem to the nuclear norm ||L||* and the sparsity constraint to the ℓ1-norm ||S||1. The resulting convex program, minimize ||L||* + λ||S||1 subject to L + S = M, can be solved using methods such as the method of Augmented Lagrange Multipliers.
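A basic augmented Lagrange multiplier loop for this convex program alternates singular value thresholding (the proximal step for the nuclear norm) with entry-wise soft thresholding (the proximal step for the ℓ1-norm) and a dual update. The sketch below keeps the penalty parameter `mu` fixed for simplicity; practical implementations such as inexact ALM grow it across iterations. The defaults for `lam` and `mu` follow common heuristics and are assumptions, not prescriptions.

```python
import numpy as np

def pcp_alm(M, lam=None, mu=None, n_iter=200):
    """Principal Component Pursuit via a basic augmented Lagrangian loop."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))   # standard PCP weight
    mu = mu if mu is not None else 0.25 * m * n / np.abs(M).sum()  # penalty heuristic
    S = np.zeros_like(M)
    Y = np.zeros_like(M)  # dual variable for the constraint L + S = M
    for _ in range(n_iter):
        # L-update: singular value thresholding of M - S + Y/mu.
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # S-update: entry-wise soft thresholding (prox of lam*||.||_1).
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual ascent enforces L + S = M at convergence.
        Y = Y + mu * (M - L - S)
    return L, S
```

Unlike the non-convex method, no rank estimate or threshold level is needed: λ = 1/sqrt(max(m, n)) is the standard universal choice.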
Deep-learning augmented method
Some recent works propose RPCA algorithms with learnable/trainable parameters. Such an algorithm can be unfolded as a deep neural network whose parameters are learned via machine learning techniques from a given dataset or problem distribution. The learned algorithm then has superior performance on the corresponding problem distribution.
Applications
RPCA has many important real-life applications, particularly when the data under study can naturally be modeled as a low-rank component plus a sparse contribution. The following examples are inspired by contemporary challenges in computer science, and depending on the application, either the low-rank component or the sparse component could be the object of interest:
Video surveillance
Given a sequence of surveillance video frames, it is often required to identify the activities that stand out from the background. If we stack the video frames as columns of a matrix M, then the low-rank component L0 naturally corresponds to the stationary background and the sparse component S0 captures the moving objects in the foreground.
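The stacking step can be made concrete with a small synthetic example. The scene below (a static background with one bright moving pixel per frame) is hypothetical; and because the background is perfectly static, even a plain rank-1 SVD approximation, rather than a full RPCA solver, suffices to illustrate the background/foreground split.

```python
import numpy as np

# Hypothetical scene: 20 grayscale frames of size 16x16 with a static
# background and one bright moving pixel per frame (the "foreground").
h, w, n_frames = 16, 16, 20
rng = np.random.default_rng(1)
background = rng.uniform(0.0, 1.0, size=(h, w))
frames = np.repeat(background[None], n_frames, axis=0).copy()
for t in range(n_frames):
    frames[t, t % h, (2 * t) % w] += 4.0  # sparse moving object

# Stack each frame as one column of M: shape (h*w, n_frames).
M = frames.reshape(n_frames, h * w).T

# A static background makes M nearly rank-1, so its best rank-1
# approximation L estimates the background; S = M - L holds the movers.
U, sig, Vt = np.linalg.svd(M, full_matrices=False)
L = np.outer(U[:, 0] * sig[0], Vt[0])
S = M - L
```

Reshaping a column of L back to (h, w) recovers the background image; thresholding |S| yields a foreground mask per frame.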
Face recognition
Images of a convex, Lambertian surface under varying illuminations span a low-dimensional subspace. This is one of the reasons for the effectiveness of low-dimensional models for imagery data. In particular, it is easy to approximate images of a human face by a low-dimensional subspace. Being able to correctly retrieve this subspace is crucial in many applications such as face recognition and alignment. It turns out that RPCA can be applied successfully to this problem to exactly recover the face.
Surveys
*Robust PCA
*Dynamic RPCA
*Decomposition into Low-rank plus Additive Matrices
*Low-rank models
Books, journals and workshops
Books
*T. Bouwmans, N. Aybat, and E. Zahzah. ''Handbook on Robust Low-Rank and Sparse Matrix Decomposition: Applications in Image and Video Processing'', CRC Press, Taylor and Francis Group, May 2016. (more information: http://www.crcpress.com/product/isbn/9781498724623)
*Z. Lin, H. Zhang, "Low-Rank Models in Visual Analysis: Theories, Algorithms, and Applications", Academic Press, Elsevier, June 2017. (more information: https://www.elsevier.com/books/low-rank-models-in-visual-analysis/lin/978-0-12-812731-5)
Journals
* N. Vaswani, Y. Chi, T. Bouwmans, Special Issue on "Rethinking PCA for Modern Datasets: Theory, Algorithms, and Applications", Proceedings of the IEEE, 2018.
*T. Bouwmans, N. Vaswani, P. Rodriguez, R. Vidal, Z. Lin, Special Issue on "Robust Subspace Learning and Tracking: Theory, Algorithms, and Applications", IEEE Journal of Selected Topics in Signal Processing, December 2018.
Workshops
*RSL-CV 2015: Workshop on Robust Subspace Learning and Computer Vision in conjunction with ICCV 2015 (For more information: http://rsl-cv2015.univ-lr.fr/workshop/)
*RSL-CV 2017: Workshop on Robust Subspace Learning and Computer Vision in conjunction with ICCV 2017 (For more information: http://rsl-cv.univ-lr.fr/2017/)
*RSL-CV 2021: Workshop on Robust Subspace Learning and Computer Vision in conjunction with ICCV 2021 (For more information: https://rsl-cv.univ-lr.fr/2021/)
Sessions
* Special Session on "Online Algorithms for Static and Dynamic Robust PCA and Compressive Sensing" in conjunction with SSP 2018. (More information: https://ssp2018.org/)
Resources and libraries
Websites
Background Subtraction Website
DLAM Website
Libraries
The LRSLibrary, developed by Andrews Sobral, provides a collection of low-rank and sparse decomposition algorithms in MATLAB. The library was designed for moving object detection in videos, but it can also be used for other computer vision / machine learning tasks. Currently the LRSLibrary offers more than 100 algorithms based on ''matrix'' and ''tensor'' methods.
References
{{Reflist}}
External links
LRSLibrary
Matrix decompositions
Dimension reduction
Robust statistics