Nullspace property
In compressed sensing, the nullspace property gives necessary and sufficient conditions on the reconstruction of sparse signals using the techniques of \ell_1-relaxation. The term "nullspace property" originates from Cohen, Dahmen, and DeVore. The nullspace property is often difficult to check in practice, and the restricted isometry property is a more modern condition in the field of compressed sensing.


The technique of \ell_1-relaxation

The non-convex \ell_0-minimization problem, \min\limits_x \| x \|_0 subject to Ax = b, is a standard problem in compressed sensing. However, \ell_0-minimization is known to be NP-hard in general. As such, the technique of \ell_1-relaxation is sometimes employed to circumvent the difficulties of signal reconstruction using the \ell_0-norm. In \ell_1-relaxation, the \ell_1 problem, \min\limits_x \| x \|_1 subject to Ax = b, is solved in place of the \ell_0 problem. Note that this relaxation is convex and hence amenable to the standard techniques of linear programming, a computationally desirable feature. Naturally we wish to know when \ell_1-relaxation will give the same answer as the \ell_0 problem. The nullspace property is one way to guarantee agreement.
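The reduction to linear programming can be sketched as follows. Writing x = u - v with u, v \geq 0 turns \| x \|_1 into the linear objective \sum_i (u_i + v_i), and Ax = b into A(u - v) = b. The small matrix and right-hand side below are illustrative choices, not taken from the article:

```python
# Sketch: casting  min ||x||_1  s.t.  Ax = b  as a linear program via the
# standard split x = u - v with u, v >= 0.  At the optimum u_i * v_i = 0,
# so sum(u + v) equals ||x||_1.
import numpy as np
from scipy.optimize import linprog

def l1_minimize(A, b):
    """Solve min ||x||_1 subject to Ax = b by linear programming."""
    m, n = A.shape
    c = np.ones(2 * n)              # objective: sum of u_i + v_i
    A_eq = np.hstack([A, -A])       # equality constraint: A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
    u, v = res.x[:n], res.x[n:]
    return u - v

# Illustrative 2x3 underdetermined system: the sparsest solution of
# Ax = b is x = (1, 0, 0), and l1-minimization finds it.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 0.0])
x = l1_minimize(A, b)
```

Here the solution set of Ax = b is (1 - t, -t, t), with \ell_1-norm |1 - t| + 2|t| minimized at t = 0, so the LP returns the sparse vector (1, 0, 0).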


Definition

An m \times n complex matrix A has the nullspace property of order s if, for all index sets S with |S| = s \leq n, we have \| \eta_S \|_1 < \| \eta_{S^c} \|_1 for all \eta \in \ker A \setminus \{ 0 \}. Here \eta_S denotes the restriction of \eta to the index set S (the vector agreeing with \eta on S and zero elsewhere), and S^c denotes the complement of S in \{ 1, \dots, n \}.
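As noted above, the nullspace property is hard to verify in general. One tractable special case, sketched below under the assumption that \ker A is one-dimensional: if the kernel is spanned by a single vector \eta, the worst index set S of size s collects the s largest entries of |\eta|, so the property of order s holds exactly when their sum is strictly less than the sum of the remaining entries. The matrix below is an illustrative choice:

```python
# Sketch: checking the nullspace property of order s for a matrix whose
# kernel is one-dimensional.  Then it suffices to test a single kernel
# basis vector eta: NSP(s) holds iff the sum of the s largest entries of
# |eta| is < the sum of the remaining entries, i.e. ||eta_S||_1 < ||eta_{S^c}||_1
# for the worst-case S.
import numpy as np
from scipy.linalg import null_space

def has_nsp(A, s):
    """Check the nullspace property of order s; assumes dim ker(A) == 1."""
    kernel = null_space(A)
    assert kernel.shape[1] == 1, "this shortcut requires a 1-D kernel"
    eta = np.abs(kernel[:, 0])
    largest_s = np.sort(eta)[::-1][:s].sum()   # ||eta_S||_1, worst S
    return largest_s < eta.sum() - largest_s   # ... < ||eta_{S^c}||_1

# Illustrative 2x3 matrix: ker(A) is spanned by (1, 1, -1).  Each single
# entry has |eta_i| = 1 < 2 = sum of the other two, so NSP of order 1
# holds, while any two entries sum to 2, which is not < 1, so order 2 fails.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
print(has_nsp(A, 1))
print(has_nsp(A, 2))
```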


Recovery Condition

The following theorem gives necessary and sufficient conditions on the recoverability of a given s-sparse vector in \mathbb{C}^n. The proof of the theorem is a standard one, and the proof supplied here is summarized from Holger Rauhut. \textbf{Theorem:} Let A be an m \times n complex matrix. Then every s-sparse signal x \in \mathbb{C}^n is the unique solution to the \ell_1-relaxation problem with b = Ax if and only if A satisfies the nullspace property of order s. \textit{Proof:} For the forward direction, let \eta \in \ker A \setminus \{ 0 \} and let S be an index set with |S| = s. Notice that \eta_S and -\eta_{S^c} are distinct vectors with A(-\eta_{S^c}) = A(\eta_S) by the linearity of A (since A\eta = A\eta_S + A\eta_{S^c} = 0), and \eta_S is s-sparse. Hence, by the assumed uniqueness, we must have \| \eta_S \|_1 < \| -\eta_{S^c} \|_1 = \| \eta_{S^c} \|_1, as desired. For the backward direction, let x be s-sparse and let z be another (not necessarily s-sparse) vector such that z \neq x and Az = Ax. Define the (non-zero) vector \eta = x - z and notice that it lies in the nullspace of A. Call S the support of x; since x is supported on S, we have x - z_S = \eta_S and \eta_{S^c} = -z_{S^c}, and the result follows from an elementary application of the triangle inequality: \| x \|_1 \leq \| x - z_S \|_1 + \| z_S \|_1 = \| \eta_S \|_1 + \| z_S \|_1 < \| \eta_{S^c} \|_1 + \| z_S \|_1 = \| -z_{S^c} \|_1 + \| z_S \|_1 = \| z \|_1, establishing the minimality of x. \square
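The theorem can be illustrated numerically. In the sketch below (an illustrative construction, not from the article), A is a 3x4 matrix whose kernel is spanned by \eta = (1, 1, 1, -1); every single entry of |\eta| equals 1 < 3, the sum of the rest, so A has the nullspace property of order 1. The theorem then promises that every 1-sparse signal is the unique \ell_1-minimizer with b = Ax, which we confirm by solving the equivalent linear program for each signed standard basis vector:

```python
# Sketch: verifying the recovery theorem for a matrix satisfying NSP of
# order 1.  Rows of A are orthogonal to eta = (1, 1, 1, -1) and linearly
# independent, so ker(A) = span{eta}, and |eta_i| = 1 < 3 for every i:
# NSP of order 1 holds, so every 1-sparse x should be recovered exactly.
import numpy as np
from scipy.optimize import linprog

def l1_minimize(A, b):
    """min ||x||_1 s.t. Ax = b, via the split x = u - v with u, v >= 0."""
    m, n = A.shape
    res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
                  bounds=(0, None))
    return res.x[:n] - res.x[n:]

A = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0]])

for i in range(4):
    for sign in (1.0, -1.0):
        x = np.zeros(4)
        x[i] = sign                       # an arbitrary 1-sparse signal
        x_hat = l1_minimize(A, A @ x)     # solve the l1 problem with b = Ax
        assert np.allclose(x_hat, x, atol=1e-6), (i, sign)
print("every 1-sparse signal recovered exactly")
```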

