Matrices
Let ''A'' be an ''n'' × ''n'' Hermitian matrix. As with many other variational results on eigenvalues, one considers the Rayleigh–Ritz quotient ''R''<sub>''A''</sub> : '''C'''<sup>''n''</sup> \ {0} → '''R''' defined by

:<math>R_A(x) = \frac{(Ax, x)}{(x, x)},</math>

where (⋅, ⋅) denotes the Euclidean inner product on '''C'''<sup>''n''</sup>. Clearly, the Rayleigh quotient of an eigenvector is its associated eigenvalue. Equivalently, the Rayleigh–Ritz quotient can be replaced by

:<math>f(x) = (Ax, x), \qquad \|x\| = 1.</math>

For Hermitian matrices ''A'', the range of the continuous function ''R''<sub>''A''</sub>(''x''), or ''f''(''x''), is a compact interval [''a'', ''b''] of the real line. The maximum ''b'' and the minimum ''a'' are the largest and smallest eigenvalue of ''A'', respectively. The min-max theorem is a refinement of this fact.
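As a concrete illustration (not part of the original statement; the random Hermitian matrix, the NumPy routines, and the number of samples are arbitrary choices for the sketch), one can check numerically that the Rayleigh quotient of an eigenvector equals its eigenvalue and that all values of ''R''<sub>''A''</sub> lie in [''a'', ''b'']:

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch: Rayleigh quotient of a random Hermitian matrix.
rng = np.random.default_rng(0)
n = 5
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (X + X.conj().T) / 2                      # Hermitian matrix

def rayleigh(A, x):
    """Rayleigh-Ritz quotient R_A(x) = (Ax, x) / (x, x)."""
    return np.real(np.vdot(x, A @ x) / np.vdot(x, x))

eigvals, eigvecs = np.linalg.eigh(A)          # eigenvalues in increasing order

# The Rayleigh quotient of an eigenvector is its associated eigenvalue.
assert np.allclose([rayleigh(A, eigvecs[:, k]) for k in range(n)], eigvals)

# The range of R_A is contained in [lambda_min, lambda_max].
for _ in range(1000):
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    r = rayleigh(A, x)
    assert eigvals[0] - 1e-12 <= r <= eigvals[-1] + 1e-12
</syntaxhighlight>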
Min-max theorem

Let ''A'' be an ''n'' × ''n'' Hermitian matrix with eigenvalues <math>\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n</math>, then

:<math>\lambda_k = \min_{U : \dim(U) = k} \; \max_{x \in U,\, x \neq 0} R_A(x)</math>

and

:<math>\lambda_k = \max_{U : \dim(U) = n-k+1} \; \min_{x \in U,\, x \neq 0} R_A(x),</math>

in particular,

:<math>\lambda_1 \le R_A(x) \le \lambda_n \qquad \text{for all } x \neq 0,</math>

and these bounds are attained when ''x'' is an eigenvector of the appropriate eigenvalues. Also the simpler formulation for the maximal eigenvalue ''λ''<sub>''n''</sub> is given by

:<math>\lambda_n = \max\{ R_A(x) : x \neq 0 \}.</math>

Similarly, the minimal eigenvalue ''λ''<sub>''1''</sub> is given by

:<math>\lambda_1 = \min\{ R_A(x) : x \neq 0 \}.</math>
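The first equality can be probed numerically along the following lines (a sketch, not a proof: the symmetric test matrix and the randomly sampled subspaces are illustrative; the true minimum over all ''k''-dimensional subspaces is attained by the span of the first ''k'' eigenvectors):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 3
X = rng.standard_normal((n, n))
A = (X + X.T) / 2                              # real symmetric (hence Hermitian)
eigvals, eigvecs = np.linalg.eigh(A)           # increasing order

def max_rayleigh_on_subspace(A, basis):
    """max of R_A over the column span of `basis`, i.e. the largest eigenvalue
    of the compression of A to that subspace."""
    Q, _ = np.linalg.qr(basis)                 # orthonormal basis of the subspace
    return np.linalg.eigvalsh(Q.T @ A @ Q)[-1]

# The span of the k lowest eigenvectors attains the minimum: the max over it is lambda_k.
assert np.isclose(max_rayleigh_on_subspace(A, eigvecs[:, :k]), eigvals[k - 1])

# Any other k-dimensional subspace gives a value >= lambda_k.
for _ in range(1000):
    U = rng.standard_normal((n, k))
    assert max_rayleigh_on_subspace(A, U) >= eigvals[k - 1] - 1e-12
</syntaxhighlight>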
Counterexample in the non-Hermitian case

Let ''N'' be the nilpotent matrix

:<math>N = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.</math>

Define the Rayleigh quotient <math>R_N(x)</math> exactly as above in the Hermitian case. Then it is easy to see that the only eigenvalue of ''N'' is zero, while the maximum value of the Rayleigh quotient is 1/2. That is, the maximum value of the Rayleigh quotient is larger than the maximum eigenvalue.
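A quick numerical check of this counterexample (assuming the 2 × 2 nilpotent matrix above; the grid of test directions is an arbitrary choice):

<syntaxhighlight lang="python">
import numpy as np

N = np.array([[0.0, 1.0],
              [0.0, 0.0]])                     # nilpotent: only eigenvalue is 0

print(np.linalg.eigvals(N))                    # [0. 0.]

# Parametrize real unit vectors x = (cos t, sin t); then
# R_N(x) = (Nx, x) = cos(t) sin(t), whose maximum is 1/2.
t = np.linspace(0.0, 2.0 * np.pi, 100001)
xs = np.stack([np.cos(t), np.sin(t)])          # shape (2, m), unit columns
rq = np.einsum('im,im->m', N @ xs, xs)         # (Nx, x) for each column
print(rq.max())                                # approximately 0.5
</syntaxhighlight>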
Applications

Min-max principle for singular values
The singular values {''σ''<sub>''k''</sub>} of a square matrix ''M'' are the square roots of the eigenvalues of ''M''*''M'' (equivalently ''MM''*). An immediate consequence of the first equality in the min-max theorem is

:<math>\sigma_k^{\uparrow} = \min_{S : \dim(S) = k} \; \max_{x \in S,\, \|x\| = 1} (M^* M x, x)^{1/2} = \min_{S : \dim(S) = k} \; \max_{x \in S,\, \|x\| = 1} \|Mx\|.</math>

Similarly,

:<math>\sigma_k^{\uparrow} = \max_{S : \dim(S) = n-k+1} \; \min_{x \in S,\, \|x\| = 1} \|Mx\|.</math>

Here <math>\sigma_k^{\uparrow}</math> denotes the ''k''th entry in the increasing sequence of σ's, so that <math>\sigma_1^{\uparrow} \le \sigma_2^{\uparrow} \le \cdots</math>.
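The following sketch (the random square matrix and the subspace sampling are illustrative choices) checks that the ''k''th smallest singular value is the square root of the ''k''th smallest eigenvalue of ''M''*''M'' and that the span of the corresponding right singular vectors attains the min-max value:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
n, k = 6, 3
M = rng.standard_normal((n, n))

# Singular values as square roots of the eigenvalues of M*M (increasing order).
evals = np.linalg.eigvalsh(M.T @ M)                     # nonnegative up to rounding
sigma_up = np.sqrt(np.clip(evals, 0.0, None))
assert np.allclose(sigma_up, np.sort(np.linalg.svd(M, compute_uv=False)))

# Right singular vectors of M = eigenvectors of M*M; take the k "smallest" ones.
_, eigvecs = np.linalg.eigh(M.T @ M)
S = eigvecs[:, :k]                                      # dim-k subspace attaining the min

def max_norm_on_subspace(M, basis):
    """max of ||Mx|| over unit vectors x in the column span of `basis`."""
    Q, _ = np.linalg.qr(basis)
    return np.linalg.norm(M @ Q, ord=2)                 # largest singular value of M Q

assert np.isclose(max_norm_on_subspace(M, S), sigma_up[k - 1])

# Any other k-dimensional subspace can only give a larger maximum.
for _ in range(500):
    U = rng.standard_normal((n, k))
    assert max_norm_on_subspace(M, U) >= sigma_up[k - 1] - 1e-12
</syntaxhighlight>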
Cauchy interlacing theorem

Let ''A'' be a symmetric ''n'' × ''n'' matrix. The ''m'' × ''m'' matrix ''B'', where ''m'' ≤ ''n'', is called a compression of ''A'' if there exists an orthogonal projection ''P'' onto a subspace of dimension ''m'' such that ''PAP''* = ''B''. The Cauchy interlacing theorem states:

:Theorem. If the eigenvalues of ''A'' are <math>\alpha_1 \le \cdots \le \alpha_n</math>, and those of ''B'' are <math>\beta_1 \le \cdots \le \beta_m</math>, then for all <math>j \le m</math>,

::<math>\alpha_j \le \beta_j \le \alpha_{n-m+j}.</math>

This can be proven using the min-max principle. Let ''β''<sub>''i''</sub> have corresponding eigenvector ''b''<sub>''i''</sub> and ''S''<sub>''j''</sub> be the ''j''-dimensional subspace <math>S_j = \operatorname{span}\{b_1, \ldots, b_j\}</math>; then

:<math>\beta_j = \max_{x \in S_j,\, \|x\| = 1} (Bx, x) = \max_{x \in S_j,\, \|x\| = 1} (PAP^* x, x) = \max_{x \in S_j,\, \|x\| = 1} (A(P^* x), P^* x).</math>

Since ''P''* is an isometry, <math>P^* S_j</math> is a ''j''-dimensional subspace of <math>\mathbf{C}^n</math>, so according to the first part of min-max, <math>\alpha_j \le \beta_j</math>. On the other hand, if we define <math>S_{m-j+1} = \operatorname{span}\{b_j, \ldots, b_m\}</math>, then

:<math>\beta_j = \min_{x \in S_{m-j+1},\, \|x\| = 1} (Bx, x) = \min_{x \in S_{m-j+1},\, \|x\| = 1} (PAP^* x, x) = \min_{x \in S_{m-j+1},\, \|x\| = 1} (A(P^* x), P^* x) \le \alpha_{n-m+j},</math>

where the last inequality is given by the second part of min-max, applied to the subspace <math>P^* S_{m-j+1}</math> of dimension <math>m - j + 1 = n - (n-m+j) + 1</math>. When <math>n - m = 1</math>, we have <math>\alpha_j \le \beta_j \le \alpha_{j+1}</math>, hence the name ''interlacing'' theorem.
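A numerical check of the interlacing inequalities, using the compression of a random symmetric matrix onto its first ''m'' coordinates (this particular compression is an illustrative choice):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)
n, m = 7, 4
X = rng.standard_normal((n, n))
A = (X + X.T) / 2                      # symmetric n x n matrix

# Compression onto the first m coordinates: P truncates vectors to C^m,
# so B = P A P* is the leading principal m x m submatrix of A.
B = A[:m, :m]

alpha = np.linalg.eigvalsh(A)          # alpha_1 <= ... <= alpha_n
beta = np.linalg.eigvalsh(B)           # beta_1  <= ... <= beta_m

# Cauchy interlacing: alpha_j <= beta_j <= alpha_{n-m+j} for j = 1, ..., m.
for j in range(m):
    assert alpha[j] - 1e-12 <= beta[j] <= alpha[n - m + j] + 1e-12
</syntaxhighlight>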
Compact operators

Let ''A'' be a compact, Hermitian operator on an infinite-dimensional Hilbert space ''H''. For such an operator the essential spectrum is {0}, so the eigenvalues of ''A'' below the essential spectrum are exactly its negative eigenvalues (listed in increasing order, counted with multiplicity), and the formulation for general self-adjoint operators given below applies to them; analogous max-min and min-max formulas characterize the positive eigenvalues of ''A'' listed in decreasing order.
Self-adjoint operators

The min-max theorem also applies to (possibly unbounded) self-adjoint operators (G. Teschl, ''Mathematical Methods in Quantum Mechanics'', GSM 99, https://www.mat.univie.ac.at/~gerald/ftp/book-schroe/schroe.pdf). Recall the essential spectrum is the spectrum without isolated eigenvalues of finite multiplicity. Sometimes we have some eigenvalues below the essential spectrum, and we would like to approximate the eigenvalues and eigenfunctions.

:Theorem (Min-Max). Let ''A'' be self-adjoint, and let <math>E_1 \le E_2 \le E_3 \le \cdots</math> be the eigenvalues of ''A'' below the essential spectrum. Then

::<math>E_n = \min_{\psi_1, \ldots, \psi_n \in \mathfrak{D}(A)} \; \max\{ (A\psi, \psi) : \psi \in \operatorname{span}(\psi_1, \ldots, \psi_n), \; \|\psi\| = 1 \}.</math>

:If we only have ''N'' eigenvalues and hence run out of eigenvalues, then we let <math>E_n := \inf \sigma_{\mathrm{ess}}(A)</math> (the bottom of the essential spectrum) for ''n'' > ''N'', and the above statement holds after replacing min-max with inf-sup.

:Theorem (Max-Min). Let ''A'' be self-adjoint, and let <math>E_1 \le E_2 \le E_3 \le \cdots</math> be the eigenvalues of ''A'' below the essential spectrum. Then

::<math>E_n = \max_{\psi_1, \ldots, \psi_{n-1}} \; \min\{ (A\psi, \psi) : \psi \in \mathfrak{D}(A), \; \psi \perp \psi_1, \ldots, \psi_{n-1}, \; \|\psi\| = 1 \}.</math>

:If we only have ''N'' eigenvalues and hence run out of eigenvalues, then we let <math>E_n := \inf \sigma_{\mathrm{ess}}(A)</math> (the bottom of the essential spectrum) for ''n'' > ''N'', and the above statement holds after replacing max-min with sup-inf.

The proofs use the following results about self-adjoint operators:

:Theorem. Let ''A'' be self-adjoint and <math>E \in \mathbb{R}</math>. Then <math>(A\psi, \psi) \ge E \|\psi\|^2</math> for all <math>\psi \in \mathfrak{D}(A)</math> if and only if <math>\sigma(A) \subseteq [E, \infty)</math>.

:Theorem. If ''A'' is self-adjoint, then <math>\inf \sigma(A) = \inf_{\psi \in \mathfrak{D}(A),\, \|\psi\| = 1} (A\psi, \psi)</math> and <math>\sup \sigma(A) = \sup_{\psi \in \mathfrak{D}(A),\, \|\psi\| = 1} (A\psi, \psi)</math>.
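As a rough illustration of how the min-max formulation is used for such operators (a sketch under assumptions not made in the text: the operator is the harmonic oscillator −d²/dx² + ''x''², which has purely discrete spectrum with eigenvalues 1, 3, 5, …, and it is replaced by a finite-difference matrix on a truncated grid), every ''n''-dimensional trial subspace yields an upper bound on ''E''<sub>''n''</sub>, attained on the span of the first ''n'' eigenvectors:

<syntaxhighlight lang="python">
import numpy as np

# Finite-difference sketch of H = -d^2/dx^2 + x^2 on [-L, L] (harmonic oscillator;
# exact eigenvalues 1, 3, 5, ...).  Grid size and cutoff are arbitrary choices.
L, m = 10.0, 1000
x = np.linspace(-L, L, m)
h = x[1] - x[0]
main = 2.0 / h**2 + x**2
off = -np.ones(m - 1) / h**2
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E, V = np.linalg.eigh(H)
print(E[:3])                                   # approximately [1. 3. 5.]

def max_rayleigh(H, basis):
    """max of (H psi, psi)/(psi, psi) over the span of the given trial vectors."""
    Q, _ = np.linalg.qr(basis)
    return np.linalg.eigvalsh(Q.T @ H @ Q)[-1]

n = 3
# Min-max: every n-dimensional trial subspace gives an upper bound on E_n ...
rng = np.random.default_rng(4)
trial = rng.standard_normal((m, n))
assert max_rayleigh(H, trial) >= E[n - 1] - 1e-9
# ... and the span of the first n eigenvectors attains it.
assert np.isclose(max_rayleigh(H, V[:, :n]), E[n - 1])
</syntaxhighlight>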
See also

* Courant minimax principle
* Max–min inequality