In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues. More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by \rho(\cdot).


Definition


Matrices

Let \lambda_1, \ldots, \lambda_n be the eigenvalues of a matrix A \in \mathbb{C}^{n \times n}. The spectral radius of A is defined as

:\rho(A) = \max \left\{ |\lambda_1|, \ldots, |\lambda_n| \right\}.

The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, \rho(A) \leqslant \|A\| for every natural matrix norm \|\cdot\|; and on the other hand, Gelfand's formula states that \rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k}. Both of these results are shown below. However, the spectral radius does not necessarily satisfy \|A\mathbf{v}\| \leqslant \rho(A)\|\mathbf{v}\| for arbitrary vectors \mathbf{v} \in \mathbb{C}^n. To see why, let r > 1 be arbitrary and consider the matrix

:C_r = \begin{pmatrix} 0 & r^{-1} \\ r & 0 \end{pmatrix}.

The characteristic polynomial of C_r is \lambda^2 - 1, so its eigenvalues are \{-1, 1\} and thus \rho(C_r) = 1. However, C_r \mathbf{e}_1 = r \mathbf{e}_2. As a result,

:\|C_r \mathbf{e}_1\| = r > 1 = \rho(C_r)\|\mathbf{e}_1\|.

As an illustration of Gelfand's formula, note that \|C_r^k\|^{1/k} \to 1 as k \to \infty, since C_r^k = I if k is even and C_r^k = C_r if k is odd.

A special case in which \|A\mathbf{v}\| \leqslant \rho(A)\|\mathbf{v}\| for all \mathbf{v} \in \mathbb{C}^n is when A is a Hermitian matrix and \|\cdot\| is the Euclidean norm. This is because any Hermitian matrix is diagonalizable by a unitary matrix, and unitary matrices preserve vector length. As a result,

:\|A\mathbf{v}\| = \|U^* D U \mathbf{v}\| = \|D U \mathbf{v}\| \leqslant \rho(A)\|U\mathbf{v}\| = \rho(A)\|\mathbf{v}\|.
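The C_r example above can be checked numerically; the following sketch assumes NumPy is available, though any linear-algebra routine would do:

```python
import numpy as np

r = 2.0
C = np.array([[0.0, 1.0 / r],
              [r,   0.0]])

# Spectral radius: maximum modulus of the eigenvalues (here +1 and -1).
rho = max(abs(np.linalg.eigvals(C)))

# ||C e_1|| exceeds rho(C) ||e_1||, so the spectral radius alone does
# not bound how much C can stretch a particular vector.
e1 = np.array([1.0, 0.0])
stretch = np.linalg.norm(C @ e1)

print(rho, stretch)   # rho is (approximately) 1, stretch equals r
```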


Bounded linear operators

In the context of a bounded linear operator A on a Banach space, the eigenvalues need to be replaced with the elements of the spectrum of the operator, i.e. the values \lambda for which A - \lambda I is not bijective. We denote the spectrum by

:\sigma(A) = \left\{ \lambda \in \mathbb{C} : A - \lambda I \text{ is not bijective} \right\}.

The spectral radius is then defined as the supremum of the magnitudes of the elements of the spectrum:

:\rho(A) = \sup_{\lambda \in \sigma(A)} |\lambda|.

Gelfand's formula, also known as the spectral radius formula, also holds for bounded linear operators: letting \|\cdot\| denote the operator norm, we have

:\rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k} = \inf_{k \geq 1} \|A^k\|^{1/k}.

A bounded operator (on a complex Hilbert space) is called a spectraloid operator if its spectral radius coincides with its numerical radius. An example of such an operator is a normal operator.


Graphs

The spectral radius of a finite graph is defined to be the spectral radius of its adjacency matrix.

This definition extends to the case of infinite graphs with bounded degrees of vertices (i.e. there exists some real number C such that the degree of every vertex of the graph is smaller than C). In this case, for the graph G define:

:\ell^2(G) = \left\{ f : V(G) \to \mathbb{R} \ : \ \sum_{v \in V(G)} |f(v)|^2 < \infty \right\}.

Let \gamma be the adjacency operator of G:

:\begin{cases} \gamma : \ell^2(G) \to \ell^2(G) \\ (\gamma f)(v) = \sum_{(u,v) \in E(G)} f(u) \end{cases}

The spectral radius of G is defined to be the spectral radius of the bounded linear operator \gamma.
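For a finite graph the definition is directly computable; as a small sketch (NumPy assumed), the spectral radius of the complete graph K_n equals n - 1, the common vertex degree:

```python
import numpy as np

# Adjacency matrix of the complete graph K_4: all off-diagonal entries 1.
n = 4
A = np.ones((n, n)) - np.eye(n)

# The spectral radius of the graph is that of its adjacency matrix;
# for a d-regular graph it equals d, here d = n - 1 = 3.
rho = max(abs(np.linalg.eigvals(A)))
print(rho)
```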


Upper bounds


Upper bounds on the spectral radius of a matrix

The following proposition gives simple yet useful upper bounds on the spectral radius of a matrix.

Proposition. Let A \in \mathbb{C}^{n \times n} with spectral radius \rho(A) and a consistent matrix norm \|\cdot\|. Then for each integer k \geqslant 1:

::\rho(A) \leq \|A^k\|^{1/k}.

Proof. Let (\mathbf{v}, \lambda) be an eigenvector-eigenvalue pair for a matrix A. By the sub-multiplicativity of the matrix norm, we get:

:|\lambda|^k \|\mathbf{v}\| = \|\lambda^k \mathbf{v}\| = \|A^k \mathbf{v}\| \leq \|A^k\| \cdot \|\mathbf{v}\|.

Since \mathbf{v} \neq 0, we have

:|\lambda|^k \leq \|A^k\|

and therefore

:\rho(A) \leq \|A^k\|^{1/k},

concluding the proof.
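A brief numerical sketch of the proposition (NumPy assumed): for a non-normal matrix the bounds \|A^k\|^{1/k} can start well above \rho(A) and tighten as k grows.

```python
import numpy as np

# Non-normal example: spectral radius 0.9, but a large off-diagonal
# entry makes the spectral norm of A itself much bigger than 0.9.
A = np.array([[0.9, 5.0],
              [0.0, 0.9]])
rho = max(abs(np.linalg.eigvals(A)))

bounds = []
for k in (1, 5, 20, 100):
    Ak = np.linalg.matrix_power(A, k)
    bounds.append(np.linalg.norm(Ak, 2) ** (1.0 / k))

# Every bound dominates rho(A), and the bounds decrease toward it.
assert all(rho <= b + 1e-12 for b in bounds)
print(rho, bounds)
```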


Upper bounds for spectral radius of a graph

There are many upper bounds for the spectral radius of a graph in terms of its number ''n'' of vertices and its number ''m'' of edges. For instance, Hong's bound states that every graph with no isolated vertices satisfies

:\rho(G) \leq \sqrt{2m - n + 1}.
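One classical bound of this type is the inequality \rho(G) \leq \sqrt{2m - n + 1} for graphs without isolated vertices (Hong's bound, quoted here from the literature as an assumption); it is attained with equality on complete graphs, as this sketch (NumPy assumed) checks:

```python
import numpy as np

n = 5
A = np.ones((n, n)) - np.eye(n)   # adjacency matrix of K_5
m = int(A.sum()) // 2             # number of edges: n(n-1)/2 = 10

rho = max(abs(np.linalg.eigvals(A)))
bound = (2 * m - n + 1) ** 0.5    # sqrt(16) = 4 = n - 1

print(rho, bound)
```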


Power sequence

The spectral radius is closely related to the behavior of the convergence of the power sequence of a matrix, as shown by the following theorem.

Theorem. Let A \in \mathbb{C}^{n \times n} with spectral radius \rho(A). Then \rho(A) < 1 if and only if

:\lim_{k \to \infty} A^k = 0.

On the other hand, if \rho(A) > 1, \lim_{k \to \infty} \|A^k\| = \infty. The statement holds for any choice of matrix norm on \mathbb{C}^{n \times n}.

Proof. Assume that A^k goes to zero as k goes to infinity. We will show that \rho(A) < 1. Let (\mathbf{v}, \lambda) be an eigenvector-eigenvalue pair for A. Since A^k \mathbf{v} = \lambda^k \mathbf{v}, we have

:\begin{align} 0 &= \left(\lim_{k \to \infty} A^k \right) \mathbf{v} \\ &= \lim_{k \to \infty} \left(A^k \mathbf{v} \right) \\ &= \lim_{k \to \infty} \lambda^k \mathbf{v} \\ &= \mathbf{v} \lim_{k \to \infty} \lambda^k \end{align}

Since \mathbf{v} \neq 0 by hypothesis, we must have

:\lim_{k \to \infty} \lambda^k = 0,

which implies |\lambda| < 1. Since this must be true for any eigenvalue \lambda, we can conclude that \rho(A) < 1.

Now, assume the radius of A is less than 1. From the Jordan normal form theorem, we know that for all A \in \mathbb{C}^{n \times n} there exist V, J \in \mathbb{C}^{n \times n} with V non-singular and J block diagonal such that:

:A = VJV^{-1}

with

:J = \begin{pmatrix} J_{m_1}(\lambda_1) & 0 & 0 & \cdots & 0 \\ 0 & J_{m_2}(\lambda_2) & 0 & \cdots & 0 \\ \vdots & \cdots & \ddots & \cdots & \vdots \\ 0 & \cdots & 0 & J_{m_{s-1}}(\lambda_{s-1}) & 0 \\ 0 & \cdots & \cdots & 0 & J_{m_s}(\lambda_s) \end{pmatrix}

where

:J_{m_i}(\lambda_i) = \begin{pmatrix} \lambda_i & 1 & 0 & \cdots & 0 \\ 0 & \lambda_i & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i & 1 \\ 0 & 0 & \cdots & 0 & \lambda_i \end{pmatrix} \in \mathbb{C}^{m_i \times m_i}, \quad 1 \leq i \leq s.

It is easy to see that

:A^k = VJ^kV^{-1}

and, since J is block-diagonal,

:J^k = \begin{pmatrix} J_{m_1}^k(\lambda_1) & 0 & 0 & \cdots & 0 \\ 0 & J_{m_2}^k(\lambda_2) & 0 & \cdots & 0 \\ \vdots & \cdots & \ddots & \cdots & \vdots \\ 0 & \cdots & 0 & J_{m_{s-1}}^k(\lambda_{s-1}) & 0 \\ 0 & \cdots & \cdots & 0 & J_{m_s}^k(\lambda_s) \end{pmatrix}

Now, a standard result on the k-power of an m_i \times m_i Jordan block states that, for k \geq m_i - 1:

:J_{m_i}^k(\lambda_i) = \begin{pmatrix} \lambda_i^k & \binom{k}{1}\lambda_i^{k-1} & \binom{k}{2}\lambda_i^{k-2} & \cdots & \binom{k}{m_i-1}\lambda_i^{k-m_i+1} \\ 0 & \lambda_i^k & \binom{k}{1}\lambda_i^{k-1} & \cdots & \binom{k}{m_i-2}\lambda_i^{k-m_i+2} \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i^k & \binom{k}{1}\lambda_i^{k-1} \\ 0 & 0 & \cdots & 0 & \lambda_i^k \end{pmatrix}

Thus, if \rho(A) < 1 then |\lambda_i| < 1 for all i. Hence for all i we have:

:\lim_{k \to \infty} J_{m_i}^k = 0,

which implies

:\lim_{k \to \infty} J^k = 0.

Therefore,

:\lim_{k \to \infty} A^k = \lim_{k \to \infty} VJ^kV^{-1} = V \left(\lim_{k \to \infty} J^k \right) V^{-1} = 0.

On the other side, if \rho(A) > 1, there is at least one element in J^k that does not remain bounded as k increases, thereby proving the second part of the statement.
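The dichotomy can be observed numerically (a sketch, NumPy assumed): powers of a Jordan block with |λ| < 1 vanish despite the nilpotent part, while |λ| > 1 makes them blow up.

```python
import numpy as np

# Jordan block with lambda = 0.5: rho < 1, so A^k -> 0 even though the
# off-diagonal entry k * 0.5**(k-1) grows before it decays.
A = np.array([[0.5, 1.0],
              [0.0, 0.5]])
print(np.linalg.norm(np.linalg.matrix_power(A, 200)))   # vanishingly small

# Jordan block with lambda = 1.1: rho > 1, so ||A^k|| diverges.
B = np.array([[1.1, 1.0],
              [0.0, 1.1]])
print(np.linalg.norm(np.linalg.matrix_power(B, 200)))   # very large
```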


Gelfand's formula

Gelfand's formula, named after Israel Gelfand, gives the spectral radius as a limit of matrix norms.


Theorem

For any matrix norm \|\cdot\| we have

:\rho(A) = \lim_{k \to \infty} \left\|A^k\right\|^{1/k}.

The formula holds more generally for the spectral radius of an element of any Banach algebra; see Lemma IX.1.8 in . Moreover, in the case of a consistent matrix norm, the sequence \left\|A^k\right\|^{1/k} approaches \rho(A) from above (indeed, in that case \rho(A) \leq \left\|A^k\right\|^{1/k} for all k).


Proof

For any \varepsilon > 0, let us define the two following matrices:

:A_\pm = \frac{1}{\rho(A) \pm \varepsilon} A.

Thus,

:\rho(A_\pm) = \frac{\rho(A)}{\rho(A) \pm \varepsilon}, \qquad \rho(A_+) < 1 < \rho(A_-).

We start by applying the previous theorem on limits of power sequences to A_+:

:\lim_{k \to \infty} A_+^k = 0.

This shows the existence of N_+ \in \mathbb{N} such that, for all k \geq N_+,

:\left\|A_+^k\right\| < 1,

so that \left\|A^k\right\| < (\rho(A) + \varepsilon)^k. Therefore,

:\left\|A^k\right\|^{1/k} < \rho(A) + \varepsilon.

Similarly, the theorem on power sequences implies that \|A_-^k\| is not bounded and that there exists N_- \in \mathbb{N} such that, for all k \geq N_-,

:\left\|A_-^k\right\| > 1.

Therefore,

:\left\|A^k\right\|^{1/k} > \rho(A) - \varepsilon.

Let N = \max(N_+, N_-). Then,

:\forall \varepsilon > 0 \quad \exists N \in \mathbb{N} \quad \forall k \geq N \quad \rho(A) - \varepsilon < \left\|A^k\right\|^{1/k} < \rho(A) + \varepsilon,

that is,

:\lim_{k \to \infty} \left\|A^k\right\|^{1/k} = \rho(A).

This concludes the proof.


Corollary

Gelfand's formula yields a bound on the spectral radius of a product of commuting matrices: if A_1, \ldots, A_n are matrices that all commute, then :\rho(A_1 \cdots A_n) \leq \rho(A_1) \cdots \rho(A_n).
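A small sketch (NumPy assumed) illustrates the corollary for a commuting pair, and shows that commutativity cannot be dropped:

```python
import numpy as np

def rho(M):
    return max(abs(np.linalg.eigvals(M)))

# Commuting pair: powers of the same matrix always commute.
A = np.array([[1.0, 1.0], [0.0, 2.0]])
B = np.linalg.matrix_power(A, 2)
assert rho(A @ B) <= rho(A) * rho(B) + 1e-9   # 8 <= 2 * 4

# Without commutativity the bound can fail badly: both factors are
# nilpotent (spectral radius 0), yet their product is not.
N1 = np.array([[0.0, 2.0], [0.0, 0.0]])
N2 = np.array([[0.0, 0.0], [2.0, 0.0]])
print(rho(N1), rho(N2), rho(N1 @ N2))
```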


Numerical example

Consider the matrix

:A = \begin{pmatrix} 9 & -1 & 2 \\ -2 & 8 & 4 \\ 1 & 1 & 8 \end{pmatrix}

whose eigenvalues are 5, 10, 10; by definition, \rho(A) = 10. For each of the four most used norms, the values of \|A^k\|^{1/k} decrease toward 10 as k increases (note that, due to the particular form of this matrix, \|\cdot\|_1 = \|\cdot\|_\infty).
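These values can be regenerated with a short computation (NumPy assumed): each \|A^k\|^{1/k} tends to \rho(A) = 10 from above as k grows.

```python
import numpy as np

A = np.array([[ 9.0, -1.0, 2.0],
              [-2.0,  8.0, 4.0],
              [ 1.0,  1.0, 8.0]])
rho = max(abs(np.linalg.eigvals(A)))      # 10, a double eigenvalue

# ||A^k||^(1/k) for the 1-, 2-, infinity- and Frobenius norms.
for k in (1, 5, 20, 100):
    Ak = np.linalg.matrix_power(A, k)
    row = [np.linalg.norm(Ak, p) ** (1.0 / k) for p in (1, 2, np.inf, 'fro')]
    print(k, [round(x, 4) for x in row])
```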




See also

* Spectral gap
* The Joint spectral radius is a generalization of the spectral radius to sets of matrices.
* Spectrum of a matrix
* Spectral abscissa