Spectral Radius
In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues. More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by \rho(\cdot).

Definition

Matrices

Let \lambda_1, \ldots, \lambda_n be the eigenvalues of a matrix A \in \mathbb{C}^{n \times n}. The spectral radius of A is defined as
:\rho(A) = \max \left\{ |\lambda_1|, \ldots, |\lambda_n| \right\}.
The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, \rho(A) \leqslant \|A\| for every natural matrix norm \|\cdot\|; and on the other hand, Gelfand's formula states that \rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k}. Both of these results are shown below. However, the spectral radius does not necessarily satisfy \|A\mathbf{v}\| \leqslant \rho(A) \|\mathbf{v}\| for arbitrary vectors \mathbf{v} \in \mathbb{C}^n. To see why, let r > 1 be arbitrary and consider the matrix
:C_r = \begin{pmatrix} 0 & r^{-1} \\ r & 0 \end{pmatrix}.
The characteristic polynomial ...
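The claims above lend themselves to a quick numerical check. The following sketch is not part of the article: NumPy is assumed, and the value r = 4 is an illustrative choice. It computes \rho(C_r), shows C_r stretching a vector by more than its spectral radius, and watches Gelfand's limit converge.

```python
import numpy as np

def spectral_radius(A):
    """Maximum modulus of the eigenvalues of A."""
    return max(abs(np.linalg.eigvals(A)))

# The counterexample from the text: rho(C_r) = 1 for every r > 0, yet
# C_r maps the first basis vector e1 to (0, r), a stretch by factor r.
r = 4.0
C = np.array([[0.0, 1.0 / r],
              [r,   0.0]])
print(spectral_radius(C))                      # 1.0
e1 = np.array([1.0, 0.0])
print(np.linalg.norm(C @ e1))                  # 4.0 > rho(C) * ||e1|| = 1.0

# Gelfand's formula: ||C^k||^(1/k) -> rho(C) as k grows (odd k shown,
# since C^2 is the identity, so even powers reach the limit trivially).
for k in (1, 11, 101):
    nrm = np.linalg.norm(np.linalg.matrix_power(C, k), 2)
    print(k, nrm ** (1.0 / k))                 # 4.0, ~1.134, ~1.014
```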
Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in case of abstraction from nature, some basic properties that are considered true starting points of ...
Numerical Radius
In the mathematical field of linear algebra and convex analysis, the numerical range or field of values of a complex n \times n matrix ''A'' is the set
:W(A) = \left\{ \mathbf{x}^* A \mathbf{x} : \mathbf{x} \in \mathbb{C}^n,\ \mathbf{x}^* \mathbf{x} = 1 \right\}
where \mathbf{x}^* denotes the conjugate transpose of the vector \mathbf{x}. The numerical range includes, in particular, the diagonal entries of the matrix (obtained by choosing ''x'' equal to the unit vectors along the coordinate axes) and the eigenvalues of the matrix (obtained by choosing ''x'' equal to the eigenvectors). In engineering, numerical ranges are used as a rough estimate of eigenvalues of ''A''. Recently, generalizations of the numerical range have been used to study quantum computing. A related concept is the numerical radius, which is the largest absolute value of the numbers in the numerical range, i.e.
:r(A) = \sup \left\{ |\lambda| : \lambda \in W(A) \right\} = \sup_{\|x\| = 1} |\langle Ax, x \rangle|.

Properties
# The numerical range is the range of the Rayleigh quotient.
# (Hausdorff–Toeplitz theorem) The numerical range is convex and compact.
# W(\alpha ...
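One practical way to estimate the numerical radius (a sketch, not from the excerpt; NumPy assumed) uses the identity |z| = max_theta Re(e^{i theta} z): for each angle theta, the supremum of Re(e^{i theta} x*Ax) over unit vectors x is the largest eigenvalue of the Hermitian part of e^{i theta}A, so sweeping theta over a grid recovers r(A) to grid accuracy.

```python
import numpy as np

def numerical_radius(A, n_angles=360):
    """Estimate r(A) = sup_{||x||=1} |x* A x| by sweeping rotation angles.

    For each theta, max over unit x of Re(exp(i*theta) x*Ax) equals the
    largest eigenvalue of the Hermitian part of exp(i*theta)*A; the grid
    maximum slightly underestimates r(A) for coarse grids.
    """
    best = -np.inf
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        B = np.exp(1j * theta) * A
        H = (B + B.conj().T) / 2.0             # Hermitian part of rotated A
        best = max(best, np.linalg.eigvalsh(H)[-1])
    return best

# Nilpotent example: W(A) is the disk of radius 1, so r(A) = 1, while the
# spectral radius is 0 -- the numerical radius can strictly dominate it.
A = np.array([[0.0, 2.0],
              [0.0, 0.0]])
print(numerical_radius(A))               # ~1.0
print(max(abs(np.linalg.eigvals(A))))    # 0.0
```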
Spectral Abscissa
In mathematics, the spectral abscissa of a matrix or a bounded linear operator is the greatest real part of the matrix's spectrum (its set of eigenvalues). It is sometimes denoted \alpha(A). As a transformation \alpha : M^{n \times n} \rightarrow \mathbb{R}, the spectral abscissa maps a square matrix onto the largest of the real parts of its eigenvalues.

Matrices

Let \lambda_1, \ldots, \lambda_s be the (real or complex) eigenvalues of a matrix A \in \mathbb{C}^{n \times n}. Then its spectral abscissa is defined as:
:\alpha(A) = \max_i \left\{ \operatorname{Re}(\lambda_i) \right\}
In stability theory, a continuous system represented by matrix A is said to be stable if all real parts of its eigenvalues are negative, i.e. \alpha(A) < 0. Analogously, the solution of the differential equation \dot{x} = Ax is stable under the same condition \alpha(A) < 0.
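As a small sketch (NumPy assumed; the damped-oscillator system is an illustrative choice, not from the excerpt), the spectral abscissa is one line on top of an eigenvalue computation, and its sign decides stability:

```python
import numpy as np

def spectral_abscissa(A):
    """Largest real part among the eigenvalues of A."""
    return max(np.linalg.eigvals(A).real)

# The damped oscillator x'' + x' + x = 0 written as a first-order system:
# eigenvalues are (-1 +/- i*sqrt(3))/2, so all real parts are negative.
A = np.array([[0.0,  1.0],
              [-1.0, -1.0]])
print(spectral_abscissa(A))    # -0.5 < 0  =>  x' = Ax is stable
```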
Spectrum Of A Matrix
In mathematics, the spectrum of a matrix is the set of its eigenvalues. More generally, if T\colon V \to V is a linear operator on any finite-dimensional vector space, its spectrum is the set of scalars \lambda such that T - \lambda I is not invertible. The determinant of the matrix equals the product of its eigenvalues. Similarly, the trace of the matrix equals the sum of its eigenvalues. From this point of view, we can define the pseudo-determinant of a singular matrix to be the product of its nonzero eigenvalues (the density of the multivariate normal distribution requires this quantity). In many applications, such as PageRank, one is interested in the dominant eigenvalue, i.e. the one that is largest in absolute value. In other applications, the smallest eigenvalue is important, but in general, the whole spectrum provides valuable information about a matrix.

Definition

Let ''V'' be a finite-dimensional vector space over some field ''K'' and suppose ''T'' : ''V'' → ''V'' is a lin ...
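The determinant/trace identities and the pseudo-determinant are easy to verify numerically. A minimal sketch (NumPy assumed; the matrices and the 1e-12 zero threshold are illustrative choices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eig = np.linalg.eigvals(A)                 # spectrum: {1, 3}

print(np.prod(eig), np.linalg.det(A))      # determinant = product of eigenvalues
print(np.sum(eig), np.trace(A))            # trace = sum of eigenvalues

# Pseudo-determinant of a singular matrix: product of nonzero eigenvalues.
S = np.array([[1.0, 1.0],
              [1.0, 1.0]])                 # eigenvalues 0 and 2
nonzero = [l for l in np.linalg.eigvals(S) if abs(l) > 1e-12]
print(np.prod(nonzero))                    # 2.0
```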
Joint Spectral Radius
In mathematics, the joint spectral radius is a generalization of the classical notion of spectral radius of a matrix, to sets of matrices. In recent years this notion has found applications in a large number of engineering fields and is still a topic of active research.

General description

The joint spectral radius of a set of matrices is the maximal asymptotic growth rate of products of matrices taken in that set. For a finite (or more generally compact) set of matrices \mathcal{M} = \{A_1, \ldots, A_m\} \subset \mathbb{R}^{n \times n}, the joint spectral radius is defined as follows:
:\rho(\mathcal{M}) = \lim_{k \to \infty} \max \left\{ \|A_{i_1} \cdots A_{i_k}\|^{1/k} : A_{i_j} \in \mathcal{M} \right\}.
It can be proved that the limit exists and that the quantity actually does not depend on the chosen matrix norm (this is true for any norm but particularly easy to see if the norm is sub-multiplicative). The joint spectral radius was introduced in 1960 by Gian-Carlo Rota and Gilbert Strang, two mathematicians from MIT, but started attracting attention with the work of Ingrid Daubechies and J ...
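Exact computation of the joint spectral radius is hard in general, but short products already give two-sided bounds: every product P of length k yields the lower bound \rho(P)^{1/k}, and for each fixed k the maximum of \|P\|^{1/k} over all length-k products is an upper bound (for a sub-multiplicative norm). The brute-force sketch below is not from the excerpt (NumPy assumed, cost exponential in depth); it is applied to a classic pair of shear matrices whose joint spectral radius is known to be the golden ratio.

```python
import itertools
import numpy as np

def jsr_bounds(matrices, depth):
    """Two-sided bounds on the joint spectral radius from products of length <= depth."""
    lower, upper = 0.0, np.inf
    for k in range(1, depth + 1):
        level_upper = 0.0
        for combo in itertools.product(matrices, repeat=k):
            P = combo[0]
            for M in combo[1:]:
                P = P @ M
            # rho(P)^(1/k) is always a lower bound; the per-level max of
            # ||P||^(1/k) is an upper bound (the spectral norm is
            # sub-multiplicative), so keep the best bound of each kind.
            lower = max(lower, max(abs(np.linalg.eigvals(P))) ** (1.0 / k))
            level_upper = max(level_upper, np.linalg.norm(P, 2) ** (1.0 / k))
        upper = min(upper, level_upper)
    return lower, upper

# Classic example: two shears whose joint spectral radius is the golden ratio,
# attained along the alternating product A0 A1 A0 A1 ...
A0 = np.array([[1.0, 1.0], [0.0, 1.0]])
A1 = np.array([[1.0, 0.0], [1.0, 1.0]])
print(jsr_bounds([A0, A1], depth=10))    # brackets (1 + sqrt(5))/2 ~ 1.618
```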
Spectral Gap
In mathematics, the spectral gap is the difference between the moduli of the two largest eigenvalues of a matrix or operator; alternatively, it is sometimes taken as the smallest non-zero eigenvalue. Various theorems relate this difference to other properties of the system; a small numerical illustration follows the list below.

See also
* Cheeger constant (graph theory)
* Cheeger constant (Riemannian geometry)
* Eigengap
* Spectral gap (physics)
* Spectral radius
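A minimal sketch of the first definition above (NumPy assumed; the two-state Markov chain is an illustrative choice, not from the excerpt), where the gap of a transition matrix governs how fast its powers converge:

```python
import numpy as np

def spectral_gap(A):
    """Difference between the two largest eigenvalue moduli of A."""
    moduli = sorted(abs(np.linalg.eigvals(A)), reverse=True)
    return moduli[0] - moduli[1]

# Transition matrix of a two-state Markov chain: eigenvalues 1 and 0.5,
# so the gap of 0.5 sets the mixing rate of the chain.
P = np.array([[0.75, 0.25],
              [0.25, 0.75]])
print(spectral_gap(P))    # 0.5
```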
Banach Algebra
In mathematics, especially functional analysis, a Banach algebra, named after Stefan Banach, is an associative algebra A over the real or complex numbers (or over a non-Archimedean complete normed field) that at the same time is also a Banach space, that is, a normed space that is complete in the metric induced by the norm. The norm is required to satisfy
:\|xy\| \leq \|x\| \, \|y\| \quad \text{for all } x, y \in A.
This ensures that the multiplication operation is continuous. A Banach algebra is called ''unital'' if it has an identity element for the multiplication whose norm is 1, and ''commutative'' if its multiplication is commutative. Any Banach algebra A (whether it has an identity element or not) can be embedded isometrically into a unital Banach algebra A_e so as to form a closed ideal of A_e. Often one assumes ''a priori'' that the algebra under consideration is unital, for one can develop much of the theory by considering A_e and then applying the outcome in the ori ...
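A concrete instance of this definition: the n × n matrices with the operator (spectral) norm form a unital Banach algebra, and the defining inequality can be spot-checked numerically. A sketch (NumPy assumed; the random dimension-3 matrices are illustrative):

```python
import numpy as np

# The n x n real matrices with the operator norm form a unital Banach
# algebra; check the sub-multiplicative inequality on random samples.
rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal((3, 3))
    y = rng.standard_normal((3, 3))
    lhs = np.linalg.norm(x @ y, 2)
    rhs = np.linalg.norm(x, 2) * np.linalg.norm(y, 2)
    assert lhs <= rhs + 1e-12
    print(f"||xy|| = {lhs:.4f} <= {rhs:.4f} = ||x|| ||y||")
```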
Israel Gelfand
Israel Moiseevich Gelfand, also written Israïl Moyseyovich Gel'fand, or Izrail M. Gelfand (Yiddish: ישראל געלפֿאַנד; Russian: Изра́иль Моисе́евич Гельфа́нд; Ukrainian: Ізраїль Мойсейович Гельфанд; – 5 October 2009) was a prominent Soviet-American mathematician. He made significant contributions to many branches of mathematics, including group theory, representation theory and functional analysis. The recipient of many awards, including the Order of Lenin and the first Wolf Prize, he was a Foreign Fellow of the Royal Society and professor at Moscow State University and, after immigrating to the United States shortly before his 76th birthday, at Rutgers University. Gelfand was also a 1994 MacArthur Fellow. His legacy continues through his students, who include Endre Szemerédi, Alexandre Kirillov, Edward Frenkel, Joseph Bernstein, David Kazhdan, as well as his own son, Sergei Gelfand.

Early years

A native of Kherson Go ...
Jordan Normal Form
In linear algebra, a Jordan normal form, also known as a Jordan canonical form (JCF), is an upper triangular matrix of a particular form called a Jordan matrix representing a linear operator on a finite-dimensional vector space with respect to some basis. Such a matrix has each non-zero off-diagonal entry equal to 1, located immediately above the main diagonal (on the superdiagonal), with identical diagonal entries to its left and below it. Let ''V'' be a vector space over a field ''K''. Then a basis with respect to which the matrix has the required form exists if and only if all eigenvalues of the matrix lie in ''K'', or equivalently if the characteristic polynomial of the operator splits into linear factors over ''K''. This condition is always satisfied if ''K'' is algebraically closed (for instance, if it is the field of complex numbers). The diagonal entries of the normal form are the eigenvalues (of the operator), and the number of times each eigenvalue occurs is called th ...
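For a computational view (a sketch, not from the excerpt; SymPy's Matrix.jordan_form is assumed available), the decomposition A = P J P^{-1} can be computed exactly over the rationals:

```python
from sympy import Matrix

# A matrix with the single eigenvalue 2 of algebraic multiplicity 3 but
# only two independent eigenvectors, so J has a 2x2 and a 1x1 Jordan block.
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 2]])
P, J = A.jordan_form()          # returns (P, J) with A = P * J * P^{-1}
print(J)
assert A == P * J * P.inv()     # exact rational arithmetic, no round-off
```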
Eigenvalue
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by \lambda, is the factor by which the eigenvector is scaled. Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.

Formal definition

If ''T'' is a linear transformation from a vector space ''V'' over a field ''F'' into itself and \mathbf{v} is a nonzero vector in ''V'', then \mathbf{v} is an eigenvector of ''T'' if T(\mathbf{v}) is a scalar multiple of \mathbf{v}. This can be written as T(\mathbf{v}) = \lambda \mathbf{v}, where \lambda is a scalar in ''F'', known as the eigenvalue, characteristic value, or characteristic root ass ...
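The defining equation is directly checkable in code. A minimal sketch (NumPy assumed; the 2 × 2 matrix is an illustrative choice):

```python
import numpy as np

# An upper-triangular stretch: the eigenvectors give invariant directions.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors come as columns

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Verify the defining equation A v = lambda v for each pair.
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:.1f}, v = {v}")
```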
Adjacency Matrix
In graph theory and computer science, an adjacency matrix is a square matrix used to represent a finite graph. The elements of the matrix indicate whether pairs of vertices are adjacent or not in the graph. In the special case of a finite simple graph, the adjacency matrix is a (0,1)-matrix with zeros on its diagonal. If the graph is undirected (i.e. all of its edges are bidirectional), the adjacency matrix is symmetric. The relationship between a graph and the eigenvalues and eigenvectors of its adjacency matrix is studied in spectral graph theory. The adjacency matrix of a graph should be distinguished from its incidence matrix, a different matrix representation whose elements indicate whether vertex–edge pairs are incident or not, and from its degree matrix, which contains information about the degree of each vertex.

Definition

For a simple graph with vertex set U = \{u_1, \ldots, u_n\}, the adjacency matrix is a square n \times n matrix A such that its element A_{ij} is one when there is an edge from vertex u_i to ...
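As a small sketch tying this back to the spectral notions above (NumPy assumed; the 4-cycle is an illustrative graph, not from the excerpt):

```python
import numpy as np

# Adjacency matrix of the undirected 4-cycle 0-1-2-3-0.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1        # undirected graph: symmetric matrix

print(A)
# Spectral graph theory: the eigenvalues of the cycle C_4 are 2, 0, 0, -2;
# the largest equals the common degree because the graph is 2-regular.
print(np.linalg.eigvalsh(A))
```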