In linear algebra, a generalized eigenvector of an n\times n matrix A is a vector which satisfies certain criteria which are more relaxed than those for an (ordinary) eigenvector.

Let V be an n-dimensional vector space; let \phi be a linear map from V into itself; and let A be the matrix representation of \phi with respect to some ordered basis.

There may not always exist a full set of n linearly independent eigenvectors of A that form a complete basis for V. That is, the matrix A may not be diagonalizable. This happens when the algebraic multiplicity of at least one eigenvalue \lambda_i is greater than its geometric multiplicity (the nullity of the matrix (A-\lambda_i I), or the dimension of its nullspace). In this case, \lambda_i is called a defective eigenvalue and A is called a defective matrix.

A generalized eigenvector x_i corresponding to \lambda_i, together with the matrix (A-\lambda_i I), generates a Jordan chain of linearly independent generalized eigenvectors which form a basis for an invariant subspace of V.

Using generalized eigenvectors, a set of linearly independent eigenvectors of A can be extended, if necessary, to a complete basis for V. This basis can be used to determine an "almost diagonal" matrix J in Jordan normal form, similar to A, which is useful in computing certain matrix functions of A. The matrix J is also useful in solving the system of linear differential equations \mathbf x' = A \mathbf x, where A need not be diagonalizable.

The dimension of the generalized eigenspace corresponding to a given eigenvalue \lambda is the algebraic multiplicity of \lambda.


Overview and definition

There are several equivalent ways to define an ordinary eigenvector. For our purposes, an eigenvector \mathbf u associated with an eigenvalue \lambda of an n × n matrix A is a nonzero vector for which (A - \lambda I) \mathbf u = \mathbf 0, where I is the n × n identity matrix and \mathbf 0 is the zero vector of length n. That is, \mathbf u is in the kernel of the transformation (A - \lambda I). If A has n linearly independent eigenvectors, then A is similar to a diagonal matrix D. That is, there exists an invertible matrix M such that A is diagonalizable through the similarity transformation D = M^{-1}AM. The matrix D is called a spectral matrix for A. The matrix M is called a modal matrix for A. Diagonalizable matrices are of particular interest since matrix functions of them can be computed easily. On the other hand, if A does not have n linearly independent eigenvectors associated with it, then A is not diagonalizable.

Definition: A vector \mathbf x_m is a generalized eigenvector of rank ''m'' of the matrix A corresponding to the eigenvalue \lambda if
: (A - \lambda I)^m \mathbf x_m = \mathbf 0
but
: (A - \lambda I)^{m-1} \mathbf x_m \ne \mathbf 0.

Clearly, a generalized eigenvector of rank 1 is an ordinary eigenvector. Every n × n matrix A has n linearly independent generalized eigenvectors associated with it and can be shown to be similar to an "almost diagonal" matrix J in Jordan normal form. That is, there exists an invertible matrix M such that J = M^{-1}AM. The matrix M in this case is called a generalized modal matrix for A. If \lambda is an eigenvalue of algebraic multiplicity \mu, then A will have \mu linearly independent generalized eigenvectors corresponding to \lambda. These results, in turn, provide a straightforward method for computing certain matrix functions of A.

Note: For an n \times n matrix A over a field F to be expressed in Jordan normal form, all eigenvalues of A must be in F. That is, the characteristic polynomial f(x) must factor completely into linear factors. For example, if A has real-valued elements, then it may be necessary for the eigenvalues and the components of the eigenvectors to have complex values.

The set spanned by all generalized eigenvectors for a given \lambda forms the generalized eigenspace for \lambda.


Examples

Here are some examples to illustrate the concept of generalized eigenvectors. Some of the details will be described later.


Example 1

This example is simple but clearly illustrates the point. This type of matrix is used frequently in textbooks. Suppose
: A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}.
Then there is only one eigenvalue, \lambda = 1, and its algebraic multiplicity is ''m'' = 2. Notice that this matrix is in Jordan normal form but is not diagonal. Hence, this matrix is not diagonalizable. Since there is one superdiagonal entry, there will be one generalized eigenvector of rank greater than 1 (or one could note that the vector space V is of dimension 2, so there can be at most one generalized eigenvector of rank greater than 1). Alternatively, one could compute the dimension of the nullspace of A - \lambda I to be ''p'' = 1, and thus there are ''m'' − ''p'' = 1 generalized eigenvectors of rank greater than 1.

The ordinary eigenvector \mathbf v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix} is computed as usual (see the eigenvector page for examples). Using this eigenvector, we compute the generalized eigenvector \mathbf v_2 by solving
: (A - \lambda I) \mathbf v_2 = \mathbf v_1.
Writing out the values:
: \left(\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} - 1 \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\right)\begin{pmatrix} v_{21} \\ v_{22} \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} v_{21} \\ v_{22} \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}.
This simplifies to
: v_{22} = 1.
The element v_{21} has no restrictions. The generalized eigenvector of rank 2 is then \mathbf v_2 = \begin{pmatrix} a \\ 1 \end{pmatrix}, where ''a'' can have any scalar value. The choice of ''a'' = 0 is usually the simplest.

Note that
: (A - \lambda I) \mathbf v_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} a \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \mathbf v_1,
so that \mathbf v_2 is a generalized eigenvector,
: (A - \lambda I) \mathbf v_1 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} = \mathbf 0,
so that \mathbf v_1 is an ordinary eigenvector, and that \mathbf v_1 and \mathbf v_2 are linearly independent and hence constitute a basis for the vector space V.
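These relations are easy to check numerically. The following sketch (assuming NumPy is available) verifies the defining properties of \mathbf v_1 and \mathbf v_2 for the choice ''a'' = 0:

```python
import numpy as np

A = np.array([[1., 1.],
              [0., 1.]])
lam = 1.0
v1 = np.array([1., 0.])  # ordinary eigenvector
v2 = np.array([0., 1.])  # generalized eigenvector of rank 2 (choice a = 0)

B = A - lam * np.eye(2)
print(np.allclose(B @ v2, v1))      # True: (A - I) v2 = v1
print(np.allclose(B @ v1, 0))       # True: (A - I) v1 = 0
print(np.allclose(B @ (B @ v2), 0)) # True: (A - I)^2 v2 = 0, so v2 has rank 2
```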


Example 2

This example is more complex than Example 1. Unfortunately, it is a little difficult to construct an interesting example of low order. The matrix
: A = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 3 & 1 & 0 & 0 & 0 \\ 6 & 3 & 2 & 0 & 0 \\ 10 & 6 & 3 & 2 & 0 \\ 15 & 10 & 6 & 3 & 2 \end{pmatrix}
has ''eigenvalues'' \lambda_1 = 1 and \lambda_2 = 2 with ''algebraic multiplicities'' \mu_1 = 2 and \mu_2 = 3, but ''geometric multiplicities'' \gamma_1 = 1 and \gamma_2 = 1.

The ''generalized eigenspaces'' of A are calculated below. \mathbf x_1 is the ordinary eigenvector associated with \lambda_1. \mathbf x_2 is a generalized eigenvector associated with \lambda_1. \mathbf y_1 is the ordinary eigenvector associated with \lambda_2. \mathbf y_2 and \mathbf y_3 are generalized eigenvectors associated with \lambda_2.

:(A - 1I) \mathbf x_1 = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 3 & 0 & 0 & 0 & 0 \\ 6 & 3 & 1 & 0 & 0 \\ 10 & 6 & 3 & 1 & 0 \\ 15 & 10 & 6 & 3 & 1 \end{pmatrix}\begin{pmatrix} 0 \\ 3 \\ -9 \\ 9 \\ -3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 0 \end{pmatrix} = \mathbf 0,

:(A - 1I) \mathbf x_2 = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 3 & 0 & 0 & 0 & 0 \\ 6 & 3 & 1 & 0 & 0 \\ 10 & 6 & 3 & 1 & 0 \\ 15 & 10 & 6 & 3 & 1 \end{pmatrix}\begin{pmatrix} 1 \\ -15 \\ 30 \\ -1 \\ -45 \end{pmatrix} = \begin{pmatrix} 0 \\ 3 \\ -9 \\ 9 \\ -3 \end{pmatrix} = \mathbf x_1,

:(A - 2I) \mathbf y_1 = \begin{pmatrix} -1 & 0 & 0 & 0 & 0 \\ 3 & -1 & 0 & 0 & 0 \\ 6 & 3 & 0 & 0 & 0 \\ 10 & 6 & 3 & 0 & 0 \\ 15 & 10 & 6 & 3 & 0 \end{pmatrix}\begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 9 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 0 \end{pmatrix} = \mathbf 0,

:(A - 2I) \mathbf y_2 = \begin{pmatrix} -1 & 0 & 0 & 0 & 0 \\ 3 & -1 & 0 & 0 & 0 \\ 6 & 3 & 0 & 0 & 0 \\ 10 & 6 & 3 & 0 & 0 \\ 15 & 10 & 6 & 3 & 0 \end{pmatrix}\begin{pmatrix} 0 \\ 0 \\ 0 \\ 3 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 9 \end{pmatrix} = \mathbf y_1,

:(A - 2I) \mathbf y_3 = \begin{pmatrix} -1 & 0 & 0 & 0 & 0 \\ 3 & -1 & 0 & 0 & 0 \\ 6 & 3 & 0 & 0 & 0 \\ 10 & 6 & 3 & 0 & 0 \\ 15 & 10 & 6 & 3 & 0 \end{pmatrix}\begin{pmatrix} 0 \\ 0 \\ 1 \\ -2 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 3 \\ 0 \end{pmatrix} = \mathbf y_2.
This results in a basis for each of the ''generalized eigenspaces'' of A. Together the two ''chains'' of generalized eigenvectors span the space of all 5-dimensional column vectors.
: \left\{ \mathbf x_1, \mathbf x_2 \right\} = \left\{ \begin{pmatrix} 0 \\ 3 \\ -9 \\ 9 \\ -3 \end{pmatrix}, \begin{pmatrix} 1 \\ -15 \\ 30 \\ -1 \\ -45 \end{pmatrix} \right\}, \qquad \left\{ \mathbf y_1, \mathbf y_2, \mathbf y_3 \right\} = \left\{ \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 9 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 0 \\ 3 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ -2 \\ 0 \end{pmatrix} \right\}.
An "almost diagonal" matrix J in ''Jordan normal form'', similar to A, is obtained as follows:
: M = \begin{pmatrix} \mathbf x_1 & \mathbf x_2 & \mathbf y_1 & \mathbf y_2 & \mathbf y_3 \end{pmatrix} = \begin{pmatrix} 0 & 1 & 0 & 0 & 0 \\ 3 & -15 & 0 & 0 & 0 \\ -9 & 30 & 0 & 0 & 1 \\ 9 & -1 & 0 & 3 & -2 \\ -3 & -45 & 9 & 0 & 0 \end{pmatrix},
: J = \begin{pmatrix} 1 & 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 2 & 1 & 0 \\ 0 & 0 & 0 & 2 & 1 \\ 0 & 0 & 0 & 0 & 2 \end{pmatrix},
where M is a generalized modal matrix for A, the columns of M are a canonical basis for A, and AM = MJ.
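The relation AM = MJ can be confirmed numerically; the following sketch (assuming NumPy) checks it for the matrices above:

```python
import numpy as np

A = np.array([[ 1,  0, 0, 0, 0],
              [ 3,  1, 0, 0, 0],
              [ 6,  3, 2, 0, 0],
              [10,  6, 3, 2, 0],
              [15, 10, 6, 3, 2]], dtype=float)
# Columns of M: x1, x2, y1, y2, y3 (the two Jordan chains)
M = np.array([[ 0,   1, 0, 0,  0],
              [ 3, -15, 0, 0,  0],
              [-9,  30, 0, 0,  1],
              [ 9,  -1, 0, 3, -2],
              [-3, -45, 9, 0,  0]], dtype=float)
J = np.array([[1, 1, 0, 0, 0],
              [0, 1, 0, 0, 0],
              [0, 0, 2, 1, 0],
              [0, 0, 0, 2, 1],
              [0, 0, 0, 0, 2]], dtype=float)

print(np.allclose(A @ M, M @ J))  # True
```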


Jordan chains

Definition: Let \mathbf x_m be a generalized eigenvector of rank ''m'' corresponding to the matrix A and the eigenvalue \lambda. The chain generated by \mathbf x_m is the set of vectors \left\{ \mathbf x_m, \mathbf x_{m-1}, \ldots , \mathbf x_1 \right\} given by
: \mathbf x_{m-1} = (A - \lambda I) \mathbf x_m,
: \mathbf x_{m-2} = (A - \lambda I)^2 \mathbf x_m = (A - \lambda I) \mathbf x_{m-1},
: \vdots
: \mathbf x_1 = (A - \lambda I)^{m-1} \mathbf x_m = (A - \lambda I) \mathbf x_2.
Thus, in general,
: \mathbf x_j = (A - \lambda I)^{m-j} \mathbf x_m = (A - \lambda I) \mathbf x_{j+1} \qquad (j = 1, 2, \ldots , m-1).
The vector \mathbf x_j, given by this relation, is a generalized eigenvector of rank ''j'' corresponding to the eigenvalue \lambda. A chain is a linearly independent set of vectors.
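The chain construction above can be carried out mechanically. The following sketch (assuming NumPy, and assuming the input really is a generalized eigenvector of finite rank so that the loop terminates) generates the chain by repeatedly applying (A - \lambda I):

```python
import numpy as np

def jordan_chain(A, lam, x_m):
    """Return the Jordan chain [x_1, ..., x_m] generated by the
    rank-m generalized eigenvector x_m for the eigenvalue lam."""
    B = A - lam * np.eye(A.shape[0])
    chain = [np.asarray(x_m, dtype=float)]
    # Keep applying (A - lam I) until the result is the zero vector;
    # the last nonzero vector produced is the ordinary eigenvector x_1.
    while not np.allclose(B @ chain[-1], 0):
        chain.append(B @ chain[-1])
    return chain[::-1]  # reorder so that x_1 comes first

# The matrix of Example 1: (0, 1)^T is a generalized eigenvector of rank 2.
A = np.array([[1., 1.], [0., 1.]])
chain = jordan_chain(A, 1.0, [0., 1.])
print(len(chain))  # 2
print(chain[0])    # [1. 0.]  (the ordinary eigenvector x_1)
```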


Canonical basis

Definition: A set of ''n'' linearly independent generalized eigenvectors is a canonical basis if it is composed entirely of Jordan chains. Thus, once we have determined that a generalized eigenvector of rank ''m'' is in a canonical basis, it follows that the ''m'' − 1 vectors \mathbf x_{m-1}, \mathbf x_{m-2}, \ldots , \mathbf x_1 that are in the Jordan chain generated by \mathbf x_m are also in the canonical basis.

Let \lambda_i be an eigenvalue of A of algebraic multiplicity \mu_i. First, find the ranks of the matrices (A - \lambda_i I), (A - \lambda_i I)^2, \ldots , (A - \lambda_i I)^{m_i}. The integer m_i is determined to be the ''first integer'' for which (A - \lambda_i I)^{m_i} has rank n - \mu_i (''n'' being the number of rows or columns of A, that is, A is ''n'' × ''n''). Now define
: \rho_k = \operatorname{rank}(A - \lambda_i I)^{k-1} - \operatorname{rank}(A - \lambda_i I)^k \qquad (k = 1, 2, \ldots , m_i).
The variable \rho_k designates the number of linearly independent generalized eigenvectors of rank ''k'' corresponding to the eigenvalue \lambda_i that will appear in a canonical basis for A. Note that
: \operatorname{rank}(A - \lambda_i I)^0 = \operatorname{rank}(I) = n.


Computation of generalized eigenvectors

In the preceding sections we have seen techniques for obtaining the n linearly independent generalized eigenvectors of a canonical basis for the vector space V associated with an n \times n matrix A. These techniques can be combined into a procedure: :Solve the characteristic equation of A for eigenvalues \lambda_i and their algebraic multiplicities \mu_i ; :For each \lambda_i : ::Determine n - \mu_i; ::Determine m_i; ::Determine \rho_k for (k = 1, \ldots , m_i); ::Determine each Jordan chain for \lambda_i;
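The rank computations in this procedure can be sketched in code. The function below (assuming NumPy; `rank_profile` is a hypothetical helper name, and numerical matrix ranks are reliable here only because the example entries are small integers) returns m_i and the counts \rho_k for a given eigenvalue and its algebraic multiplicity:

```python
import numpy as np

def rank_profile(A, lam, mu):
    """Return (m_i, [rho_1, ..., rho_{m_i}]) for the eigenvalue lam
    of algebraic multiplicity mu of the square matrix A."""
    n = A.shape[0]
    B = A - lam * np.eye(n)
    ranks = [n]        # rank of (A - lam I)^0 = rank(I) = n
    P = np.eye(n)
    m = 0
    # m_i is the first power at which the rank drops to n - mu.
    while True:
        m += 1
        P = P @ B
        ranks.append(np.linalg.matrix_rank(P))
        if ranks[-1] == n - mu:
            break
    rho = [ranks[k - 1] - ranks[k] for k in range(1, m + 1)]
    return m, rho

# The matrix of Example 3 below, with lam = 5, mu = 3:
A = np.array([[5., 1., -2., 4.],
              [0., 5., 2., 2.],
              [0., 0., 5., 3.],
              [0., 0., 0., 4.]])
m1, rho = rank_profile(A, 5.0, 3)
print(m1, rho)  # 3 [1, 1, 1]
```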


Example 3

The matrix
: A = \begin{pmatrix} 5 & 1 & -2 & 4 \\ 0 & 5 & 2 & 2 \\ 0 & 0 & 5 & 3 \\ 0 & 0 & 0 & 4 \end{pmatrix}
has an eigenvalue \lambda_1 = 5 of algebraic multiplicity \mu_1 = 3 and an eigenvalue \lambda_2 = 4 of algebraic multiplicity \mu_2 = 1. We also have n = 4. For \lambda_1 we have n - \mu_1 = 4 - 3 = 1.
: (A - 5I) = \begin{pmatrix} 0 & 1 & -2 & 4 \\ 0 & 0 & 2 & 2 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & -1 \end{pmatrix}, \qquad \operatorname{rank}(A - 5I) = 3.
: (A - 5I)^2 = \begin{pmatrix} 0 & 0 & 2 & -8 \\ 0 & 0 & 0 & 4 \\ 0 & 0 & 0 & -3 \\ 0 & 0 & 0 & 1 \end{pmatrix}, \qquad \operatorname{rank}(A - 5I)^2 = 2.
: (A - 5I)^3 = \begin{pmatrix} 0 & 0 & 0 & 14 \\ 0 & 0 & 0 & -4 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & -1 \end{pmatrix}, \qquad \operatorname{rank}(A - 5I)^3 = 1.
The first integer m_1 for which (A - 5I)^{m_1} has rank n - \mu_1 = 1 is m_1 = 3. We now define
: \rho_3 = \operatorname{rank}(A - 5I)^2 - \operatorname{rank}(A - 5I)^3 = 2 - 1 = 1,
: \rho_2 = \operatorname{rank}(A - 5I)^1 - \operatorname{rank}(A - 5I)^2 = 3 - 2 = 1,
: \rho_1 = \operatorname{rank}(A - 5I)^0 - \operatorname{rank}(A - 5I)^1 = 4 - 3 = 1.
Consequently, there will be three linearly independent generalized eigenvectors; one each of ranks 3, 2 and 1. Since \lambda_1 corresponds to a single chain of three linearly independent generalized eigenvectors, we know that there is a generalized eigenvector \mathbf x_3 of rank 3 corresponding to \lambda_1 such that
: (A - 5I)^3 \mathbf x_3 = \mathbf 0
but
: (A - 5I)^2 \mathbf x_3 \ne \mathbf 0.
These two conditions represent linear systems that can be solved for \mathbf x_3. Let
: \mathbf x_3 = \begin{pmatrix} x_{31} \\ x_{32} \\ x_{33} \\ x_{34} \end{pmatrix}.
Then
: (A - 5I)^3 \mathbf x_3 = \begin{pmatrix} 0 & 0 & 0 & 14 \\ 0 & 0 & 0 & -4 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & -1 \end{pmatrix} \begin{pmatrix} x_{31} \\ x_{32} \\ x_{33} \\ x_{34} \end{pmatrix} = \begin{pmatrix} 14 x_{34} \\ -4 x_{34} \\ 3 x_{34} \\ -x_{34} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \end{pmatrix}
and
: (A - 5I)^2 \mathbf x_3 = \begin{pmatrix} 0 & 0 & 2 & -8 \\ 0 & 0 & 0 & 4 \\ 0 & 0 & 0 & -3 \\ 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x_{31} \\ x_{32} \\ x_{33} \\ x_{34} \end{pmatrix} = \begin{pmatrix} 2 x_{33} - 8 x_{34} \\ 4 x_{34} \\ -3 x_{34} \\ x_{34} \end{pmatrix} \ne \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \end{pmatrix}.
Thus, in order to satisfy both conditions, we must have x_{34} = 0 and x_{33} \ne 0.

No restrictions are placed on x_{31} and x_{32}. By choosing x_{31} = x_{32} = x_{34} = 0 and x_{33} = 1, we obtain
: \mathbf x_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix}
as a generalized eigenvector of rank 3 corresponding to \lambda_1 = 5. Note that it is possible to obtain infinitely many other generalized eigenvectors of rank 3 by choosing different values of x_{31}, x_{32} and x_{33}, with x_{33} \ne 0. Our first choice, however, is the simplest. Now applying the chain relations, we obtain \mathbf x_2 and \mathbf x_1 as generalized eigenvectors of rank 2 and 1, respectively, where
: \mathbf x_2 = (A - 5I) \mathbf x_3 = \begin{pmatrix} -2 \\ 2 \\ 0 \\ 0 \end{pmatrix},
and
: \mathbf x_1 = (A - 5I) \mathbf x_2 = \begin{pmatrix} 2 \\ 0 \\ 0 \\ 0 \end{pmatrix}.
The simple eigenvalue \lambda_2 = 4 can be dealt with using standard techniques and has the ordinary eigenvector
: \mathbf y_1 = \begin{pmatrix} -14 \\ 4 \\ -3 \\ 1 \end{pmatrix}.
A canonical basis for A is
: \left\{ \mathbf x_1, \mathbf x_2, \mathbf x_3, \mathbf y_1 \right\} = \left\{ \begin{pmatrix} 2 \\ 0 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} -2 \\ 2 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} -14 \\ 4 \\ -3 \\ 1 \end{pmatrix} \right\}.
\mathbf x_1, \mathbf x_2 and \mathbf x_3 are generalized eigenvectors associated with \lambda_1, while \mathbf y_1 is the ordinary eigenvector associated with \lambda_2.

This is a fairly simple example. In general, the numbers \rho_k of linearly independent generalized eigenvectors of rank k will not always be equal. That is, there may be several chains of different lengths corresponding to a particular eigenvalue.
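The chain computed above can be verified numerically (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[5., 1., -2., 4.],
              [0., 5., 2., 2.],
              [0., 0., 5., 3.],
              [0., 0., 0., 4.]])
B = A - 5 * np.eye(4)
x3 = np.array([0., 0., 1., 0.])  # generalized eigenvector of rank 3
x2 = B @ x3                      # rank 2
x1 = B @ x2                      # rank 1 (ordinary eigenvector)
print(x2)                        # [-2.  2.  0.  0.]
print(x1)                        # [ 2.  0.  0.  0.]
print(np.allclose(B @ x1, 0))    # True: x1 is an ordinary eigenvector
```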


Generalized modal matrix

Let A be an ''n'' × ''n'' matrix. A generalized modal matrix M for A is an ''n'' × ''n'' matrix whose columns, considered as vectors, form a canonical basis for A and appear in M according to the following rules: * All Jordan chains consisting of one vector (that is, one vector in length) appear in the first columns of M. * All vectors of one chain appear together in adjacent columns of M. * Each chain appears in M in order of increasing rank (that is, the generalized eigenvector of rank 1 appears before the generalized eigenvector of rank 2 of the same chain, which appears before the generalized eigenvector of rank 3 of the same chain, etc.).


Jordan normal form

Let V be an ''n''-dimensional vector space; let \phi be a linear map from V into itself; and let A be the matrix representation of \phi with respect to some ordered basis. It can be shown that if the characteristic polynomial f(\lambda) of A factors into linear factors, so that f(\lambda) has the form
: f(\lambda) = \pm (\lambda - \lambda_1)^{\mu_1}(\lambda - \lambda_2)^{\mu_2} \cdots (\lambda - \lambda_r)^{\mu_r},
where \lambda_1, \lambda_2, \ldots , \lambda_r are the distinct eigenvalues of A, then each \mu_i is the algebraic multiplicity of its corresponding eigenvalue \lambda_i and A is similar to a matrix J in Jordan normal form, where each \lambda_i appears \mu_i consecutive times on the diagonal, and the entry directly above each \lambda_i (that is, on the superdiagonal) is either 0 or 1. All other entries (that is, off the diagonal and superdiagonal) are 0. More precisely, J is a Jordan matrix whose Jordan blocks corresponding to the same eigenvalue are grouped together (but no ordering is imposed among the eigenvalues, nor among the blocks for a given eigenvalue). The matrix J is as close as one can come to a diagonalization of A. If A is diagonalizable, then all entries above the diagonal are zero. Note that some textbooks have the ones on the subdiagonal, that is, immediately below the main diagonal instead of on the superdiagonal. The eigenvalues are still on the main diagonal.

Every ''n'' × ''n'' matrix A is similar to a matrix J in Jordan normal form, obtained through the similarity transformation J = M^{-1}AM, where M is a generalized modal matrix for A. (See the note above.)


Example 4

Find a matrix in Jordan normal form that is similar to
: A = \begin{pmatrix} 0 & 4 & 2 \\ -3 & 8 & 3 \\ 4 & -8 & -2 \end{pmatrix}.
Solution: The characteristic equation of A is (\lambda - 2)^3 = 0; hence, \lambda = 2 is an eigenvalue of algebraic multiplicity three. Following the procedures of the previous sections, we find that
: \operatorname{rank}(A - 2I) = 1
and
: \operatorname{rank}(A - 2I)^2 = 0 = n - \mu.
Thus, \rho_2 = 1 and \rho_1 = 2, which implies that a canonical basis for A will contain one linearly independent generalized eigenvector of rank 2 and two linearly independent generalized eigenvectors of rank 1, or equivalently, one chain of two vectors \left\{ \mathbf x_1, \mathbf x_2 \right\} and one chain of one vector \left\{ \mathbf y_1 \right\}. Designating M = \begin{pmatrix} \mathbf y_1 & \mathbf x_1 & \mathbf x_2 \end{pmatrix}, we find that
: M = \begin{pmatrix} 2 & 2 & 0 \\ 1 & 3 & 0 \\ 0 & -4 & 1 \end{pmatrix}
and
: J = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2 \end{pmatrix},
where M is a generalized modal matrix for A, the columns of M are a canonical basis for A, and AM = MJ. Note that since generalized eigenvectors themselves are not unique, and since some of the columns of both M and J may be interchanged, it follows that both M and J are not unique.
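The result can be checked by verifying AM = MJ directly (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[ 0.,  4.,  2.],
              [-3.,  8.,  3.],
              [ 4., -8., -2.]])
# Columns of M: y1 (the lone chain), then the chain x1, x2
M = np.array([[2.,  2., 0.],
              [1.,  3., 0.],
              [0., -4., 1.]])
J = np.array([[2., 0., 0.],
              [0., 2., 1.],
              [0., 0., 2.]])

print(np.allclose(A @ M, M @ J))  # True
```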


Example 5

In Example 3, we found a canonical basis of linearly independent generalized eigenvectors for a matrix A. A generalized modal matrix for A is
: M = \begin{pmatrix} \mathbf y_1 & \mathbf x_1 & \mathbf x_2 & \mathbf x_3 \end{pmatrix} = \begin{pmatrix} -14 & 2 & -2 & 0 \\ 4 & 0 & 2 & 0 \\ -3 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{pmatrix}.
A matrix in Jordan normal form, similar to A, is
: J = \begin{pmatrix} 4 & 0 & 0 & 0 \\ 0 & 5 & 1 & 0 \\ 0 & 0 & 5 & 1 \\ 0 & 0 & 0 & 5 \end{pmatrix},
so that AM = MJ.
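As a numerical check (a sketch assuming NumPy), computing M^{-1}AM recovers J directly:

```python
import numpy as np

# The matrix A of Example 3 and its generalized modal matrix M (columns: y1, x1, x2, x3)
A = np.array([[5., 1., -2., 4.],
              [0., 5., 2., 2.],
              [0., 0., 5., 3.],
              [0., 0., 0., 4.]])
M = np.array([[-14., 2., -2., 0.],
              [  4., 0.,  2., 0.],
              [ -3., 0.,  0., 1.],
              [  1., 0.,  0., 0.]])
J = np.array([[4., 0., 0., 0.],
              [0., 5., 1., 0.],
              [0., 0., 5., 1.],
              [0., 0., 0., 5.]])

print(np.allclose(A @ M, M @ J))                 # True
print(np.allclose(np.linalg.inv(M) @ A @ M, J))  # True
```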


Applications


Matrix functions

Three of the most fundamental operations which can be performed on square matrices are matrix addition, multiplication by a scalar, and matrix multiplication. These are exactly the operations necessary for defining a polynomial function of an ''n'' × ''n'' matrix A. Recalling from basic calculus that many functions can be written as a Maclaurin series, we can define more general functions of matrices quite easily. If A is diagonalizable, that is
: D = M^{-1}AM,
with
: D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix},
then
: D^k = \begin{pmatrix} \lambda_1^k & 0 & \cdots & 0 \\ 0 & \lambda_2^k & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n^k \end{pmatrix}
and the evaluation of the Maclaurin series for functions of A is greatly simplified. For example, to obtain any power ''k'' of A, we need only compute D^k, premultiply D^k by M, and postmultiply the result by M^{-1}. Using generalized eigenvectors, we can obtain the Jordan normal form for A, and these results can be generalized to a straightforward method for computing functions of nondiagonalizable matrices. (See Matrix function#Jordan decomposition.)
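The power computation for a diagonalizable matrix can be sketched as follows (assuming NumPy; the particular matrix here is an illustrative assumption with distinct eigenvalues, hence diagonalizable):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])          # eigenvalues 5 and 2, so A is diagonalizable
w, M = np.linalg.eig(A)           # columns of M are eigenvectors; D = diag(w)
k = 5

# A^k = M D^k M^{-1}: raise only the diagonal entries to the k-th power.
Ak = M @ np.diag(w**k) @ np.linalg.inv(M)
print(np.allclose(Ak, np.linalg.matrix_power(A, k)))  # True
```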


Differential equations

Consider the problem of solving the system of linear ordinary differential equations
: \mathbf x' = A \mathbf x,
where
: \mathbf x = \begin{pmatrix} x_1(t) \\ x_2(t) \\ \vdots \\ x_n(t) \end{pmatrix}, \quad \mathbf x' = \begin{pmatrix} x_1'(t) \\ x_2'(t) \\ \vdots \\ x_n'(t) \end{pmatrix},
and A = (a_{ij}).

If the matrix A is a diagonal matrix so that a_{ij} = 0 for i \ne j, then the system reduces to a system of ''n'' equations of the form
: x_i' = a_{ii} x_i \qquad (i = 1, 2, \ldots , n).
In this case, the general solution is given by
: x_1 = k_1 e^{a_{11}t}
: x_2 = k_2 e^{a_{22}t}
:: \vdots
: x_n = k_n e^{a_{nn}t}.

In the general case, we try to diagonalize A and reduce the system to this diagonal form as follows. If A is diagonalizable, we have D = M^{-1}AM, where M is a modal matrix for A. Substituting A = MDM^{-1}, the system takes the form M^{-1} \mathbf x' = D(M^{-1} \mathbf x), or
: \mathbf y' = D \mathbf y,
where
: \mathbf y = M^{-1} \mathbf x.
The solution of this system is
: y_1 = k_1 e^{\lambda_1 t}
: y_2 = k_2 e^{\lambda_2 t}
:: \vdots
: y_n = k_n e^{\lambda_n t}.
The solution \mathbf x is then obtained using the relation \mathbf x = M \mathbf y.

On the other hand, if A is not diagonalizable, we choose M to be a generalized modal matrix for A, such that J = M^{-1}AM is the Jordan normal form of A. The system \mathbf y' = J \mathbf y has the form
: y_i' = \lambda_i y_i + \epsilon_i y_{i+1} \qquad (i = 1, 2, \ldots , n-1),
: y_n' = \lambda_n y_n,
where the \lambda_i are the eigenvalues from the main diagonal of J and the \epsilon_i are the ones and zeros from the superdiagonal of J. This system is often more easily solved than the original. We may solve the last equation for y_n, obtaining y_n = k_n e^{\lambda_n t}. We then substitute this solution for y_n into the next-to-last equation and solve for y_{n-1}. Continuing this procedure, we work through the system from the last equation to the first, solving the entire system for \mathbf y. The solution \mathbf x is then obtained using the relation \mathbf x = M \mathbf y.
Lemma: Given a chain of generalized eigenvectors \mathbf v_1, \ldots , \mathbf v_r of length r (so that (A - \lambda I)\mathbf v_i = \mathbf v_{i-1}, with \mathbf v_0 = \mathbf 0), the functions
: X_1 = \mathbf v_1 e^{\lambda t}
: X_2 = (t\mathbf v_1 + \mathbf v_2) e^{\lambda t}
: X_3 = \left(\frac{t^2}{2}\mathbf v_1 + t\mathbf v_2 + \mathbf v_3\right) e^{\lambda t}
: \vdots
: X_r = \left(\frac{t^{r-1}}{(r-1)!}\mathbf v_1 + \cdots + t\mathbf v_{r-1} + \mathbf v_r\right) e^{\lambda t}
solve the system of equations
: X' = AX.
Proof: Define the sum
: X_j(t) = e^{\lambda t}\sum_{i=1}^{j}\frac{t^{j-i}}{(j-i)!} \mathbf v_i.
Then
: X_j'(t) = e^{\lambda t}\sum_{i=1}^{j-1}\frac{t^{j-i-1}}{(j-i-1)!}\mathbf v_i + \lambda e^{\lambda t}\sum_{i=1}^{j}\frac{t^{j-i}}{(j-i)!}\mathbf v_i.
On the other hand, since A\mathbf v_i = \mathbf v_{i-1} + \lambda \mathbf v_i, we have
: AX_j(t) = e^{\lambda t}\sum_{i=1}^{j}\frac{t^{j-i}}{(j-i)!}A\mathbf v_i
: = e^{\lambda t}\sum_{i=1}^{j}\frac{t^{j-i}}{(j-i)!}(\mathbf v_{i-1} + \lambda \mathbf v_i)
: = e^{\lambda t}\sum_{i=1}^{j-1}\frac{t^{j-i-1}}{(j-i-1)!}\mathbf v_i + \lambda e^{\lambda t}\sum_{i=1}^{j}\frac{t^{j-i}}{(j-i)!}\mathbf v_i
: = X_j'(t),
as required.
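The lemma can be illustrated numerically with the chain from Example 1. The sketch below (assuming NumPy) checks that X_2(t) = (t\mathbf v_1 + \mathbf v_2)e^{\lambda t} satisfies X_2' = AX_2 at a sample value of t, comparing AX_2 against the derivative computed from the formula:

```python
import numpy as np

A = np.array([[1., 1.],
              [0., 1.]])
lam = 1.0
v1 = np.array([1., 0.])  # ordinary eigenvector
v2 = np.array([0., 1.])  # rank-2 generalized eigenvector

def X2(t):
    # Chain solution of length 2: (t v1 + v2) e^{lam t}
    return (t * v1 + v2) * np.exp(lam * t)

def X2_prime(t):
    # Derivative of (t v1 + v2) e^{lam t} by the product rule
    return (v1 + lam * (t * v1 + v2)) * np.exp(lam * t)

t = 0.7
print(np.allclose(A @ X2(t), X2_prime(t)))  # True
```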

