In linear algebra, a Hessenberg matrix is a special kind of square matrix, one that is "almost" triangular. To be exact, an upper Hessenberg matrix has zero entries below the first subdiagonal, and a lower Hessenberg matrix has zero entries above the first superdiagonal. They are named after Karl Hessenberg.


Definitions


Upper Hessenberg matrix

A square n \times n matrix A is said to be in upper Hessenberg form, or to be an upper Hessenberg matrix, if a_{ij}=0 for all i,j with i > j+1. An upper Hessenberg matrix is called unreduced if all subdiagonal entries are nonzero, i.e. if a_{i+1,i} \neq 0 for all i \in \{1, \ldots, n-1\}.


Lower Hessenberg matrix

A square n \times n matrix A is said to be in lower Hessenberg form, or to be a lower Hessenberg matrix, if its transpose is an upper Hessenberg matrix or, equivalently, if a_{ij}=0 for all i,j with j > i+1. A lower Hessenberg matrix is called unreduced if all superdiagonal entries are nonzero, i.e. if a_{i,i+1} \neq 0 for all i \in \{1, \ldots, n-1\}.
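As a quick illustration of these definitions, the following minimal sketch (Python with NumPy, chosen here only for illustration) tests the two zero patterns and the unreduced condition directly:

```python
import numpy as np

def is_upper_hessenberg(A, tol=0.0):
    # a_{ij} = 0 whenever i > j + 1: everything below the first subdiagonal vanishes
    return np.all(np.abs(np.tril(A, -2)) <= tol)

def is_unreduced_upper_hessenberg(A, tol=0.0):
    # additionally, every subdiagonal entry a_{i+1,i} must be nonzero
    return is_upper_hessenberg(A, tol) and np.all(np.abs(np.diag(A, -1)) > tol)

def is_lower_hessenberg(A, tol=0.0):
    # a matrix is lower Hessenberg exactly when its transpose is upper Hessenberg
    return is_upper_hessenberg(np.asarray(A).T, tol)
```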


Examples

Consider the following matrices: :A=\begin{bmatrix} 1 & 4 & 2 & 3 \\ 3 & 4 & 1 & 7 \\ 0 & 2 & 3 & 4 \\ 0 & 0 & 1 & 3 \end{bmatrix}, \qquad B=\begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 3 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}, \qquad C=\begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 0 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}. The matrix A is an unreduced upper Hessenberg matrix, B is an unreduced lower Hessenberg matrix, and C is a lower Hessenberg matrix that is not unreduced.
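These classifications can be spot-checked numerically; the short NumPy snippet below (not part of the original text) reproduces the three verdicts:

```python
import numpy as np

A = np.array([[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]])
B = np.array([[1, 2, 0, 0], [5, 2, 3, 0], [3, 4, 3, 7], [5, 6, 1, 1]])
C = np.array([[1, 2, 0, 0], [5, 2, 0, 0], [3, 4, 3, 7], [5, 6, 1, 1]])

# A: zero below the first subdiagonal, and every subdiagonal entry is nonzero.
print(np.all(np.tril(A, -2) == 0), np.all(np.diag(A, -1) != 0))   # True True
# B: zero above the first superdiagonal, and every superdiagonal entry is nonzero.
print(np.all(np.triu(B, 2) == 0), np.all(np.diag(B, 1) != 0))     # True True
# C: lower Hessenberg, but the superdiagonal entry c_{23} = 0, so not unreduced.
print(np.all(np.triu(C, 2) == 0), np.all(np.diag(C, 1) != 0))     # True False
```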


Computer programming

Many linear algebra algorithms require significantly less computational effort when applied to triangular matrices, and this improvement often carries over to Hessenberg matrices as well. If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to a triangular one, reduction to Hessenberg form is often the next best thing. In fact, reduction of any matrix to Hessenberg form can be achieved in a finite number of steps, for example through Householder transformations, which are unitary similarity transforms. Subsequent reduction of a Hessenberg matrix to a triangular matrix can then be achieved through iterative procedures such as shifted QR factorization, combined with deflation steps in eigenvalue algorithms. Reducing a general matrix to a Hessenberg matrix and then reducing further to a triangular matrix, instead of directly reducing a general matrix to a triangular matrix, often economizes the arithmetic involved in the QR algorithm for eigenvalue problems.
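To make this workflow concrete, here is a heavily simplified sketch of the pipeline in Python. It is an illustrative assumption about how the pieces fit together, not production code: scipy.linalg.hessenberg performs the finite reduction step, and an explicit Rayleigh-shifted QR iteration with deflation then drives the Hessenberg matrix toward triangular form. For simplicity the sketch assumes real eigenvalues (e.g. a symmetric input); practical implementations use Francis double shifts and exploit the Hessenberg structure implicitly.

```python
import numpy as np
from scipy.linalg import hessenberg

def eig_via_hessenberg_qr(A, tol=1e-12, max_sweeps=500):
    """Toy shifted-QR eigenvalue iteration, started from Hessenberg form."""
    H = hessenberg(np.asarray(A, dtype=float))   # finite (direct) reduction step
    n = H.shape[0]
    eigs = []
    while n > 1:
        for _ in range(max_sweeps):
            # deflate once the last subdiagonal entry is negligible
            if abs(H[n - 1, n - 2]) <= tol * (abs(H[n - 2, n - 2]) + abs(H[n - 1, n - 1])):
                break
            mu = H[n - 1, n - 1]                              # Rayleigh-quotient shift
            Q, R = np.linalg.qr(H[:n, :n] - mu * np.eye(n))   # QR of the shifted block
            H[:n, :n] = R @ Q + mu * np.eye(n)                # similarity step; stays Hessenberg
        eigs.append(H[n - 1, n - 1])
        n -= 1                                                # deflation
    eigs.append(H[0, 0])
    return np.array(eigs)

# Example: a random symmetric matrix (real eigenvalues), compared with NumPy.
rng = np.random.default_rng(0)
S = rng.standard_normal((5, 5))
S = S + S.T
print(np.allclose(np.sort(eig_via_hessenberg_qr(S)), np.sort(np.linalg.eigvalsh(S))))
```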


Reduction to Hessenberg matrix

Any n \times n matrix can be transformed into a Hessenberg matrix by a similarity transformation using Householder transformations. The following procedure for such a transformation is adapted from ''A Second Course in Linear Algebra'' by Garcia & Horn. Let A be any real or complex n \times n matrix, let A^\prime be the (n - 1) \times n submatrix of A constructed by removing the first row of A, and let \mathbf{a}^\prime_1 be the first column of A^\prime. Construct the (n-1) \times (n-1) Householder matrix :V_1 = I_{n-1} - 2\frac{ww^*}{\|w\|^2}, \qquad w = \begin{cases} \|\mathbf{a}^\prime_1\|_2\,\mathbf{e}_1 - \mathbf{a}^\prime_1, & a^\prime_{11} = 0, \\ \|\mathbf{a}^\prime_1\|_2\,\mathbf{e}_1 + \frac{\overline{a^\prime_{11}}}{|a^\prime_{11}|}\,\mathbf{a}^\prime_1, & a^\prime_{11} \neq 0. \end{cases} This Householder matrix maps \mathbf{a}^\prime_1 to a multiple of \mathbf{e}_1 of length \|\mathbf{a}^\prime_1\|_2, and as such, the block matrix :U_1 = \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & V_1 \end{bmatrix} maps the matrix A to the matrix U_1A, which has only zeros below the second entry of the first column. Now construct the (n-2) \times (n-2) Householder matrix V_2 in a similar manner as V_1, such that V_2 maps the first column of A^{\prime\prime} to \|\mathbf{a}^{\prime\prime}_1\|_2\,\mathbf{e}_1, where A^{\prime\prime} is the submatrix of A^\prime constructed by removing the first row and the first column of A^\prime, and then let :U_2 = \begin{bmatrix} 1 & 0 & \mathbf{0} \\ 0 & 1 & \mathbf{0} \\ \mathbf{0} & \mathbf{0} & V_2 \end{bmatrix}, which maps U_1A to the matrix U_2U_1A, which has only zeros below the subdiagonal in its first two columns. Now construct V_3 and then U_3 in a similar manner, but for the matrix A^{\prime\prime\prime} constructed by removing the first row and first column of A^{\prime\prime}, and proceed as in the previous steps. Continue like this for a total of n-2 steps. By construction of U_k, the first k columns of any n \times n matrix are invariant under multiplication by U_k^* from the right, so the zeros produced by the left multiplications are preserved, and any matrix can be transformed to an upper Hessenberg matrix by a similarity transformation of the form :U_{n-2}( \dots (U_2(U_1AU_1^*)U_2^*) \dots )U_{n-2}^* = U_{n-2} \dots U_2U_1A(U_{n-2} \dots U_2U_1)^* = UAU^*.
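A direct transcription of this procedure into NumPy might look as follows. It is a sketch under the assumptions above, forming each Householder matrix explicitly rather than exploiting its structure the way a tuned library routine would:

```python
import numpy as np

def householder_hessenberg(A):
    """Return (H, U) with H = U A U* upper Hessenberg and U unitary."""
    H = np.array(A, dtype=complex)
    n = H.shape[0]
    U = np.eye(n, dtype=complex)
    for k in range(n - 2):
        x = H[k + 1:, k]                      # the column a'_1 of the current submatrix
        nx = np.linalg.norm(x)
        if nx == 0:
            continue                          # already zero below the subdiagonal
        w = x.copy()
        # choose the reflection so that the first entry of w does not cancel
        phase = x[0] / abs(x[0]) if x[0] != 0 else 1.0
        w[0] += phase * nx
        w /= np.linalg.norm(w)
        Vk = np.eye(n - k - 1, dtype=complex) - 2.0 * np.outer(w, w.conj())
        Uk = np.eye(n, dtype=complex)
        Uk[k + 1:, k + 1:] = Vk               # the block matrix diag(I_{k+1}, V_k)
        H = Uk @ H @ Uk.conj().T              # unitary similarity, eigenvalues preserved
        U = Uk @ U
    return H, U

# Quick structural check: everything below the subdiagonal is (numerically) zero.
A = np.random.default_rng(1).standard_normal((5, 5))
H, U = householder_hessenberg(A)
print(np.allclose(np.tril(H, -2), 0), np.allclose(U @ A @ U.conj().T, H))
```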


Properties

For n \in \{1, 2\}, it is vacuously true that every n \times n matrix is both upper Hessenberg and lower Hessenberg. The product of a Hessenberg matrix with a triangular matrix is again Hessenberg. More precisely, if A is upper Hessenberg and T is upper triangular, then AT and TA are upper Hessenberg. A matrix that is both upper Hessenberg and lower Hessenberg is a tridiagonal matrix, of which symmetric or Hermitian Hessenberg matrices are important examples. A Hermitian matrix can be reduced to a tridiagonal real symmetric matrix.
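The short NumPy/SciPy sketch below (illustration only) spot-checks two of these properties numerically: multiplication by an upper triangular matrix preserves upper Hessenberg form, and the Hessenberg reduction carries a Hermitian matrix to a matrix that is also lower Hessenberg, i.e. tridiagonal.

```python
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
n = 6

def upper_hessenberg(M):
    return np.allclose(np.tril(M, -2), 0)

H = np.triu(rng.standard_normal((n, n)), -1)     # random upper Hessenberg matrix
T = np.triu(rng.standard_normal((n, n)))         # random upper triangular matrix
print(upper_hessenberg(H @ T), upper_hessenberg(T @ H))          # True True

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.conj().T) / 2                         # Hermitian test matrix
HA = hessenberg(A)                               # unitary similarity preserves Hermitian symmetry
print(upper_hessenberg(HA) and np.allclose(np.triu(HA, 2), 0))   # True: tridiagonal
```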


Hessenberg operator

The Hessenberg operator is an infinite dimensional Hessenberg matrix. It commonly occurs as the generalization of the Jacobi operator to a system of orthogonal polynomials for the space of square-integrable holomorphic functions over some domain, that is, a Bergman space. In this case, the Hessenberg operator is the right-shift operator S, given by :[Sf](z)=zf(z). The eigenvalues of each principal submatrix of the Hessenberg operator are given by the characteristic polynomial for that submatrix. These polynomials are called the Bergman polynomials, and provide an orthogonal polynomial basis for the Bergman space.
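For a concrete finite section, assume (purely for illustration; this specific choice is not given in the text above) the Bergman space of the unit disk with orthonormal basis e_n(z) = \sqrt{n+1}\, z^n. There the shift acts as S e_n = \sqrt{(n+1)/(n+2)}\, e_{n+1}, so its matrix has a single nonzero subdiagonal, and every principal submatrix is nilpotent, consistent with the Bergman polynomials of the disk being normalized monomials:

```python
import numpy as np

# Finite section of the shift operator on the unit-disk Bergman space,
# in the orthonormal basis e_n(z) = sqrt(n+1) z^n (assumed for illustration).
N = 8
H = np.zeros((N, N))
for n in range(N - 1):
    H[n + 1, n] = np.sqrt((n + 1) / (n + 2))

# Upper Hessenberg structure: everything below the first subdiagonal is zero.
print(np.all(np.tril(H, -2) == 0))        # True

# Each k-by-k principal submatrix is nilpotent, so its characteristic
# polynomial is lambda^k, matching the monomial Bergman polynomials of the disk.
k = 5
print(np.allclose(np.linalg.matrix_power(H[:k, :k], k), 0))   # True
```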


See also

* Hessenberg variety




External links


* Hessenberg matrix at MathWorld.
* High performance algorithms for reduction to condensed (Hessenberg, tridiagonal, bidiagonal) form