In linear algebra, a tridiagonal matrix is a band matrix that has nonzero elements only on the main diagonal, the subdiagonal/lower diagonal (the first diagonal below this), and the supradiagonal/upper diagonal (the first diagonal above the main diagonal).
For example, the following matrix is tridiagonal:
:\begin{pmatrix}
1 & 4 & 0 & 0 \\
3 & 4 & 1 & 0 \\
0 & 2 & 3 & 4 \\
0 & 0 & 1 & 3
\end{pmatrix}
The determinant of a tridiagonal matrix is given by the ''continuant'' of its elements.
An orthogonal transformation of a symmetric (or Hermitian) matrix to tridiagonal form can be done with the Lanczos algorithm.
Properties
A tridiagonal matrix is a matrix that is both an upper and a lower Hessenberg matrix. In particular, a tridiagonal matrix is a direct sum of ''p'' 1-by-1 and ''q'' 2-by-2 matrices such that ''p'' + 2''q'' = ''n'', the dimension of the tridiagonal matrix. Although a general tridiagonal matrix is not necessarily
symmetric or Hermitian, many of those that arise when solving linear algebra problems have one of these properties. Furthermore, if a real tridiagonal matrix ''A'' satisfies ''a''<sub>''k'',''k''+1</sub> ''a''<sub>''k''+1,''k''</sub> > 0 for all ''k'', so that the signs of its entries are symmetric, then it is similar to a Hermitian matrix, by a diagonal change of basis matrix. Hence, its eigenvalues are real. If we replace the strict inequality by ''a''<sub>''k'',''k''+1</sub> ''a''<sub>''k''+1,''k''</sub> ≥ 0, then by continuity, the eigenvalues are still guaranteed to be real, but the matrix need no longer be similar to a Hermitian matrix.
The set of all ''n × n'' tridiagonal matrices forms a ''3n − 2'' dimensional vector space.
Many linear algebra algorithms require significantly less computational effort when applied to diagonal matrices, and this improvement often carries over to tridiagonal matrices as well.
Determinant
The determinant of a tridiagonal matrix ''A'' of order ''n'' can be computed from a three-term recurrence relation. Write ''f''<sub>1</sub> = |''a''<sub>1</sub>| = ''a''<sub>1</sub> (i.e., ''f''<sub>1</sub> is the determinant of the 1 by 1 matrix consisting only of ''a''<sub>1</sub>), and let ''f''<sub>''n''</sub> be the determinant of the leading principal ''n'' × ''n'' submatrix,
:f_n = \det\begin{pmatrix}
a_1 & b_1 \\
c_1 & a_2 & b_2 \\
& c_2 & \ddots & \ddots \\
& & \ddots & \ddots & b_{n-1} \\
& & & c_{n-1} & a_n
\end{pmatrix},
where the a_i are the diagonal entries and the b_i, c_i the super- and subdiagonal entries. The sequence (''f''<sub>''i''</sub>) is called the continuant and satisfies the recurrence relation
:f_n = a_n f_{n-1} - c_{n-1} b_{n-1} f_{n-2}
with initial values ''f''<sub>0</sub> = 1 and ''f''<sub>−1</sub> = 0. The cost of computing the determinant of a tridiagonal matrix using this formula is linear in ''n'', while the cost is cubic for a general matrix.
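This recurrence translates directly into a linear-time routine. The following is a minimal Python sketch; the function name and the convention of passing the diagonal, superdiagonal and subdiagonal as separate sequences are illustrative choices, not part of the article.
<syntaxhighlight lang="python">
def tridiagonal_determinant(a, b, c):
    """Determinant of a tridiagonal matrix via the continuant recurrence
    f_n = a_n f_{n-1} - c_{n-1} b_{n-1} f_{n-2}.

    a: diagonal entries (length n); b: superdiagonal, c: subdiagonal (length n-1).
    """
    f_prev, f_curr = 1, a[0]                 # f_0 = 1, f_1 = a_1
    for k in range(1, len(a)):               # compute f_2, ..., f_n
        f_prev, f_curr = f_curr, a[k] * f_curr - c[k - 1] * b[k - 1] * f_prev
    return f_curr                            # f_n = det(A), computed in O(n)

# Determinant of the 4-by-4 example matrix shown above
print(tridiagonal_determinant([1, 4, 3, 3], [4, 1, 4], [3, 2, 1]))   # -46
</syntaxhighlight>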
Inversion
The inverse of a non-singular tridiagonal matrix ''T''
:T = \begin{pmatrix}
a_1 & b_1 \\
c_1 & a_2 & b_2 \\
& c_2 & \ddots & \ddots \\
& & \ddots & \ddots & b_{n-1} \\
& & & c_{n-1} & a_n
\end{pmatrix}
is given by
:(T^{-1})_{ij} = \begin{cases}
(-1)^{i+j} b_i \cdots b_{j-1} \, \theta_{i-1} \phi_{j+1} / \theta_n & \text{if } i < j \\
\theta_{i-1} \phi_{i+1} / \theta_n & \text{if } i = j \\
(-1)^{i+j} c_j \cdots c_{i-1} \, \theta_{j-1} \phi_{i+1} / \theta_n & \text{if } i > j
\end{cases}
where the ''θ''<sub>''i''</sub> satisfy the recurrence relation
:\theta_i = a_i \theta_{i-1} - b_{i-1} c_{i-1} \theta_{i-2}, \qquad i = 2, 3, \ldots, n
with initial conditions ''θ''<sub>0</sub> = 1, ''θ''<sub>1</sub> = ''a''<sub>1</sub> and the ''ϕ''<sub>''i''</sub> satisfy
:\phi_i = a_i \phi_{i+1} - b_i c_i \phi_{i+2}, \qquad i = n-1, \ldots, 1
with initial conditions ''ϕ''<sub>''n''+1</sub> = 1 and ''ϕ''<sub>''n''</sub> = ''a''<sub>''n''</sub>.
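The θ and φ recurrences are easy to evaluate numerically. Below is a minimal Python/NumPy sketch that assembles the inverse entrywise from the formula above and checks it against a dense inverse; the function name and the random test matrix are illustrative, not part of the article.
<syntaxhighlight lang="python">
import numpy as np

def tridiagonal_inverse(a, b, c):
    """Entrywise inverse of a nonsingular tridiagonal matrix with diagonal a
    (length n), superdiagonal b and subdiagonal c (length n-1)."""
    n = len(a)
    theta = np.empty(n + 1)                  # theta[i] stores θ_i, i = 0..n
    theta[0], theta[1] = 1.0, a[0]
    for i in range(2, n + 1):
        theta[i] = a[i - 1] * theta[i - 1] - b[i - 2] * c[i - 2] * theta[i - 2]

    phi = np.empty(n + 2)                    # phi[i] stores φ_i, i = 1..n+1
    phi[n + 1], phi[n] = 1.0, a[n - 1]
    for i in range(n - 1, 0, -1):
        phi[i] = a[i - 1] * phi[i + 1] - b[i - 1] * c[i - 1] * phi[i + 2]

    Tinv = np.empty((n, n))
    for i in range(1, n + 1):                # 1-based indices, as in the formula
        for j in range(1, n + 1):
            if i < j:
                Tinv[i - 1, j - 1] = ((-1) ** (i + j) * np.prod(b[i - 1:j - 1])
                                      * theta[i - 1] * phi[j + 1] / theta[n])
            elif i == j:
                Tinv[i - 1, j - 1] = theta[i - 1] * phi[i + 1] / theta[n]
            else:
                Tinv[i - 1, j - 1] = ((-1) ** (i + j) * np.prod(c[j - 1:i - 1])
                                      * theta[j - 1] * phi[i + 1] / theta[n])
    return Tinv

# Check on a random diagonally dominant (hence nonsingular) tridiagonal matrix
rng = np.random.default_rng(0)
a, b, c = rng.random(5) + 4.0, rng.random(4), rng.random(4)
T = np.diag(a) + np.diag(b, 1) + np.diag(c, -1)
print(np.allclose(tridiagonal_inverse(a, b, c), np.linalg.inv(T)))   # True
</syntaxhighlight>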
Closed form solutions can be computed for special cases such as symmetric matrices with all diagonal and off-diagonal elements equal or Toeplitz matrices and for the general case as well.
In general, the inverse of a tridiagonal matrix is a
semiseparable matrix and vice versa.
Solution of linear system
A system of equations ''Ax'' = ''b'' for ''b'' ∈ ℝ<sup>''n''</sup> can be solved by an efficient form of Gaussian elimination when ''A'' is tridiagonal, called the tridiagonal matrix algorithm, requiring ''O''(''n'') operations.
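A minimal Python sketch of that elimination (commonly known as the Thomas algorithm); the function name and the argument convention (diagonal, superdiagonal, subdiagonal, right-hand side) are illustrative choices.
<syntaxhighlight lang="python">
def thomas_solve(a, b, c, d):
    """Solve T x = d for tridiagonal T with diagonal a (length n >= 2),
    superdiagonal b and subdiagonal c (length n-1), in O(n) operations.
    Assumes the elimination meets no zero pivots (e.g. T diagonally dominant).
    """
    n = len(a)
    cp = [0.0] * (n - 1)                 # modified superdiagonal
    dp = [0.0] * n                       # modified right-hand side
    cp[0], dp[0] = b[0] / a[0], d[0] / a[0]
    for i in range(1, n):                # forward elimination
        denom = a[i] - c[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = b[i] / denom
        dp[i] = (d[i] - c[i - 1] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):       # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: the symmetric system with diagonal 2 and off-diagonals -1
print(thomas_solve([2, 2, 2], [-1, -1], [-1, -1], [1, 1, 1]))   # [1.5, 2.0, 1.5]
</syntaxhighlight>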
Eigenvalues
When a tridiagonal matrix is also Toeplitz, with constant diagonal ''a'' and constant off-diagonal entries ''b'' (above) and ''c'' (below), there is a simple closed-form solution for its eigenvalues, namely:
:a + 2 \sqrt{bc} \, \cos \left( \frac{k \pi}{n+1} \right), \qquad k = 1, \ldots, n.
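For illustration, the closed form can be compared against a dense eigenvalue computation; the snippet below is a small NumPy sketch with illustrative parameter values.
<syntaxhighlight lang="python">
import numpy as np

n, a, b, c = 6, 2.0, -1.0, -1.0     # diagonal a, superdiagonal b, subdiagonal c
T = a * np.eye(n) + b * np.eye(n, k=1) + c * np.eye(n, k=-1)

k = np.arange(1, n + 1)
closed_form = a + 2.0 * np.sqrt(b * c) * np.cos(k * np.pi / (n + 1))

# The two spectra agree, up to floating-point error
print(np.allclose(np.linalg.eigvalsh(T), np.sort(closed_form)))   # True
</syntaxhighlight>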
A real symmetric tridiagonal matrix has real eigenvalues, and all the eigenvalues are distinct (simple) if all off-diagonal elements are nonzero. Numerous methods exist for the numerical computation of the eigenvalues of a real symmetric tridiagonal matrix to arbitrary finite precision, typically requiring ''O''(''n''<sup>2</sup>) operations for a matrix of size ''n'' × ''n'', although fast algorithms exist which (without parallel computation) require only ''O''(''n'' log ''n'').
As a side note, an ''unreduced'' symmetric tridiagonal matrix is one whose off-diagonal elements are all nonzero; its eigenvalues are distinct, and its eigenvectors are unique up to a scale factor and mutually orthogonal.
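Routines specialized to the symmetric tridiagonal eigenproblem are available in standard numerical libraries. As one example, a sketch assuming SciPy is available, whose scipy.linalg.eigh_tridiagonal works directly on the diagonal and off-diagonal arrays rather than on a dense matrix:
<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import eigh_tridiagonal

d = np.array([2.0, 2.0, 2.0, 2.0])   # main diagonal
e = np.array([-1.0, -1.0, -1.0])     # off-diagonal (same above and below: symmetric)

w, v = eigh_tridiagonal(d, e)        # eigenvalues w and orthonormal eigenvectors v
T = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)
print(np.allclose(T @ v, v * w))     # True: each column of v is an eigenvector
</syntaxhighlight>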
Similarity to symmetric tridiagonal matrix
For ''unsymmetric'' or ''nonsymmetric'' tridiagonal matrices one can compute the eigendecomposition using a similarity transformation.
Given a real tridiagonal, nonsymmetric matrix
:T = \begin{pmatrix}
a_1 & b_1 \\
c_1 & a_2 & b_2 \\
& c_2 & \ddots & \ddots \\
& & \ddots & \ddots & b_{n-1} \\
& & & c_{n-1} & a_n
\end{pmatrix}
where ''b''<sub>''i''</sub> ≠ ''c''<sub>''i''</sub>.
Assume that each product of off-diagonal entries is positive, ''b''<sub>''i''</sub>''c''<sub>''i''</sub> > 0, and define a transformation matrix ''D'' by
:D := \operatorname{diag}(\delta_1, \ldots, \delta_n) \qquad \text{for} \qquad \delta_i :=
\begin{cases}
1 & i = 1 \\
\sqrt{\frac{c_{i-1} \cdots c_1}{b_{i-1} \cdots b_1}} & i = 2, \ldots, n.
\end{cases}
The similarity transformation ''D''<sup>−1</sup>''TD'' yields a ''symmetric'' tridiagonal matrix ''J'' by:
:J := D^{-1} T D = \begin{pmatrix}
a_1 & \operatorname{sgn} b_1 \, \sqrt{b_1 c_1} \\
\operatorname{sgn} b_1 \, \sqrt{b_1 c_1} & a_2 & \operatorname{sgn} b_2 \, \sqrt{b_2 c_2} \\
& \ddots & \ddots & \ddots \\
& & \ddots & \ddots & \operatorname{sgn} b_{n-1} \, \sqrt{b_{n-1} c_{n-1}} \\
& & & \operatorname{sgn} b_{n-1} \, \sqrt{b_{n-1} c_{n-1}} & a_n
\end{pmatrix}.
Note that ''T'' and ''J'' have the same eigenvalues.
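A small numerical check of this transformation (a NumPy sketch with illustrative entries, chosen so that every product ''b''<sub>''i''</sub>''c''<sub>''i''</sub> is positive):
<syntaxhighlight lang="python">
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])   # diagonal
b = np.array([4.0, 1.0, 2.0])        # superdiagonal
c = np.array([1.0, 4.0, 8.0])        # subdiagonal; note b_i * c_i > 0 for all i

T = np.diag(a) + np.diag(b, 1) + np.diag(c, -1)

# delta_1 = 1, delta_i = sqrt((c_1 ... c_{i-1}) / (b_1 ... b_{i-1})) for i >= 2
delta = np.concatenate(([1.0], np.sqrt(np.cumprod(c) / np.cumprod(b))))
D = np.diag(delta)

J = np.linalg.inv(D) @ T @ D         # symmetric tridiagonal, similar to T
print(np.allclose(J, J.T))                                           # True
print(np.allclose(np.sort(np.linalg.eigvals(T).real),
                  np.sort(np.linalg.eigvals(J).real)))               # True
</syntaxhighlight>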
Computer programming
A transformation that reduces a general matrix to Hessenberg form will reduce a Hermitian matrix to tridiagonal form. So, many eigenvalue algorithms, when applied to a Hermitian matrix, reduce the input Hermitian matrix to (symmetric real) tridiagonal form as a first step.
A tridiagonal matrix can also be stored more efficiently than a general matrix by using a special storage scheme. For instance, the LAPACK Fortran package stores an unsymmetric tridiagonal matrix of order ''n'' in three one-dimensional arrays, one of length ''n'' containing the diagonal elements, and two of length ''n'' − 1 containing the subdiagonal and superdiagonal elements.
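A Python sketch of this kind of compact three-array storage (the array names are illustrative, not the LAPACK names), together with a matrix-vector product that never forms the full matrix:
<syntaxhighlight lang="python">
import numpy as np

def tridiagonal_matvec(d, du, dl, x):
    """y = T x using only the compact storage of T:
    diagonal d (length n), superdiagonal du and subdiagonal dl (length n-1)."""
    y = d * x                    # diagonal contribution
    y[:-1] += du * x[1:]         # superdiagonal contribution
    y[1:] += dl * x[:-1]         # subdiagonal contribution
    return y

# Compact storage of the 4-by-4 example matrix from the top of the article
d  = np.array([1.0, 4.0, 3.0, 3.0])
du = np.array([4.0, 1.0, 4.0])
dl = np.array([3.0, 2.0, 1.0])
x  = np.array([1.0, 2.0, 3.0, 4.0])

T = np.diag(d) + np.diag(du, 1) + np.diag(dl, -1)
print(np.allclose(tridiagonal_matvec(d, du, dl, x), T @ x))   # True
</syntaxhighlight>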
Applications
The discretization in space of the one-dimensional diffusion or heat equation
:\frac{\partial u(t,x)}{\partial t} = \alpha \frac{\partial^2 u(t,x)}{\partial x^2}
using second order central finite differences results in
:\begin{pmatrix}
\frac{\partial u_1(t)}{\partial t} \\
\frac{\partial u_2(t)}{\partial t} \\
\vdots \\
\frac{\partial u_N(t)}{\partial t}
\end{pmatrix}
= \frac{\alpha}{\Delta x^2}
\begin{pmatrix}
-2 & 1 \\
1 & -2 & 1 \\
& \ddots & \ddots & \ddots \\
& & 1 & -2 & 1 \\
& & & 1 & -2
\end{pmatrix}
\begin{pmatrix}
u_1(t) \\
u_2(t) \\
\vdots \\
u_N(t)
\end{pmatrix}
with discretization constant Δ''x''. The matrix is tridiagonal with ''a''<sub>''i''</sub> = −2 and ''b''<sub>''i''</sub> = ''c''<sub>''i''</sub> = 1. Note: no boundary conditions are used here.
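A brief sketch that assembles this matrix and advances the semi-discretized system by one explicit Euler step (all parameter values and names are illustrative):
<syntaxhighlight lang="python">
import numpy as np

N, alpha, dx, dt = 8, 1.0, 0.1, 1e-3

# Tridiagonal second-difference matrix: -2 on the diagonal, 1 on both off-diagonals
L = (alpha / dx**2) * (np.diag(-2.0 * np.ones(N))
                       + np.diag(np.ones(N - 1), 1)
                       + np.diag(np.ones(N - 1), -1))

u = np.sin(np.pi * np.linspace(0.0, 1.0, N))   # an arbitrary initial profile
u_next = u + dt * (L @ u)                      # one explicit Euler step of du/dt = L u
print(u_next)
</syntaxhighlight>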
See also
* Pentadiagonal matrix
* Jacobi matrix (operator)
Notes
External links
* Tridiagonal and Bidiagonal Matrices in the LAPACK manual.
* High performance algorithms for reduction to condensed (Hessenberg, tridiagonal, bidiagonal) form
* Tridiagonal linear system solver in C++
{{Matrix classes}}
Sparse matrices