In mathematics, a square matrix is a matrix with the same number of rows and columns. An ''n''-by-''n'' matrix is known as a square matrix of order ''n''. Any two square matrices of the same order can be added and multiplied.
Square matrices are often used to represent simple linear transformations, such as shearing or rotation. For example, if ''R'' is a square matrix representing a rotation (a rotation matrix) and ''v'' is a column vector describing the position of a point in space, the product ''Rv'' yields another column vector describing the position of that point after that rotation. If ''v'' is a row vector, the same transformation can be obtained using ''vR''<sup>T</sup>, where ''R''<sup>T</sup> is the transpose of ''R''.
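The rotation example above can be illustrated numerically. This is a minimal sketch using NumPy; the angle of 90° and the test point are illustrative choices, not from the original text.

```python
import numpy as np

# A 90-degree (counterclockwise) rotation matrix R.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])   # a point on the x-axis, as a column vector
rotated = R @ v            # the product Rv: the point rotated onto the y-axis

# The same transformation on a row vector uses the transpose: v R^T.
rotated_row = v @ R.T
```

Both products give the same rotated coordinates, matching the column-vector/row-vector duality described above.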
Main diagonal
The entries ''a''<sub>''ii''</sub> (''i'' = 1, …, ''n'') form the main diagonal of a square matrix. They lie on the imaginary line which runs from the top left corner to the bottom right corner of the matrix. For instance, the main diagonal of a 4×4 matrix contains the entries ''a''<sub>11</sub>, ''a''<sub>22</sub>, ''a''<sub>33</sub>, ''a''<sub>44</sub>.
The diagonal of a square matrix from the top right to the bottom left corner is called ''antidiagonal'' or ''counterdiagonal''.
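Both diagonals are easy to extract in practice. A small NumPy sketch (the example matrix is chosen so each entry encodes its own 1-based row and column):

```python
import numpy as np

# Entry "ij" sits at row i, column j (1-based), making the diagonals readable.
A = np.array([[11, 12, 13, 14],
              [21, 22, 23, 24],
              [31, 32, 33, 34],
              [41, 42, 43, 44]])

main_diagonal = np.diag(A)            # a_ii: top-left to bottom-right
antidiagonal = np.diag(np.fliplr(A))  # top-right to bottom-left
```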
Special kinds
Diagonal or triangular matrix
If all entries outside the main diagonal are zero, ''A'' is called a diagonal matrix. If only all entries above (or below) the main diagonal are zero, ''A'' is called a lower (or upper) triangular matrix.
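These special forms can be produced directly in NumPy; `triu`/`tril` zero out the entries below or above the main diagonal, matching the definitions above.

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

D = np.diag(np.diag(A))  # diagonal matrix: zeros everywhere off the diagonal
U = np.triu(A)           # upper triangular: zeros below the main diagonal
L = np.tril(A)           # lower triangular: zeros above the main diagonal
```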
Identity matrix
The identity matrix ''I''<sub>''n''</sub> of size ''n'' is the ''n''×''n'' matrix in which all the elements on the main diagonal are equal to 1 and all other elements are equal to 0, e.g.
:<math>I_1 = \begin{bmatrix} 1 \end{bmatrix},\ I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix},\ \ldots,\ I_n = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}.</math>
It is a square matrix of order ''n'' and also a special kind of diagonal matrix. It is called an identity matrix because multiplication with it leaves a matrix unchanged:
:<math>AI_n = I_m A = A</math> for any ''m''-by-''n'' matrix ''A''.
Invertible matrix and its inverse
A square matrix ''A'' is called ''invertible'' or ''non-singular'' if there exists a matrix ''B'' such that
:<math>AB = BA = I_n.</math>
If ''B'' exists, it is unique and is called the ''inverse matrix'' of ''A'', denoted ''A''<sup>−1</sup>.
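The defining property of the inverse can be checked numerically. A small NumPy sketch with an illustrative 2×2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # invertible: its determinant is 1, nonzero

B = np.linalg.inv(A)          # the unique inverse A^{-1}

# B satisfies the defining property AB = BA = I_n.
left = A @ B
right = B @ A
```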
Symmetric or skew-symmetric matrix
A square matrix ''A'' that is equal to its transpose, i.e., ''A''<sup>T</sup> = ''A'', is a symmetric matrix. If instead ''A''<sup>T</sup> = −''A'', then ''A'' is called a skew-symmetric matrix.
For a complex square matrix ''A'', often the appropriate analogue of the transpose is the conjugate transpose ''A''<sup>*</sup>, defined as the transpose of the complex conjugate of ''A''. A complex square matrix ''A'' satisfying ''A''<sup>*</sup> = ''A'' is called a Hermitian matrix. If instead ''A''<sup>*</sup> = −''A'', then ''A'' is called a skew-Hermitian matrix.
By the spectral theorem, real symmetric (or complex Hermitian) matrices have an orthogonal (or unitary) eigenbasis; i.e., every vector is expressible as a linear combination of eigenvectors. In both cases, all eigenvalues are real.
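As a quick numerical check of these definitions, here is a NumPy sketch with an illustrative Hermitian matrix; note its eigenvalues come out real, as the spectral theorem guarantees.

```python
import numpy as np

# A Hermitian matrix: equal to its conjugate transpose, A* = conj(A)^T.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

A_star = A.conj().T                  # the conjugate transpose A*

# eigvalsh is NumPy's solver for Hermitian matrices; it returns
# real eigenvalues in ascending order.
eigenvalues = np.linalg.eigvalsh(A)
```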
Definite matrix
A symmetric ''n''×''n''-matrix ''A'' is called ''positive-definite'' (respectively negative-definite; indefinite), if for all nonzero vectors ''x'' the associated quadratic form given by
:<math>Q(x) = x^\mathsf{T} A x</math>
takes only positive values (respectively only negative values; both some negative and some positive values). If the quadratic form takes only non-negative (respectively only non-positive) values, the symmetric matrix is called positive-semidefinite (respectively negative-semidefinite); hence the matrix is indefinite precisely when it is neither positive-semidefinite nor negative-semidefinite.
A symmetric matrix is positive-definite if and only if all its eigenvalues are positive. The table at the right shows two possibilities for 2×2 matrices.
Allowing as input two different vectors instead yields the bilinear form associated to ''A'':
:<math>B_A(x, y) = x^\mathsf{T} A y.</math>
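The eigenvalue criterion for positive-definiteness can be sketched in NumPy; the 2×2 matrix below is an illustrative choice whose eigenvalues are 1 and 3.

```python
import numpy as np

# A symmetric matrix and its quadratic and bilinear forms.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def Q(x):
    """Quadratic form Q(x) = x^T A x."""
    return x @ A @ x

def B(x, y):
    """Bilinear form B_A(x, y) = x^T A y."""
    return x @ A @ y

# Positive-definite iff all eigenvalues are positive.
eigenvalues = np.linalg.eigvalsh(A)
is_positive_definite = bool(np.all(eigenvalues > 0))
```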
Orthogonal matrix
An ''orthogonal matrix'' is a square matrix with real entries whose columns and rows are orthogonal unit vectors (i.e., orthonormal vectors). Equivalently, a matrix ''A'' is orthogonal if its transpose is equal to its inverse:
:<math>A^\mathsf{T} = A^{-1},</math>
which entails
:<math>A^\mathsf{T} A = A A^\mathsf{T} = I,</math>
where ''I'' is the identity matrix.
An orthogonal matrix ''A'' is necessarily invertible (with inverse ''A''<sup>−1</sup> = ''A''<sup>T</sup>), unitary (''A''<sup>−1</sup> = ''A''<sup>*</sup>), and normal (''A''<sup>*</sup>''A'' = ''AA''<sup>*</sup>). The determinant of any orthogonal matrix is either +1 or −1. The special orthogonal group consists of the orthogonal matrices with determinant +1.
The
complex analogue of an orthogonal matrix is a
unitary matrix.
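A rotation matrix is the standard concrete example of an orthogonal matrix, tying this section back to the earlier rotation discussion. A NumPy sketch (the angle is an illustrative choice):

```python
import numpy as np

theta = np.pi / 3
# Rotation matrices are orthogonal: their columns are orthonormal vectors.
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The transpose equals the inverse, and A^T A = I.
transpose_is_inverse = np.allclose(A.T, np.linalg.inv(A))
gram = A.T @ A

det_A = np.linalg.det(A)  # +1 here: a special orthogonal (rotation) matrix
```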
Normal matrix
A real or complex square matrix ''A'' is called ''normal'' if ''A''<sup>*</sup>''A'' = ''AA''<sup>*</sup>. If a real square matrix is symmetric, skew-symmetric, or orthogonal, then it is normal. If a complex square matrix is Hermitian, skew-Hermitian, or unitary, then it is normal. Normal matrices are of interest mainly because they include the types of matrices just listed and form the broadest class of matrices for which the spectral theorem holds.
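The normality condition is a one-line check in NumPy (for a real matrix the conjugate transpose reduces to the transpose). The skew-symmetric example below is an illustrative choice.

```python
import numpy as np

# A real skew-symmetric matrix (A^T = -A); such matrices are normal.
A = np.array([[0.0, -2.0],
              [2.0,  0.0]])

is_skew_symmetric = np.allclose(A.T, -A)
is_normal = np.allclose(A.T @ A, A @ A.T)   # A*A = AA* (real case)
```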
Operations
Trace
The trace, tr(''A''), of a square matrix ''A'' is the sum of its diagonal entries. While matrix multiplication is not commutative, the trace of the product of two matrices is independent of the order of the factors:
:<math>\operatorname{tr}(AB) = \operatorname{tr}(BA).</math>
This is immediate from the definition of matrix multiplication:
:<math>\operatorname{tr}(AB) = \sum_{i=1}^m \sum_{j=1}^n a_{ij} b_{ji} = \operatorname{tr}(BA).</math>
Also, the trace of a matrix is equal to that of its transpose, i.e.,
:<math>\operatorname{tr}(A) = \operatorname{tr}(A^\mathsf{T}).</math>
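These trace identities can be verified on a small example. A NumPy sketch with two illustrative 2×2 matrices that do not commute:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# AB != BA, yet their traces agree.
tr_AB = np.trace(A @ B)
tr_BA = np.trace(B @ A)

tr_A = np.trace(A)           # 1 + 4 = 5
tr_AT = np.trace(A.T)        # tr(A) = tr(A^T)
```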
Determinant
The ''determinant'' <math>\det(A)</math> or <math>|A|</math> of a square matrix ''A'' is a number encoding certain properties of the matrix. A matrix is invertible if and only if its determinant is nonzero. Its absolute value equals the area (in <math>\mathbb{R}^2</math>) or volume (in <math>\mathbb{R}^3</math>) of the image of the unit square (or cube), while its sign corresponds to the orientation of the corresponding linear map: the determinant is positive if and only if the orientation is preserved.
The determinant of 2×2 matrices is given by
:<math>\det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc.</math>
The determinant of 3×3 matrices involves 6 terms (
rule of Sarrus). The more lengthy
Leibniz formula generalizes these two formulae to all dimensions.
The determinant of a product of square matrices equals the product of their determinants:
:<math>\det(AB) = \det(A) \det(B).</math>
Adding a multiple of any row to another row, or a multiple of any column to another column, does not change the determinant. Interchanging two rows or two columns affects the determinant by multiplying it by −1. Using these operations, any matrix can be transformed to a lower (or upper) triangular matrix, and for such matrices the determinant equals the product of the entries on the main diagonal; this provides a method to calculate the determinant of any matrix. Finally, the
Laplace expansion expresses the determinant in terms of
minors, i.e., determinants of smaller matrices. This expansion can be used for a recursive definition of determinants (taking as starting case the determinant of a 1×1 matrix, which is its unique entry, or even the determinant of a 0×0 matrix, which is 1), that can be seen to be equivalent to the Leibniz formula. Determinants can be used to solve linear systems using Cramer's rule, where the division of the determinants of two related square matrices equates to the value of each of the system's variables.
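Cramer's rule can be sketched directly from determinants. A NumPy illustration on a hypothetical 2×2 system (each unknown is a quotient of two determinants, as described above):

```python
import numpy as np

# Solve A x = b by Cramer's rule for an illustrative 2x2 system.
A = np.array([[3.0, 1.0],
              [2.0, 4.0]])
b = np.array([5.0, 6.0])

det_A = np.linalg.det(A)   # 3*4 - 1*2 = 10, nonzero, so A is invertible

x = np.empty(2)
for i in range(2):
    A_i = A.copy()
    A_i[:, i] = b          # replace column i of A by b
    x[i] = np.linalg.det(A_i) / det_A
```

In practice one would call `np.linalg.solve(A, b)`; Cramer's rule is mainly of theoretical interest, but the two answers agree.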
Eigenvalues and eigenvectors
A number λ and a non-zero vector ''v'' satisfying
:<math>Av = \lambda v</math>
are called an ''eigenvalue'' and an ''eigenvector'' of ''A'', respectively. The number λ is an eigenvalue of an ''n''×''n''-matrix ''A'' if and only if ''A'' − λ''I''<sub>''n''</sub> is not invertible, which is equivalent to
:<math>\det(A - \lambda I_n) = 0.</math>
The polynomial ''p''<sub>''A''</sub> in an indeterminate ''X'' given by evaluation of the determinant det(''XI''<sub>''n''</sub> − ''A'') is called the characteristic polynomial of ''A''. It is a monic polynomial of degree ''n''. Therefore the polynomial equation ''p''<sub>''A''</sub>(λ) = 0 has at most ''n'' different solutions, i.e., eigenvalues of the matrix. They may be complex even if the entries of ''A'' are real. According to the Cayley–Hamilton theorem, ''p''<sub>''A''</sub>(''A'') = '''0''', that is, the result of substituting the matrix itself into its own characteristic polynomial yields the zero matrix.
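The Cayley–Hamilton theorem can be checked concretely for a 2×2 matrix, where the characteristic polynomial is ''p''(''X'') = ''X''² − tr(''A'')·''X'' + det(''A''). A NumPy sketch with an illustrative triangular matrix (whose eigenvalues are simply its diagonal entries):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Eigenvalues of a triangular matrix are its diagonal entries: 2 and 3.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Cayley-Hamilton for 2x2: p_A(A) = A^2 - tr(A) A + det(A) I = 0.
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
```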
See also
*
Cartan matrix
{{linear algebra