In linear algebra, two matrices <math>A</math> and <math>B</math> are said to commute if <math>AB = BA</math>, or equivalently if their commutator <math>AB - BA</math> is zero. A set of matrices is said to commute if they commute pairwise, meaning that every pair of matrices in the set commutes.
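The definition can be checked numerically; a minimal sketch with NumPy (matrices chosen here purely for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = A @ A + 2 * A            # any polynomial in A commutes with A

commutator = A @ B - B @ A   # AB - BA
commutes = np.allclose(commutator, 0)
```

Here `commutes` is `True` because `B` was built as a polynomial in `A`.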
Characterizations and properties
* Commuting matrices preserve each other's eigenspaces. As a consequence, commuting matrices over an algebraically closed field are simultaneously triangularizable; that is, there are bases over which they are both upper triangular. In other words, if matrices <math>A_1, \ldots, A_k</math> commute, there exists a similarity matrix <math>P</math> such that <math>P^{-1} A_i P</math> is upper triangular for all <math>i</math>. The converse is not necessarily true, as the following counterexample shows:
*: <math>\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 0 & 2 \end{pmatrix} \neq \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}.</math>
: However, if the square of the commutator of two matrices is zero, that is, <math>(AB - BA)^2 = 0</math>, then the converse is true.
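The failure of the converse can be checked directly; a small NumPy sketch with matrices chosen for illustration:

```python
import numpy as np

# Both matrices are upper triangular (hence trivially simultaneously
# triangularizable), yet they do not commute.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[1.0, 0.0],
              [0.0, 2.0]])

commutes = np.allclose(A @ B, B @ A)
```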
* Two diagonalizable matrices <math>A</math> and <math>B</math> commute (<math>AB = BA</math>) if they are simultaneously diagonalizable (that is, there exists an invertible matrix <math>P</math> such that both <math>P^{-1}AP</math> and <math>P^{-1}BP</math> are diagonal). The converse is also valid, provided that one of the matrices has no multiple eigenvalues.
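A numerical sketch of the forward direction, with matrices constructed to be simultaneously diagonalizable (all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3))           # generic, hence invertible
Pinv = np.linalg.inv(P)

A = P @ np.diag([1.0, 2.0, 3.0]) @ Pinv   # diagonal in the basis P
B = P @ np.diag([4.0, 5.0, 6.0]) @ Pinv   # diagonal in the same basis

commutes = np.allclose(A @ B, B @ A)
```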
* If <math>A</math> and <math>B</math> commute, they have a common eigenvector. If <math>A</math> has distinct eigenvalues, and <math>A</math> and <math>B</math> commute, then <math>A</math>'s eigenvectors are <math>B</math>'s eigenvectors.
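A sketch of the distinct-eigenvalue case: `B` is constructed to commute with `A`, and each eigenvector of `A` is checked to be an eigenvector of `B` by recovering the candidate eigenvalue as a Rayleigh quotient (construction and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
P = rng.standard_normal((3, 3))
Pinv = np.linalg.inv(P)

A = P @ np.diag([1.0, 2.0, 3.0]) @ Pinv   # distinct eigenvalues
B = P @ np.diag([7.0, 7.0, 9.0]) @ Pinv   # commutes with A

_, vecs = np.linalg.eig(A)
# For each eigenvector v of A, B v should equal mu * v, where the scalar mu
# is recovered as the Rayleigh quotient of B at v.
shared = all(
    np.allclose(B @ v, ((v.conj() @ B @ v) / (v.conj() @ v)) * v)
    for v in vecs.T
)
```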
* If one of the matrices has the property that its minimal polynomial coincides with its characteristic polynomial (that is, the minimal polynomial has the maximal possible degree), which happens in particular whenever the characteristic polynomial has only simple roots, then the other matrix can be written as a polynomial in the first.
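A sketch of this property: with `A` having distinct eigenvalues and `B` commuting with it, the polynomial interpolating the paired eigenvalues reproduces `B` when evaluated at `A` (matrices constructed so the pairing is known; names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.standard_normal((3, 3))
Pinv = np.linalg.inv(P)

lams = np.array([1.0, 2.0, 3.0])        # eigenvalues of A (all distinct)
mus = np.array([4.0, 5.0, 6.0])         # matching eigenvalues of B
A = P @ np.diag(lams) @ Pinv
B = P @ np.diag(mus) @ Pinv             # commutes with A

coeffs = np.polyfit(lams, mus, deg=2)   # interpolating polynomial p

def matpolyval(coeffs, M):
    # Horner evaluation of a polynomial (highest coefficient first) at M.
    out = np.zeros_like(M)
    for c in coeffs:
        out = out @ M + c * np.eye(len(M))
    return out

matches = np.allclose(matpolyval(coeffs, A), B)   # B == p(A)
```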
* As a direct consequence of simultaneous triangularizability, the eigenvalues of two commuting complex matrices ''A'', ''B'' with their algebraic multiplicities (the multisets of roots of their characteristic polynomials) can be matched up as <math>\alpha_i \leftrightarrow \beta_i</math> in such a way that the multiset of eigenvalues of any polynomial <math>P(A, B)</math> in the two matrices is the multiset of the values <math>P(\alpha_i, \beta_i)</math>. This theorem is due to Frobenius.
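A numerical sketch of this theorem for the particular polynomial P(A, B) = AB, with commuting matrices constructed so that the eigenvalue pairing is known:

```python
import numpy as np

rng = np.random.default_rng(3)
P = rng.standard_normal((3, 3))
Pinv = np.linalg.inv(P)

alphas = np.array([1.0, 2.0, 3.0])
betas = np.array([10.0, 20.0, 30.0])
A = P @ np.diag(alphas) @ Pinv
B = P @ np.diag(betas) @ Pinv           # commutes with A

# The eigenvalues of A @ B are exactly the products alpha_i * beta_i.
got = np.sort(np.linalg.eigvals(A @ B).real)
expected = np.sort(alphas * betas)
matched = np.allclose(got, expected)
```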
* Two Hermitian matrices commute if their eigenspaces coincide. In particular, two Hermitian matrices without multiple eigenvalues commute if they share the same set of eigenvectors. This follows by considering the eigenvalue decompositions of both matrices. Let <math>A</math> and <math>B</math> be two Hermitian matrices. <math>A</math> and <math>B</math> have common eigenspaces when they can be written as <math>A = U \Lambda_1 U^\dagger</math> and <math>B = U \Lambda_2 U^\dagger</math>. It then follows that
*: <math>AB = U \Lambda_1 U^\dagger U \Lambda_2 U^\dagger = U \Lambda_1 \Lambda_2 U^\dagger = U \Lambda_2 \Lambda_1 U^\dagger = U \Lambda_2 U^\dagger U \Lambda_1 U^\dagger = BA.</math>
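The argument can be reproduced numerically by building two Hermitian matrices with a shared (random) unitary eigenbasis:

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(M)                  # a random unitary matrix

A = U @ np.diag([1.0, 2.0, 3.0]) @ U.conj().T   # Hermitian
B = U @ np.diag([4.0, 5.0, 6.0]) @ U.conj().T   # Hermitian, same eigenbasis

hermitian = np.allclose(A, A.conj().T) and np.allclose(B, B.conj().T)
commutes = np.allclose(A @ B, B @ A)
```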
* The property of two matrices commuting is not transitive: a matrix <math>A</math> may commute with both <math>B</math> and <math>C</math>, and still <math>B</math> and <math>C</math> do not commute with each other. As an example, the identity matrix commutes with all matrices, which between them do not all commute. If the set of matrices considered is restricted to Hermitian matrices without multiple eigenvalues, then commutativity is transitive, as a consequence of the characterization in terms of eigenvectors.
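The identity-matrix example of non-transitivity, checked numerically:

```python
import numpy as np

I = np.eye(2)
B = np.array([[0.0, 1.0],
              [0.0, 0.0]])
C = np.array([[0.0, 0.0],
              [1.0, 0.0]])

# The identity commutes with both B and C...
i_commutes = np.allclose(I @ B, B @ I) and np.allclose(I @ C, C @ I)
# ...but B and C do not commute with each other.
bc_commutes = np.allclose(B @ C, C @ B)
```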
* Lie's theorem, which shows that any representation of a solvable Lie algebra is simultaneously upper triangularizable, may be viewed as a generalization.
* An ''n'' × ''n'' matrix <math>A</math> commutes with every other ''n'' × ''n'' matrix if and only if it is a scalar matrix, that is, a matrix of the form <math>\lambda I</math>, where <math>I</math> is the ''n'' × ''n'' identity matrix and <math>\lambda</math> is a scalar. In other words, the center of the group of invertible ''n'' × ''n'' matrices under multiplication is the subgroup of nonzero scalar matrices.
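A quick check that a scalar matrix commutes with arbitrary matrices (random samples for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
S = 2.5 * np.eye(3)                     # scalar matrix: lambda * I

central = all(
    np.allclose(S @ M, M @ S)
    for M in (rng.standard_normal((3, 3)) for _ in range(5))
)
```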
Examples
* The identity matrix commutes with all matrices.
* Jordan blocks commute with upper triangular matrices that have the same value along bands (that is, with upper triangular Toeplitz matrices).
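A sketch of this example: a Jordan block against an upper triangular matrix that is constant along each band (an upper triangular Toeplitz matrix):

```python
import numpy as np

# 3x3 Jordan block with eigenvalue 2
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])
# Upper triangular, constant along each band (Toeplitz)
T = np.array([[5.0, 7.0, 9.0],
              [0.0, 5.0, 7.0],
              [0.0, 0.0, 5.0]])

commutes = np.allclose(J @ T, T @ J)
```

Both matrices are polynomials in the same nilpotent shift matrix, which is why they commute.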
* If the product of two symmetric matrices is symmetric, then they must commute. In particular, every diagonal matrix commutes with all other diagonal matrices.
* Circulant matrices commute. They form a commutative ring, since the sum of two circulant matrices is circulant.
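A sketch with a minimal circulant constructor (`scipy.linalg.circulant` provides the same construction):

```python
import numpy as np

def circulant(c):
    # Each column is the first column cyclically shifted down by one more.
    c = np.asarray(c, dtype=float)
    return np.column_stack([np.roll(c, k) for k in range(len(c))])

A = circulant([1.0, 2.0, 3.0])
B = circulant([4.0, 5.0, 6.0])

commutes = np.allclose(A @ B, B @ A)
sum_is_circulant = np.allclose(A + B, circulant([5.0, 7.0, 9.0]))
```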
History
The notion of commuting matrices was introduced by
Cayley in his memoir on the theory of matrices, which also provided the first axiomatization of matrices. The first significant result proved about them was the above theorem of Frobenius in 1878.
References
{{reflist
Matrix theory