In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, $A$ is symmetric if and only if
$$A = A^\textsf{T}.$$
Because equal matrices have equal dimensions, only square matrices can be symmetric.

The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if $a_{ij}$ denotes the entry in the $i$th row and $j$th column, then
$$a_{ij} = a_{ji}$$
for all indices $i$ and $j$.
Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.
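The entry-wise condition $a_{ij} = a_{ji}$ can be checked directly. A minimal sketch in Python (the function name is illustrative, not from any particular library):

```python
def is_symmetric(m):
    """Return True if the square matrix m (a list of rows) equals its transpose."""
    n = len(m)
    if any(len(row) != n for row in m):
        return False  # only square matrices can be symmetric
    # compare each entry above the diagonal with its mirror below
    return all(m[i][j] == m[j][i] for i in range(n) for j in range(i + 1, n))

# A diagonal matrix is symmetric; breaking one off-diagonal pair breaks symmetry.
print(is_symmetric([[2, 0], [0, 3]]))  # True
print(is_symmetric([[1, 2], [5, 1]]))  # False
```

Only the entries strictly above the diagonal need to be compared, which mirrors the count of $\tfrac{1}{2}n(n-1)$ independent off-diagonal constraints.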
In linear algebra, a real symmetric matrix represents a self-adjoint operator expressed in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.
Example

The following $3 \times 3$ matrix is symmetric:
$$A = \begin{bmatrix} 1 & 7 & 3 \\ 7 & 4 & 5 \\ 3 & 5 & 1 \end{bmatrix}$$

Properties

Basic properties

* The sum and difference of two symmetric matrices is symmetric.
* This is not always true for the product: given symmetric matrices $A$ and $B$, the product $AB$ is symmetric if and only if $A$ and $B$ commute, i.e., if $AB = BA$.
* For any integer $n$, $A^n$ is symmetric if $A$ is symmetric.
* If $A^{-1}$ exists, it is symmetric if and only if $A$ is symmetric.
* The rank of a symmetric matrix $A$ is equal to the number of non-zero eigenvalues of $A$.

Decomposition into symmetric and skew-symmetric

Any square matrix can uniquely be written as the sum of a symmetric and a skew-symmetric matrix. This decomposition is known as the Toeplitz decomposition. Let $\mathrm{Mat}_n$ denote the space of $n \times n$ matrices. If $\mathrm{Sym}_n$ denotes the space of $n \times n$ symmetric matrices and $\mathrm{Skew}_n$ the space of $n \times n$ skew-symmetric matrices, then $\mathrm{Mat}_n = \mathrm{Sym}_n + \mathrm{Skew}_n$ and $\mathrm{Sym}_n \cap \mathrm{Skew}_n = \{0\}$, i.e.
$$\mathrm{Mat}_n = \mathrm{Sym}_n \oplus \mathrm{Skew}_n,$$
where $\oplus$ denotes the direct sum. Let $X \in \mathrm{Mat}_n$; then
$$X = \frac{1}{2}\left(X + X^\textsf{T}\right) + \frac{1}{2}\left(X - X^\textsf{T}\right).$$
Notice that $\frac{1}{2}\left(X + X^\textsf{T}\right) \in \mathrm{Sym}_n$ and $\frac{1}{2}\left(X - X^\textsf{T}\right) \in \mathrm{Skew}_n$. This is true for every square matrix $X$ with entries from any field whose characteristic is different from 2.

A symmetric $n \times n$ matrix is determined by $\tfrac{1}{2}n(n+1)$ scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by $\tfrac{1}{2}n(n-1)$ scalars (the number of entries above the main diagonal).

Matrix congruent to a symmetric matrix

Any matrix congruent to a symmetric matrix is again symmetric: if $X$ is a symmetric matrix, then so is $A X A^\textsf{T}$ for any matrix $A$.

Symmetry implies normality

A (real-valued) symmetric matrix is necessarily a normal matrix.

Real symmetric matrices

Denote by $\langle \cdot, \cdot \rangle$ the standard inner product on $\mathbb{R}^n$. The real $n \times n$ matrix $A$ is symmetric if and only if
$$\langle Ax, y \rangle = \langle x, Ay \rangle \quad \forall x, y \in \mathbb{R}^n.$$
Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator $A$ and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry, for each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. Another area where this formulation is used is in Hilbert spaces.
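The identity $\langle Ax, y \rangle = \langle x, Ay \rangle$ can be verified numerically for the example matrix above. A small self-contained sketch (the helper names are illustrative):

```python
def dot(u, v):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def matvec(m, v):
    """Matrix-vector product for a matrix given as a list of rows."""
    return [dot(row, v) for row in m]

A = [[1, 7, 3],
     [7, 4, 5],
     [3, 5, 1]]   # symmetric
x = [1, 2, 3]
y = [4, 5, 6]

# <Ax, y> equals <x, Ay> because A = A^T
print(dot(matvec(A, x), y) == dot(x, matvec(A, y)))  # True
```

With integer entries both sides are computed exactly; here each side evaluates to the same integer.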
The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: for every real symmetric matrix $A$ there exists a real orthogonal matrix $Q$ such that $D = Q^\textsf{T} A Q$ is a diagonal matrix. Every real symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.
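In the $2 \times 2$ case the orthogonal $Q$ can be written down in closed form as a rotation; a sketch under that assumption (a single Jacobi rotation, which suffices only for $2 \times 2$ matrices):

```python
import math

def diagonalize_2x2(a, b, c):
    """Diagonalize the symmetric matrix [[a, b], [b, c]] as D = Q^T A Q,
    where Q is the rotation by theta. Returns (theta, diagonal, off-diagonal)."""
    theta = 0.5 * math.atan2(2 * b, a - c)  # angle chosen to zero the off-diagonal
    co, si = math.cos(theta), math.sin(theta)
    # entries of Q^T A Q for Q = [[co, -si], [si, co]]
    d1 = co * co * a + 2 * si * co * b + si * si * c
    d2 = si * si * a - 2 * si * co * b + co * co * c
    off = (co * co - si * si) * b - si * co * (a - c)
    return theta, (d1, d2), off

theta, (d1, d2), off = diagonalize_2x2(2.0, 1.0, 2.0)
print(abs(off) < 1e-12)                        # True: the rotated matrix is diagonal
print([round(d, 10) for d in sorted((d1, d2))])  # [1.0, 3.0], the eigenvalues
```

The angle comes from requiring $b\cos 2\theta = \tfrac{a-c}{2}\sin 2\theta$; for larger matrices, repeated rotations of this form are the basis of the Jacobi eigenvalue algorithm.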
If $A$ and $B$ are $n \times n$ real symmetric matrices that commute, then they can be simultaneously diagonalized: there exists a basis of $\mathbb{R}^n$ such that every element of the basis is an eigenvector for both $A$ and $B$.
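For instance, $A = \begin{bmatrix}2&1\\1&2\end{bmatrix}$ and $B = \begin{bmatrix}3&1\\1&3\end{bmatrix}$ commute, and the rotation by 45 degrees diagonalizes both. A small numerical check (helpers written inline to stay self-contained):

```python
import math

def matmul(p, q):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(p[i][k] * q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [1.0, 2.0]]
B = [[3.0, 1.0], [1.0, 3.0]]

# A and B commute: AB == BA
print(matmul(A, B) == matmul(B, A))  # True

# The same orthogonal Q (rotation by 45 degrees) diagonalizes both.
s = 1.0 / math.sqrt(2.0)
Q = [[s, -s], [s, s]]
Qt = [[s, s], [-s, s]]
for M in (A, B):
    D = matmul(Qt, matmul(M, Q))
    print(abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12)  # True: D is diagonal
```

Both matrices share the eigenvectors $(1, 1)$ and $(1, -1)$, which is exactly what the shared $Q$ encodes.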
Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the diagonal matrix $D$ (above), and therefore $D$ is uniquely determined by $A$ up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.
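In the $2 \times 2$ case the realness of the eigenvalues is visible directly in the characteristic polynomial: for $\begin{bmatrix}a&b\\b&c\end{bmatrix}$ the discriminant is $(a-c)^2 + 4b^2 \ge 0$, so both roots are real. A quick check:

```python
import math

def eigenvalues_2x2_symmetric(a, b, c):
    """Eigenvalues of the real symmetric matrix [[a, b], [b, c]] via the
    quadratic formula. The discriminant (a - c)**2 + 4*b*b is a sum of
    squares, hence never negative, so both roots are always real."""
    disc = (a - c) ** 2 + 4 * b * b
    root = math.sqrt(disc)
    return ((a + c - root) / 2, (a + c + root) / 2)

print(eigenvalues_2x2_symmetric(2, 1, 2))   # (1.0, 3.0)
print(eigenvalues_2x2_symmetric(0, 5, 0))   # (-5.0, 5.0)
```

A non-symmetric real matrix has no such guarantee: for $\begin{bmatrix}0&-1\\1&0\end{bmatrix}$ the same formula would require $\sqrt{-4}$.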
Complex symmetric matrices

A complex symmetric matrix can be 'diagonalized' using a unitary matrix: thus if $A$ is a complex symmetric matrix, there is a unitary matrix $U$ such that $U A U^\textsf{T}$ is a real diagonal matrix with non-negative entries. This result is referred to as the Autonne–Takagi factorization. It was originally proved by Léon Autonne (1915) and Teiji Takagi (1925) and rediscovered with different proofs by several other mathematicians. In fact, the matrix $B = A^\dagger A$ is Hermitian and positive semi-definite, so there is a unitary matrix $V$ such that $V^\dagger B V$ is diagonal with non-negative real entries. Thus $C = V^\textsf{T} A V$ is complex symmetric with $C^\dagger C$ real. Writing $C = X + iY$ with $X$ and $Y$ real symmetric matrices, $C^\dagger C = X^2 + Y^2 + i(XY - YX)$. Thus $XY = YX$. Since $X$ and $Y$ commute, there is a real orthogonal matrix $W$ such that both $W X W^\textsf{T}$ and $W Y W^\textsf{T}$ are diagonal. Setting $U = W V^\textsf{T}$ (a unitary matrix), the matrix $U A U^\textsf{T}$ is complex diagonal. Pre-multiplying $U$ by a suitable diagonal unitary matrix (which preserves unitarity of $U$), the diagonal entries of $U A U^\textsf{T}$ can be made to be real and non-negative as desired. To construct this matrix, we express the diagonal matrix as $U A U^\textsf{T} = \operatorname{diag}(r_1 e^{i\theta_1}, r_2 e^{i\theta_2}, \dots, r_n e^{i\theta_n})$. The matrix we seek is simply given by $D = \operatorname{diag}(e^{-i\theta_1/2}, e^{-i\theta_2/2}, \dots, e^{-i\theta_n/2})$. Clearly $D U A U^\textsf{T} D = \operatorname{diag}(r_1, r_2, \dots, r_n)$ as desired, so we make the modification $U' = DU$. Since their squares are the eigenvalues of $A^\dagger A$, the diagonal entries $r_i$ coincide with the singular values of $A$. (Note, regarding the eigen-decomposition of a complex symmetric matrix $A$, the Jordan normal form of $A$ may not be diagonal; therefore $A$ may not be diagonalizable by any similarity transformation.)
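The parenthetical caveat can be made concrete: the complex symmetric matrix $A = \begin{bmatrix}1&i\\i&-1\end{bmatrix}$ satisfies $A^2 = 0$ with $A \ne 0$, so it is nilpotent and not diagonalizable by any similarity transformation, even though it admits a Takagi factorization. A quick check with Python's built-in complex numbers:

```python
# A is complex symmetric: it equals its plain transpose (not its conjugate transpose).
A = [[1 + 0j, 1j],
     [1j, -1 + 0j]]

def matmul(p, q):
    """Multiply two 2x2 complex matrices given as nested lists."""
    return [[sum(p[i][k] * q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# A^2 is the zero matrix, so A is nilpotent: its only eigenvalue is 0,
# yet A != 0, hence A cannot be similar to any diagonal matrix.
print(matmul(A, A))  # [[0j, 0j], [0j, 0j]]
```

A diagonalizable matrix whose only eigenvalue is 0 would itself be the zero matrix, which is the contradiction this example exploits.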
Decomposition

Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.

Cholesky decomposition states that every real positive-definite symmetric matrix $A$ is a product of a lower-triangular matrix $L$ and its transpose,
$$A = LL^\textsf{T}.$$
If the matrix is symmetric indefinite, it may still be decomposed as $PAP^\textsf{T} = LDL^\textsf{T}$, where $P$ is a permutation matrix (arising from the need to pivot), $L$ a lower unit triangular matrix, and $D$ is a direct sum of symmetric $1 \times 1$ and $2 \times 2$ blocks; this is called the Bunch–Kaufman decomposition.

A general (complex) symmetric matrix may be defective and thus not be diagonalizable. If $A$ is diagonalizable it may be decomposed as
$$A = Q \Lambda Q^\textsf{T}$$
where $Q$ is an orthogonal matrix ($Q Q^\textsf{T} = I$) and $\Lambda$ is a diagonal matrix of the eigenvalues of $A$. In the special case that $A$ is real symmetric, then $Q$ and $\Lambda$ are also real. To see orthogonality, suppose $\mathbf{x}$ and $\mathbf{y}$ are eigenvectors corresponding to distinct eigenvalues $\lambda_1$, $\lambda_2$. Then
$$\lambda_1 \langle \mathbf{x}, \mathbf{y} \rangle = \langle A \mathbf{x}, \mathbf{y} \rangle = \langle \mathbf{x}, A \mathbf{y} \rangle = \lambda_2 \langle \mathbf{x}, \mathbf{y} \rangle.$$
Since $\lambda_1$ and $\lambda_2$ are distinct, we have $\langle \mathbf{x}, \mathbf{y} \rangle = 0$.

Hessian

Symmetric $n \times n$ matrices of real functions appear as the Hessians of twice differentiable functions of $n$ real variables (the continuity of the second derivative is not needed, despite common belief to the contrary). Every quadratic form $q$ on $\mathbb{R}^n$ can be uniquely written in the form $q(\mathbf{x}) = \mathbf{x}^\textsf{T} A \mathbf{x}$ with a symmetric $n \times n$ matrix $A$. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of $\mathbb{R}^n$, "looks like"
$$q\left(x_1, \ldots, x_n\right) = \sum_{i=1}^n \lambda_i x_i^2$$
with real numbers $\lambda_i$. This considerably simplifies the study of quadratic forms, as well as the study of the level sets $\left\{ \mathbf{x} : q(\mathbf{x}) = c \right\}$, which are generalizations of conic sections. This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.

Symmetrizable matrix

An $n \times n$ matrix $A$ is said to be symmetrizable if there exists an invertible diagonal matrix $D$ and symmetric matrix $S$ such that $A = DS$. The transpose of a symmetrizable matrix is symmetrizable, since $A^\textsf{T} = (DS)^\textsf{T} = SD = D^{-1}(DSD)$ and $DSD$ is symmetric. A matrix $A = (a_{ij})$ is symmetrizable if and only if the following conditions are met:
# $a_{ij} = 0$ implies $a_{ji} = 0$ for all $1 \le i \le j \le n.$
# $a_{i_1 i_2} a_{i_2 i_3} \dots a_{i_k i_1} = a_{i_2 i_1} a_{i_3 i_2} \dots a_{i_1 i_k}$ for any finite sequence $\left(i_1, i_2, \dots, i_k\right).$

See also

Other types of symmetry or pattern in square matrices have special names; see for example:
* Skew-symmetric matrix (also called ''antisymmetric'' or ''antimetric'')
* Centrosymmetric matrix
* Circulant matrix
* Covariance matrix
* Coxeter matrix
* GCD matrix
* Hankel matrix
* Hilbert matrix
* Persymmetric matrix
* Sylvester's law of inertia
* Toeplitz matrix
* Transpositions matrix
See also symmetry in mathematics.
External links

* A brief introduction and proof of eigenvalue properties of the real symmetric matrix

* How to implement a Symmetric Matrix in C++
