In mathematics, '''Hadamard's inequality''' (also known as '''Hadamard's theorem on determinants''') is a result first published by Jacques Hadamard in 1893.[Maz'ya & Shaposhnikova] It is a bound on the determinant of a matrix whose entries are complex numbers in terms of the lengths of its column vectors. In geometrical terms, when restricted to real numbers, it bounds the volume in Euclidean space of ''n'' dimensions marked out by ''n'' vectors ''v''<sub>''i''</sub> for 1 ≤ ''i'' ≤ ''n'' in terms of the lengths of these vectors ||''v''<sub>''i''</sub>||.
Specifically, Hadamard's inequality states that if ''N'' is the matrix having columns ''v''<sub>''i''</sub>, then
:<math>\left| \det(N) \right| \le \prod_{i=1}^n \|v_i\|.</math>
If the ''n'' vectors are non-zero, equality in Hadamard's inequality is achieved if and only if the vectors are orthogonal.
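As a quick numerical illustration (my own sketch, not part of the original article), the bound and its equality case can be checked with NumPy: the determinant of any square complex matrix is bounded by the product of its column lengths, with equality when the columns are orthogonal.

```python
import numpy as np

# Check Hadamard's inequality: |det(N)| <= product of column lengths.
rng = np.random.default_rng(0)
N = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

lhs = abs(np.linalg.det(N))
rhs = np.prod(np.linalg.norm(N, axis=0))  # Euclidean lengths of the columns
print(lhs <= rhs)  # the bound holds for any square complex matrix

# Equality case: orthogonal columns, e.g. a unitary matrix with scaled columns.
Q, _ = np.linalg.qr(N)            # Q is unitary
M = Q @ np.diag([1.0, 2.0, 3.0, 4.0])  # columns orthogonal, lengths 1..4
print(np.isclose(abs(np.linalg.det(M)),
                 np.prod(np.linalg.norm(M, axis=0))))
```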
==Alternate forms and corollaries==
A corollary is that if the entries of an ''n'' by ''n'' matrix ''N'' are bounded by ''B'', so |''N''<sub>''ij''</sub>| ≤ ''B'' for all ''i'' and ''j'', then
:<math>\left| \det(N) \right| \le B^n n^{n/2}.</math>
In particular, if the entries of ''N'' are +1 and −1 only, then
:<math>\left| \det(N) \right| \le n^{n/2}.</math>
In combinatorics, matrices ''N'' for which equality holds, i.e. those with orthogonal columns, are called
Hadamard matrices.
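For a concrete check (an illustration of mine, using the standard Sylvester construction), a 4 × 4 Hadamard matrix has ±1 entries, orthogonal columns, and attains the bound |det(''N'')| = ''n''<sup>''n''/2</sup>:

```python
import numpy as np

# Sylvester construction: H_{2n} = kron(H_2, H_n) is a Hadamard matrix.
H2 = np.array([[1,  1],
               [1, -1]])
H4 = np.kron(H2, H2)  # a 4x4 matrix with entries +1 and -1

n = 4
print(np.array_equal(H4.T @ H4, n * np.eye(n)))  # columns are orthogonal
print(abs(np.linalg.det(H4)))  # equals n^{n/2} = 16 up to rounding
```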
A positive-semidefinite matrix ''P'' can be written as ''N''*''N'', where ''N''* denotes the conjugate transpose of ''N'' (see decomposition of a semidefinite matrix). Then, if the columns of ''N'' are ''v''<sub>''i''</sub>,
:<math>\det(P) = \left| \det(N) \right|^2 \le \prod_{i=1}^n \|v_i\|^2 = \prod_{i=1}^n p_{ii}.</math>
So, the determinant of a
positive definite matrix is less than or equal to the product of its diagonal entries. Sometimes this is also known as Hadamard's inequality.
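This diagonal form of the inequality is easy to verify numerically (again my own sketch, not from the source): build a positive-semidefinite matrix as ''N''*''N'' and compare its determinant with the product of its diagonal entries.

```python
import numpy as np

# Hadamard's inequality for positive-semidefinite matrices:
# det(P) <= product of the diagonal entries of P.
rng = np.random.default_rng(1)
N = rng.standard_normal((5, 5))
P = N.T @ N  # positive semidefinite by construction

print(np.linalg.det(P) <= np.prod(np.diag(P)))  # the bound holds
```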
==Proof==
The result is trivial if the matrix ''N'' is singular, so assume the columns of ''N'' are linearly independent. By dividing each column by its length, it can be seen that the result is equivalent to the special case where each column has length 1. In other words, if ''e''<sub>''i''</sub> are unit vectors and ''M'' is the matrix having the ''e''<sub>''i''</sub> as columns, then
:<math>\left| \det(M) \right| \le 1,</math>
and equality is achieved if and only if the vectors are an orthogonal set. The general result now follows:
:<math>\left| \det(N) \right| = \left(\prod_{i=1}^n \|v_i\|\right) \left| \det(M) \right| \le \prod_{i=1}^n \|v_i\|.</math>
To prove the bound <math>\left| \det(M) \right| \le 1</math>, consider ''P'' = ''M''*''M'' and let the eigenvalues of ''P'' be λ<sub>1</sub>, λ<sub>2</sub>, …, λ<sub>''n''</sub>. Since the length of each column of ''M'' is 1, each entry in the diagonal of ''P'' is 1, so the trace of ''P'' is ''n''. Applying the inequality of arithmetic and geometric means,
:<math>\prod_{i=1}^n \lambda_i \le \left( \frac{1}{n} \sum_{i=1}^n \lambda_i \right)^n = 1,</math>
so
:<math>\left| \det(M) \right| = \sqrt{\det(P)} = \sqrt{\prod_{i=1}^n \lambda_i} \le 1.</math>
If there is equality, then the ''λ''<sub>''i''</sub> must all be equal, and since their sum is ''n'', they must each equal 1. The matrix ''P'' is Hermitian, therefore diagonalizable, so it is the identity matrix; in other words, the columns of ''M'' are an orthonormal set and the columns of ''N'' are an orthogonal set.[Proof follows, with minor modifications, the second proof given in Maz'ya & Shaposhnikova.] Many other proofs can be found in the literature.
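The steps of the proof can be walked through numerically (a sketch of my own): normalize the columns of a random matrix, form ''P'' = ''M''*''M'', and check that the trace is ''n'' and that the AM–GM step bounds the determinant.

```python
import numpy as np

# Numerical walk-through of the proof's key steps.
rng = np.random.default_rng(2)
n = 4
N = rng.standard_normal((n, n))
M = N / np.linalg.norm(N, axis=0)  # divide each column by its length

P = M.T @ M                  # P = M*M, Hermitian with unit diagonal
eigs = np.linalg.eigvalsh(P)  # eigenvalues lambda_1 .. lambda_n

print(np.isclose(np.trace(P), n))            # tr(P) = n, so mean eigenvalue is 1
print(np.prod(eigs) <= eigs.mean() ** n)     # AM-GM: prod(lambda) <= 1
print(abs(np.linalg.det(M)) <= 1.0)          # the special-case bound |det(M)| <= 1
```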
==See also==
*Fischer's inequality – an upper bound for the determinant of a positive-semidefinite matrix whose entries are complex numbers, in terms of the determinants of its principal diagonal blocks
==Notes==
==References==
{{DEFAULTSORT:Hadamard's Inequality}}
[[Category:Inequalities]]
[[Category:Determinants]]