Skew-Hermitian Matrix
In linear algebra, a square matrix with complex entries is said to be skew-Hermitian or anti-Hermitian if its conjugate transpose is the negative of the original matrix. That is, the matrix A is skew-Hermitian if it satisfies the relation

A^\mathsf{H} = -A,

where A^\mathsf{H} denotes the conjugate transpose of the matrix A. In component form, this means that

a_{ij} = -\overline{a_{ji}}

for all indices i and j, where a_{ij} is the element in the i-th row and j-th column of A, and the overline denotes complex conjugation. Skew-Hermitian matrices can be understood as the complex versions of real skew-symmetric matrices, or as the matrix analogue of the purely imaginary numbers. The set of all skew-Hermitian n \times n matrices forms the Lie algebra \mathfrak{u}(n), which corresponds to the Lie group U(n). The concept can be generalized to include linear transformations of any complex vector space with a sesquilinear norm. Note that the adjoint of an operator depends on the scalar product considered on the underlying n-dimensional complex or real space.
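As a concrete check, here is a minimal NumPy sketch (the matrix is an arbitrary example chosen to satisfy the definition):

```python
import numpy as np

# An example skew-Hermitian matrix: purely imaginary diagonal entries,
# and a_ij = -conj(a_ji) off the diagonal.
A = np.array([[1j, 2 + 3j],
              [-2 + 3j, -4j]])

# The defining relation: the conjugate transpose equals the negative.
assert np.allclose(A.conj().T, -A)

# Eigenvalues of a skew-Hermitian matrix are purely imaginary,
# mirroring the analogy with purely imaginary numbers.
print(np.linalg.eigvals(A))  # real parts are numerically zero
```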
Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as a_1 x_1 + \cdots + a_n x_n = b, linear maps such as (x_1, \ldots, x_n) \mapsto a_1 x_1 + \cdots + a_n x_n, and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, it is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to function spaces. Linear algebra is also used in most sciences and fields of engineering, because it allows modeling many natural phenomena and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, linear algebra is often used for dealing with first-order approximations.
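To make the definition of a linear map concrete, here is a short NumPy sketch (the coefficients a_i are arbitrary illustrative values) showing that the map (x_1, \ldots, x_n) \mapsto a_1 x_1 + \cdots + a_n x_n respects addition and scaling:

```python
import numpy as np

a = np.array([2.0, -1.0, 3.0])   # coefficients a_1, a_2, a_3
f = lambda x: a @ x              # the linear map x |-> a_1 x_1 + a_2 x_2 + a_3 x_3

x = np.array([1.0, 0.0, 2.0])
y = np.array([0.0, 1.0, 1.0])

# Linearity: f(x + y) = f(x) + f(y) and f(c x) = c f(x).
assert np.isclose(f(x + y), f(x) + f(y))
assert np.isclose(f(5.0 * x), 5.0 * f(x))
```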
Self-adjoint
In mathematics, an element of a *-algebra is called self-adjoint if it is the same as its adjoint (i.e. a = a^*).

Definition
Let \mathcal{A} be a *-algebra. An element a \in \mathcal{A} is called self-adjoint if a = a^*. The set of self-adjoint elements is referred to as \mathcal{A}_{sa}. A subset \mathcal{B} \subseteq \mathcal{A} that is closed under the involution *, i.e. \mathcal{B} = \mathcal{B}^*, is called self-adjoint. A special case of particular importance is the case where \mathcal{A} is a complete normed *-algebra that satisfies the C*-identity (\| a^* a \| = \| a \|^2 for all a \in \mathcal{A}), which is called a C*-algebra. Especially in the older literature on *-algebras and C*-algebras, such elements are often called Hermitian. Because of that, the notations \mathcal{A}_h, \mathcal{A}_H or H(\mathcal{A}) for the set of self-adjoint elements are also sometimes used, even in the more recent literature.

Examples
* Each positive element of a C*-algebra is self-adjoint.
* For each element a of a *-algebra, the elements a a^* and a^* a are self-adjoint.
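The n \times n complex matrices with the conjugate transpose as involution form a *-algebra (indeed a C*-algebra), so the definitions can be checked concretely. A minimal NumPy sketch (the matrix a is an arbitrary example):

```python
import numpy as np

star = lambda a: a.conj().T   # the involution * on the algebra of complex matrices

a = np.array([[1.0, 2j],
              [0.5, 1j]])

# a*a and aa* are always self-adjoint, even when a itself is not.
for x in (star(a) @ a, a @ star(a)):
    assert np.allclose(x, star(x))

print(np.allclose(a, star(a)))  # False: this particular a is not self-adjoint
```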
Normal Matrix
In mathematics, a complex square matrix A is normal if it commutes with its conjugate transpose A^*:

A \text{ normal} \iff A^* A = A A^*.

The concept of normal matrices can be extended to normal operators on infinite-dimensional normed spaces and to normal elements in C*-algebras. As in the matrix case, normality means commutativity is preserved, to the extent possible, in the noncommutative setting. This makes normal operators, and normal elements of C*-algebras, more amenable to analysis. The spectral theorem states that a matrix is normal if and only if it is unitarily similar to a diagonal matrix, and therefore any matrix satisfying the equation A^* A = A A^* is diagonalizable. (The converse does not hold, because diagonalizable matrices may have non-orthogonal eigenspaces.) Thus A = U D U^* and A^* = U D^* U^*, where D is a diagonal matrix whose diagonal values are in general complex. The left and right singular vectors in the singular value decomposition of a normal matrix A = U D V^* differ only by a complex phase from each other and from the corresponding eigenvectors, since the phase of each eigenvalue must be factored out to obtain the nonnegative singular values.
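A NumPy sketch of both the defining relation and the spectral theorem (the matrix is an arbitrary normal example; its eigenvalues are distinct, so the normalized eigenvectors returned by eig are already orthonormal):

```python
import numpy as np

A = np.array([[1j, 1 + 1j],
              [-1 + 1j, 2j]])   # skew-Hermitian, hence normal

# Normality: A commutes with its conjugate transpose.
assert np.allclose(A.conj().T @ A, A @ A.conj().T)

# Spectral theorem: A = U D U* with U unitary and D diagonal.
d, U = np.linalg.eig(A)
assert np.allclose(U.conj().T @ U, np.eye(2))        # U is unitary
assert np.allclose(U @ np.diag(d) @ U.conj().T, A)   # A = U D U*
```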
Bivector (complex)
In mathematics, a bivector is the vector part of a biquaternion. For a biquaternion q = w + x\mathrm{i} + y\mathrm{j} + z\mathrm{k}, w is called the biscalar and x\mathrm{i} + y\mathrm{j} + z\mathrm{k} is its bivector part. The coordinates w, x, y, z are complex numbers with imaginary unit h:

x = x_1 + \mathrm{h} x_2,\ y = y_1 + \mathrm{h} y_2,\ z = z_1 + \mathrm{h} z_2, \quad \mathrm{h}^2 = -1 = \mathrm{i}^2 = \mathrm{j}^2 = \mathrm{k}^2.

A bivector may be written as the sum of real and imaginary parts:

(x_1 \mathrm{i} + y_1 \mathrm{j} + z_1 \mathrm{k}) + \mathrm{h} (x_2 \mathrm{i} + y_2 \mathrm{j} + z_2 \mathrm{k}),

where r_1 = x_1 \mathrm{i} + y_1 \mathrm{j} + z_1 \mathrm{k} and r_2 = x_2 \mathrm{i} + y_2 \mathrm{j} + z_2 \mathrm{k} are vectors. Thus the bivector q = x\mathrm{i} + y\mathrm{j} + z\mathrm{k} = r_1 + \mathrm{h} r_2. The Lie algebra of the Lorentz group is expressed by bivectors. In particular, if r_1 and r_2 are right versors, so that r_1^2 = -1 = r_2^2, then the biquaternion curve \{e^{\theta r_1} : \theta \in \mathbb{R}\} traces over and over the unit circle in the plane \{x + y r_1 : x, y \in \mathbb{R}\}.
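The relations among h, i, j, k can be verified in a matrix model. One standard identification (a sketch, using the fact that the biquaternions are isomorphic to the 2 \times 2 complex matrices, with h acting as the complex scalar i and the quaternion units taken as -i times the Pauli matrices):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
s1 = np.array([[0, 1], [1, 0]], dtype=complex)     # Pauli matrices
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

h = 1j * I2                               # h acts as the complex scalar i
i, j, k = -1j * s1, -1j * s2, -1j * s3    # quaternion units

for u in (h, i, j, k):
    assert np.allclose(u @ u, -I2)   # h^2 = i^2 = j^2 = k^2 = -1
assert np.allclose(i @ j, k)         # quaternion multiplication rule
assert np.allclose(h @ i, i @ h)     # h commutes with the vector units
```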
Commutator
In mathematics, the commutator gives an indication of the extent to which a certain binary operation fails to be commutative. There are different definitions used in group theory and ring theory.

Group theory
The commutator of two elements, g and h, of a group G, is the element [g, h] = g^{-1} h^{-1} g h. This element is equal to the group's identity if and only if g and h commute (that is, if and only if gh = hg). The set of all commutators of a group is not in general closed under the group operation, but the subgroup of G generated by all commutators is closed and is called the derived group or the commutator subgroup of G. Commutators are used to define nilpotent and solvable groups and the largest abelian quotient group. The definition of the commutator above is used throughout this article, but many group theorists define the commutator as [g, h] = g h g^{-1} h^{-1}. Using the first definition, this can be expressed as [g^{-1}, h^{-1}].

Identities (group theory)
Commutator identities are an important tool in group theory.
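The invertible matrices form a group, so the group commutator can be demonstrated numerically. A minimal NumPy sketch (the matrices are arbitrary examples, using the first convention above):

```python
import numpy as np

def commutator(g, h):
    """Group commutator [g, h] = g^-1 h^-1 g h."""
    return np.linalg.inv(g) @ np.linalg.inv(h) @ g @ h

g = np.array([[1.0, 1.0], [0.0, 1.0]])   # a shear
h = np.array([[2.0, 0.0], [0.0, 1.0]])   # a stretch
print(commutator(g, h))                  # not the identity: g and h do not commute

d = np.diag([3.0, 4.0])                  # diagonal matrices commute with h
assert np.allclose(commutator(d, h), np.eye(2))
```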
Lie Group
In mathematics, a Lie group (pronounced "Lee") is a group that is also a differentiable manifold, such that group multiplication and taking inverses are both differentiable. A manifold is a space that locally resembles Euclidean space, whereas groups define the abstract concept of a binary operation along with the additional properties it must have to be thought of as a "transformation" in the abstract sense, for instance multiplication and the taking of inverses (to allow division), or, equivalently, the concept of addition and subtraction. Combining these two ideas, one obtains a continuous group where multiplying points and their inverses is continuous. If the multiplication and taking of inverses are smooth (differentiable) as well, one obtains a Lie group. Lie groups provide a natural model for the concept of continuous symmetry, a celebrated example of which is the circle group. Rotating a circle is an example of a continuous symmetry.
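The circle group can be made concrete as the 2 \times 2 rotation matrices, where both multiplication and inversion are visibly smooth in the angle parameter. A minimal NumPy sketch (angles chosen arbitrarily):

```python
import numpy as np

def rot(theta):
    """Rotation by angle theta: an element of the circle group SO(2)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

a, b = 0.7, 1.1
# Group multiplication is smooth in the parameters: R(a) R(b) = R(a + b).
assert np.allclose(rot(a) @ rot(b), rot(a + b))
# So is inversion: R(a)^-1 = R(-a).
assert np.allclose(np.linalg.inv(rot(a)), rot(-a))
```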
Unitary Matrix
In linear algebra, an invertible complex square matrix U is unitary if its inverse U^{-1} equals its conjugate transpose U^*, that is, if U^* U = U U^* = I, where I is the identity matrix. In physics, especially in quantum mechanics, the conjugate transpose is referred to as the Hermitian adjoint of a matrix and is denoted by a dagger (\dagger), so the equation above is written U^\dagger U = U U^\dagger = I. A complex matrix is special unitary if it is unitary and its determinant equals 1. For real numbers, the analogue of a unitary matrix is an orthogonal matrix. Unitary matrices have significant importance in quantum mechanics because they preserve norms, and thus, probability amplitudes.

Properties
For any unitary matrix U of finite size, the following hold:
* Given two complex vectors x and y, multiplication by U preserves their inner product; that is, \langle Ux, Uy \rangle = \langle x, y \rangle.
* U is normal (U^* U = U U^*).
* U is diagonalizable; that is, U is unitarily similar to a diagonal matrix, as a consequence of the spectral theorem.
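These properties can be checked numerically. A minimal NumPy sketch (the unitary matrix is an arbitrary example, a real rotation times a unit-modulus phase):

```python
import numpy as np

theta = 0.3
U = np.exp(0.5j) * np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]])

assert np.allclose(U.conj().T @ U, np.eye(2))   # U* U = I

x = np.array([1.0 + 2j, -3j])
y = np.array([0.5, 1j])
# Multiplication by U preserves inner products, hence norms.
assert np.isclose(np.vdot(U @ x, U @ y), np.vdot(x, y))
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
```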
Matrix Exponential
In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function. It is used to solve systems of linear differential equations. In the theory of Lie groups, the matrix exponential gives the exponential map between a matrix Lie algebra and the corresponding Lie group. Let X be an n \times n real or complex matrix. The exponential of X, denoted by e^X or \exp(X), is the n \times n matrix given by the power series

e^X = \sum_{k=0}^{\infty} \frac{1}{k!} X^k,

where X^0 is defined to be the identity matrix I with the same dimensions as X. The series always converges, so the exponential of X is well-defined. Equivalently,

e^X = \lim_{k \to \infty} \left(I + \frac{X}{k}\right)^k

for integer-valued k, where I is the identity matrix. Equivalently, e^{tX} is given by the solution to the differential equation

\frac{d}{dt} e^{tX} = X e^{tX}, \quad e^{0 X} = I.

When X is an n \times n diagonal matrix, then e^X will be an n \times n diagonal matrix with each diagonal entry equal to the ordinary exponential of the corresponding diagonal entry of X.
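The power series can be compared directly against a library implementation. A minimal sketch (assumes SciPy is available; the generator X is an arbitrary skew-symmetric example, so its exponential is a rotation):

```python
import numpy as np
from scipy.linalg import expm

def exp_series(X, terms=30):
    """Partial sum of the power series e^X = sum_k X^k / k!."""
    result = np.eye(len(X))
    term = np.eye(len(X))
    for k in range(1, terms):
        term = term @ X / k          # X^k / k!, built incrementally
        result = result + term
    return result

X = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # skew-symmetric generator of rotations
assert np.allclose(exp_series(X), expm(X))
print(expm(X))                       # rotation by 1 radian
```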
Symmetric Matrix
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally,

A \text{ symmetric} \iff A = A^\mathsf{T}.

Because equal matrices have equal dimensions, only square matrices can be symmetric. The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if a_{ij} denotes the entry in the i-th row and j-th column, then A is symmetric exactly when a_{ij} = a_{ji} for all indices i and j. Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative. In linear algebra, a real symmetric matrix represents a self-adjoint operator represented in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one that has real-valued entries.
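A minimal NumPy sketch (an arbitrary real symmetric example) of the definition and of the self-adjointness it encodes:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 4.0],
              [0.0, 4.0, 5.0]])
assert np.allclose(A, A.T)     # symmetric: a_ij = a_ji

# As a self-adjoint operator, A has real eigenvalues and an
# orthonormal eigenbasis; eigh exploits the symmetry.
w, Q = np.linalg.eigh(A)
assert np.allclose(Q @ np.diag(w) @ Q.T, A)
```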
Skew-symmetric Matrix
In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition A^\mathsf{T} = -A. In terms of the entries of the matrix, if a_{ij} denotes the entry in the i-th row and j-th column, then the skew-symmetric condition is equivalent to a_{ji} = -a_{ij} for all i and j.

Example
The matrix

A = \begin{pmatrix} 0 & 2 & -45 \\ -2 & 0 & -4 \\ 45 & 4 & 0 \end{pmatrix}

is skew-symmetric because

A^\mathsf{T} = \begin{pmatrix} 0 & -2 & 45 \\ 2 & 0 & 4 \\ -45 & -4 & 0 \end{pmatrix} = -A.

Properties
Throughout, we assume that all matrix entries belong to a field \mathbb{F} whose characteristic is not equal to 2. That is, we assume that 1 + 1 \neq 0, where 1 denotes the multiplicative identity and 0 the additive identity of the given field. If the characteristic of the field is 2, then a skew-symmetric matrix is the same thing as a symmetric matrix. Both of the following properties are verified in the sketch below:
* The sum of two skew-symmetric matrices is skew-symmetric.
* A scalar multiple of a skew-symmetric matrix is skew-symmetric.
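A minimal NumPy sketch of the worked example and the two listed properties (B is an arbitrary second skew-symmetric matrix):

```python
import numpy as np

A = np.array([[  0,  2, -45],
              [ -2,  0,  -4],
              [ 45,  4,   0]])
assert np.array_equal(A.T, -A)       # the skew-symmetric condition
assert np.all(np.diag(A) == 0)       # diagonal entries are their own negatives

B = np.array([[0, 1, 0], [-1, 0, 2], [0, -2, 0]])
assert np.array_equal((A + B).T, -(A + B))   # sums stay skew-symmetric
assert np.array_equal((3 * A).T, -(3 * A))   # so do scalar multiples
```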
Hermitian Matrix
In mathematics, a Hermitian matrix (or self-adjoint matrix) is a complex square matrix that is equal to its own conjugate transpose, that is, the element in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th row and i-th column, for all indices i and j:

A \text{ Hermitian} \quad \iff \quad a_{ij} = \overline{a_{ji}},

or in matrix form:

A \text{ Hermitian} \quad \iff \quad A = \overline{A^\mathsf{T}}.

Hermitian matrices can be understood as the complex extension of real symmetric matrices. If the conjugate transpose of a matrix A is denoted by A^\mathsf{H}, then the Hermitian property can be written concisely as

A \text{ Hermitian} \quad \iff \quad A = A^\mathsf{H}.

Hermitian matrices are named after Charles Hermite, who demonstrated in 1855 that matrices of this form share a property with real symmetric matrices of always having real eigenvalues. Other, equivalent notations in common use are A^\mathsf{H} = A^\dagger = A^\ast, although in quantum mechanics, A^\ast typically means the complex conjugate only, and not the conjugate transpose.
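A minimal NumPy sketch (an arbitrary Hermitian example) of the definition, the real-eigenvalue property, and the link to the skew-Hermitian matrices above:

```python
import numpy as np

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(H, H.conj().T)    # Hermitian: a_ij = conj(a_ji)

# Hermite's observation: the eigenvalues are real.
print(np.linalg.eigvalsh(H))         # [1. 4.]

# Multiplying by i turns a Hermitian matrix into a skew-Hermitian one.
assert np.allclose((1j * H).conj().T, -(1j * H))
```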
Scalar (mathematics)
A scalar is an element of a field which is used to define a vector space. In linear algebra, real numbers or generally elements of a field are called scalars and relate to vectors in an associated vector space through the operation of scalar multiplication (defined in the vector space), in which a vector can be multiplied by a scalar in the defined way to produce another vector. Generally speaking, a vector space may be defined by using any field instead of real numbers (such as complex numbers). Then scalars of that vector space will be elements of the associated field (such as complex numbers). A scalar product operation, not to be confused with scalar multiplication, may be defined on a vector space, allowing two vectors to be multiplied in the defined way to produce a scalar. A vector space equipped with a scalar product is called an inner product space. A quantity described by multiple scalars, such as having both direction and magnitude, is called a vector.
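The distinction the paragraph draws, scalar multiplication versus the scalar product, in a minimal NumPy sketch (arbitrary values):

```python
import numpy as np

v = np.array([1.0, -2.0, 0.5])
c = 3.0                   # a scalar from the underlying field (the reals)

print(c * v)              # scalar multiplication: scalar x vector -> vector
print(np.dot(v, v))       # scalar (inner) product: vector x vector -> scalar
```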