Square Matrix
In mathematics, a square matrix is a matrix with the same number of rows and columns. An ''n''-by-''n'' matrix is known as a square matrix of order ''n''. Any two square matrices of the same order can be added and multiplied. Square matrices are often used to represent simple linear transformations, such as shearing or rotation. For example, if R is a square matrix representing a rotation (a rotation matrix) and \mathbf{v} is a column vector describing the position of a point in space, the product R\mathbf{v} yields another column vector describing the position of that point after the rotation. If \mathbf{v} is a row vector, the same transformation can be obtained using \mathbf{v}R^\textsf{T}, where R^\textsf{T} is the transpose of R.

Main diagonal

The entries a_{ii} (''i'' = 1, …, ''n'') form the main diagonal of a square matrix. They lie on the imaginary line which runs from the top left corner to the bottom right corner of the matrix. For instance, the main diagonal of a 4×4 matrix contains the elements a_{11}, a_{22}, a_{33}, a_{44}.
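A minimal sketch (assuming NumPy, which the text does not mention) of the rotation example above: a 2×2 rotation matrix applied to a column vector, with the row-vector form \mathbf{v}R^\textsf{T} giving the same point. The angle and the point are illustrative choices.

```python
import numpy as np

theta = np.pi / 2                        # rotate by 90 degrees (illustrative)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([[1.0],                     # column vector describing a point
              [0.0]])

print(R @ v)             # rotated point: approximately [[0], [1]]
print((v.T @ R.T).T)     # row-vector form v R^T yields the same point
```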
Arbitrary Square Matrix
Arbitrariness is the quality of being "determined by chance, whim, or impulse, and not by necessity, reason, or principle". It is also used to refer to a choice made without any specific criterion or restraint. Arbitrary decisions are not necessarily the same as random decisions. For example, during the 1973 oil crisis, Americans were allowed to purchase gasoline only on odd-numbered days if their license plate was odd, and on even-numbered days if their license plate was even. The system was well-defined and not random in its restrictions; however, since license plate numbers are completely unrelated to a person's fitness to purchase gasoline, it was still an arbitrary division of people. Similarly, schoolchildren are often organized by surname in alphabetical order, a non-random yet arbitrary method, at least in cases where surnames are irrelevant.

Philosophy

Arbitrary actions are closely related to teleology, the study of purpose. Actions lacking a ''telos'', a goal, are necessarily arbitrary.
Identity Matrix
In linear algebra, the identity matrix of size n is the n\times n square matrix with ones on the main diagonal and zeros elsewhere.

Terminology and notation

The identity matrix is often denoted by I_n, or simply by I if the size is immaterial or can be trivially determined by the context.

I_1 = \begin{bmatrix} 1 \end{bmatrix} ,\ I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} ,\ I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} ,\ \dots ,\ I_n = \begin{bmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{bmatrix}.

The term unit matrix has also been widely used, but the term ''identity matrix'' is now standard. The term ''unit matrix'' is ambiguous, because it is also used for a matrix of ones and for any unit of the ring of all n\times n matrices. In some fields, such as group theory or quantum mechanics, the identity matrix is sometimes denoted by a boldface one, \mathbf{1}, or called "id" (short for identity).
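A minimal sketch (assuming NumPy) of the defining property: multiplying any conformable matrix by I_n leaves it unchanged.

```python
import numpy as np

I3 = np.eye(3)                     # the 3x3 identity matrix I_3
A = np.arange(9.0).reshape(3, 3)   # an arbitrary 3x3 matrix

assert np.allclose(I3 @ A, A)      # I A = A
assert np.allclose(A @ I3, A)      # A I = A
print(I3)
```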
Indefinite Matrix
In mathematics, a symmetric matrix M with real entries (or, more generally, a Hermitian matrix) is called indefinite if it is neither positive semi-definite nor negative semi-definite; that is, the scalar z^\textsf{T} M z takes both positive and negative values as z ranges over the nonzero real column vectors. See the entry on positive definite matrices below for the full definitions of the definiteness classes.
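A minimal sketch (assuming NumPy) of checking indefiniteness: a symmetric matrix is indefinite exactly when its eigenvalues have both signs, equivalently when z^\textsf{T} M z changes sign. The matrix here is an illustrative choice.

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [2.0, 1.0]])           # symmetric; eigenvalues are 3 and -1

eigenvalues = np.linalg.eigvalsh(M)  # eigvalsh is for symmetric/Hermitian input
print(eigenvalues)                   # [-1.  3.] -> both signs: indefinite

z_pos = np.array([1.0, 1.0])
z_neg = np.array([1.0, -1.0])
print(z_pos @ M @ z_pos)             # 6.0  (> 0)
print(z_neg @ M @ z_neg)             # -2.0 (< 0)
```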
Positive Definite Matrix
In mathematics, a symmetric matrix M with real entries is positive-definite if the real number z^\textsf{T} M z is positive for every nonzero real column vector z, where z^\textsf{T} is the transpose of z. More generally, a Hermitian matrix (that is, a complex matrix equal to its conjugate transpose) is positive-definite if the real number z^* M z is positive for every nonzero complex column vector z, where z^* denotes the conjugate transpose of z. Positive semi-definite matrices are defined similarly, except that the scalars z^\textsf{T} M z and z^* M z are required to be positive ''or zero'' (that is, nonnegative). Negative-definite and negative semi-definite matrices are defined analogously. A matrix that is not positive semi-definite and not negative semi-definite is sometimes called indefinite. A matrix is thus positive-definite if and only if it is the matrix of a positive-definite quadratic form or Hermitian form.
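A minimal sketch (assuming NumPy): one common test for positive-definiteness of a symmetric matrix is attempting a Cholesky factorization, which exists exactly when the matrix is positive-definite. The helper name and the example matrix are illustrative.

```python
import numpy as np

def is_positive_definite(M):
    """Return True if the symmetric matrix M is positive-definite."""
    try:
        np.linalg.cholesky(M)        # succeeds only for positive-definite M
        return True
    except np.linalg.LinAlgError:
        return False

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # eigenvalues 3 and 1, both positive

print(is_positive_definite(M))       # True
z = np.array([1.0, -1.0])
print(z @ M @ z)                     # z^T M z = 2.0 > 0 for this nonzero z
```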
Eigenbasis
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by \lambda, is the factor by which the eigenvector is scaled. Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.

Formal definition

If T is a linear transformation from a vector space V over a field F into itself and \mathbf{v} is a nonzero vector in V, then \mathbf{v} is an eigenvector of T if T(\mathbf{v}) is a scalar multiple of \mathbf{v}. This can be written as T(\mathbf{v}) = \lambda \mathbf{v}, where \lambda is a scalar in F, known as the eigenvalue, characteristic value, or characteristic root associated with \mathbf{v}.
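A minimal sketch (assuming NumPy) of the defining relation: compute an eigenpair and verify T(\mathbf{v}) = \lambda \mathbf{v}, here with T given by a matrix of illustrative values.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns are eigenvectors
lam = eigenvalues[0]
v = eigenvectors[:, 0]

assert np.allclose(A @ v, lam * v)   # the defining property A v = lambda v
print(lam, v)
```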
Spectral Theorem
In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized (that is, represented as a diagonal matrix in some basis). This is extremely useful because computations involving a diagonalizable matrix can often be reduced to much simpler computations involving the corresponding diagonal matrix. The concept of diagonalization is relatively straightforward for operators on finite-dimensional vector spaces but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modeled by multiplication operators, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative C*-algebras. See also spectral theory for a historical perspective.
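A minimal sketch (assuming NumPy) of the finite-dimensional spectral theorem for real symmetric matrices: A factors as Q D Q^\textsf{T}, with the columns of the orthogonal matrix Q forming an orthonormal eigenbasis and D diagonal. The matrix is an illustrative choice.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # real symmetric

eigenvalues, Q = np.linalg.eigh(A)      # orthonormal eigenvectors in columns of Q
D = np.diag(eigenvalues)

assert np.allclose(Q @ D @ Q.T, A)      # A = Q D Q^T: A is diagonalized
assert np.allclose(Q.T @ Q, np.eye(2))  # Q is orthogonal
print(eigenvalues)                      # [1. 3.]
```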
Skew-Hermitian Matrix
In linear algebra, a square matrix with complex entries is said to be skew-Hermitian or anti-Hermitian if its conjugate transpose is the negative of the original matrix. That is, the matrix A is skew-Hermitian if it satisfies the relation A^\textsf{H} = -A, where A^\textsf{H} denotes the conjugate transpose of the matrix A. In component form, this means that \overline{a_{ji}} = -a_{ij} for all indices i and j, where a_{ji} is the element in the j-th row and i-th column of A, and the overline denotes complex conjugation. Skew-Hermitian matrices can be understood as the complex versions of real skew-symmetric matrices, or as the matrix analogue of the purely imaginary numbers. The set of all skew-Hermitian n \times n matrices forms the u(n) Lie algebra, which corresponds to the Lie group U(n). The concept can be generalized to include linear transformations of any complex vector space with a sesquilinear norm. Note that the adjoint of an operator depends on the scalar product considered on the n-dimensional complex space.
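A minimal sketch (assuming NumPy) of the defining relation: the conjugate transpose equals the negative of the matrix. Note the diagonal entries of the illustrative matrix are purely imaginary, as the skew-Hermitian condition forces.

```python
import numpy as np

A = np.array([[1j,      2 + 1j],
              [-2 + 1j, 3j    ]])      # diagonal entries are purely imaginary

A_H = A.conj().T                       # conjugate transpose A^H
print(np.allclose(A_H, -A))            # True: A is skew-Hermitian
```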
Hermitian Matrix
In mathematics, a Hermitian matrix (or self-adjoint matrix) is a complex square matrix that is equal to its own conjugate transpose; that is, the element in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th row and i-th column, for all indices i and j: a_{ij} = \overline{a_{ji}}, or in matrix form: A \text{ Hermitian} \quad \iff \quad A = \overline{A^\textsf{T}}. Hermitian matrices can be understood as the complex extension of real symmetric matrices. If the conjugate transpose of a matrix A is denoted by A^\textsf{H}, then the Hermitian property can be written concisely as A^\textsf{H} = A. Hermitian matrices are named after Charles Hermite, who demonstrated in 1855 that matrices of this form share a property with real symmetric matrices of always having real eigenvalues. Other, equivalent notations in common use are A^\textsf{H} = A^\dagger = A^\ast, although note that in quantum mechanics, A^\ast typically means the complex conjugate only, and not the conjugate transpose.
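A minimal sketch (assuming NumPy): a Hermitian matrix equals its conjugate transpose, and, as the entry notes, its eigenvalues are real. The matrix is an illustrative choice.

```python
import numpy as np

A = np.array([[2,      2 + 1j],
              [2 - 1j, 3     ]])       # equal to its own conjugate transpose

assert np.allclose(A, A.conj().T)      # the Hermitian property A^H = A
print(np.linalg.eigvalsh(A))           # real eigenvalues: about [0.21, 4.79]
```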
Complex Conjugate
In mathematics, the complex conjugate of a complex number is the number with an equal real part and an imaginary part equal in magnitude but opposite in sign. That is, (if a and b are real, then) the complex conjugate of a + bi is equal to a - bi. The complex conjugate of z is often denoted as \overline{z} or z^*. In polar form, the conjugate of r e^{i\varphi} is r e^{-i\varphi}. This can be shown using Euler's formula. The product of a complex number and its conjugate is a real number: a^2 + b^2 (or r^2 in polar coordinates). If a root of a univariate polynomial with real coefficients is complex, then its complex conjugate is also a root.

Notation

The complex conjugate of a complex number z is written as \overline{z} or z^*. The first notation, a vinculum, avoids confusion with the notation for the conjugate transpose of a matrix, which can be thought of as a generalization of the complex conjugate. The second is preferred in physics, where the dagger (†) is used for the conjugate transpose.
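A minimal sketch in plain Python (no external library needed): conjugating a complex number and checking that z times its conjugate equals a^2 + b^2.

```python
z = 3 + 4j

z_conj = z.conjugate()      # the complex conjugate, (3-4j)
print(z_conj)
print(z * z_conj)           # (25+0j), i.e. 3**2 + 4**2, a real number
```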
Conjugate Transpose
In mathematics, the conjugate transpose, also known as the Hermitian transpose, of an m \times n complex matrix \boldsymbol{A} is an n \times m matrix obtained by transposing \boldsymbol{A} and applying complex conjugation to each entry (the complex conjugate of a+ib being a-ib, for real numbers a and b). It is often denoted as \boldsymbol{A}^\mathrm{H} or \boldsymbol{A}^* or \boldsymbol{A}'. H. W. Turnbull, A. C. Aitken, "An Introduction to the Theory of Canonical Matrices," 1932. For real matrices, the conjugate transpose is just the transpose, \boldsymbol{A}^\mathrm{H} = \boldsymbol{A}^\mathsf{T}.

Definition

The conjugate transpose of an m \times n matrix \boldsymbol{A} is formally defined by \left(\boldsymbol{A}^\mathrm{H}\right)_{ij} = \overline{\boldsymbol{A}_{ji}}, where the subscript ij denotes the (i,j)-th entry, for 1 \le i \le n and 1 \le j \le m, and the overbar denotes a scalar complex conjugate. This definition can also be written as \boldsymbol{A}^\mathrm{H} = \left(\overline{\boldsymbol{A}}\right)^\mathsf{T} = \overline{\boldsymbol{A}^\mathsf{T}}, where \boldsymbol{A}^\mathsf{T} denotes the transpose and \overline{\boldsymbol{A}} denotes the matrix with complex conjugated entries.
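A minimal sketch (assuming NumPy) of the definition: transpose, then conjugate each entry, with a check that the two operations can be applied in either order. The rectangular matrix is an illustrative choice.

```python
import numpy as np

A = np.array([[1 + 2j, 3j, 0],
              [4,      5,  6 - 1j]])   # a 2x3 complex matrix

A_H = A.conj().T                       # its 3x2 conjugate transpose
print(A_H)
assert np.allclose(A_H, A.T.conj())    # conjugation and transpose commute
```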
Skew-symmetric Matrix
In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition A^\textsf{T} = -A. In terms of the entries of the matrix, if a_{ij} denotes the entry in the i-th row and j-th column, then the skew-symmetric condition is equivalent to a_{ji} = -a_{ij}.

Example

The matrix

A = \begin{bmatrix} 0 & 2 & -45 \\ -2 & 0 & -4 \\ 45 & 4 & 0 \end{bmatrix}

is skew-symmetric because

-A = \begin{bmatrix} 0 & -2 & 45 \\ 2 & 0 & 4 \\ -45 & -4 & 0 \end{bmatrix} = A^\textsf{T}.

Properties

Throughout, we assume that all matrix entries belong to a field \mathbb{F} whose characteristic is not equal to 2. That is, we assume that 1 + 1 \neq 0, where 1 denotes the multiplicative identity and 0 the additive identity of the given field. If the characteristic of the field is 2, then a skew-symmetric matrix is the same thing as a symmetric matrix.

* The sum of two skew-symmetric matrices is skew-symmetric.
* A scalar multiple of a skew-symmetric matrix is skew-symmetric.
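A minimal sketch (assuming NumPy): verifying the skew-symmetry of the example matrix above, and checking that the sum of two skew-symmetric matrices is again skew-symmetric. The second matrix is an illustrative choice.

```python
import numpy as np

A = np.array([[  0,  2, -45],
              [ -2,  0,  -4],
              [ 45,  4,   0]])

B = np.array([[ 0,  1,  0],
              [-1,  0,  2],
              [ 0, -2,  0]])           # another skew-symmetric matrix

assert np.allclose(A.T, -A)            # the skew-symmetric condition
assert np.allclose((A + B).T, -(A + B))  # sums stay skew-symmetric
print(np.diag(A))                      # diagonal entries are all zero
```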