Exchange Matrix
In mathematics, especially linear algebra, an exchange matrix (also called the reversal matrix, backward identity, or standard involutory permutation) is a special case of a permutation matrix in which the 1 entries lie on the antidiagonal and all other entries are zero. In other words, exchange matrices are 'row-reversed' or 'column-reversed' versions of the identity matrix.

\begin{aligned}
J_2 &= \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \\
J_3 &= \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix} \\
&\;\;\vdots \\
J_n &= \begin{pmatrix} 0 & 0 & \cdots & 0 & 1 \\ 0 & 0 & \cdots & 1 & 0 \\ \vdots & \vdots & \iddots & \vdots & \vdots \\ 0 & 1 & \cdots & 0 & 0 \\ 1 & 0 & \cdots & 0 & 0 \end{pmatrix}
\end{aligned}

Definition
If J is an n × n exchange matrix, then its elements are
J_{ij} = \begin{cases} 1, & i + j = n + 1 \\ 0, & i + j \ne n + 1 \end{cases}

Properties
* Premultiplying a matrix by J_n reverses the order of its rows; postmultiplying by J_n reverses the order of its columns.
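The construction is easy to check numerically. Below is a minimal sketch (my own, not part of the article) using NumPy: it builds J_n by reversing the rows of the identity, verifies the entrywise definition, and demonstrates the row-reversal property noted above.

    import numpy as np

    def exchange_matrix(n: int) -> np.ndarray:
        """Return the n-by-n exchange matrix: the identity with its rows reversed."""
        return np.flipud(np.eye(n, dtype=int))

    J3 = exchange_matrix(3)
    print(J3)
    # [[0 0 1]
    #  [0 1 0]
    #  [1 0 0]]

    # Entrywise definition (1-based indices, as in the text):
    # J[i-1, j-1] == 1 exactly when i + j == n + 1.
    n = 3
    assert all((J3[i - 1, j - 1] == 1) == (i + j == n + 1)
               for i in range(1, n + 1) for j in range(1, n + 1))

    # Premultiplication by J reverses the rows of a matrix.
    A = np.arange(9).reshape(3, 3)
    assert np.array_equal(J3 @ A, A[::-1])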
Mathematics
Mathematics is a field of study that discovers and organizes methods, theories, and theorems that are developed and proved for the needs of empirical sciences and mathematics itself. There are many areas of mathematics, which include number theory (the study of numbers), algebra (the study of formulas and related structures), geometry (the study of shapes and the spaces that contain them), analysis (the study of continuous changes), and set theory (presently used as a foundation for all mathematics). Mathematics involves the description and manipulation of abstract objects that consist of either abstractions from nature or, in modern mathematics, purely abstract entities that are stipulated to have certain properties, called axioms. Mathematics uses pure reason to prove properties of objects, a ''proof'' consisting of a succession of applications of inference rules to already established results.
Characteristic Polynomial
In linear algebra, the characteristic polynomial of a square matrix is a polynomial that is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis (that is, the characteristic polynomial does not depend on the choice of a basis). The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero. In spectral graph theory, the characteristic polynomial of a graph is the characteristic polynomial of its adjacency matrix.

Motivation
In linear algebra, eigenvalues and eigenvectors play a fundamental role, since, given a linear transformation, an eigenvector is a vector whose direction is not changed by the transformation, and the corresponding eigenvalue is the factor by which the eigenvector is scaled.
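As a hedged illustration (mine, not from the article), the following NumPy snippet computes the characteristic polynomial of a small example matrix and checks that its coefficients carry the trace and determinant and that its roots are the eigenvalues:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    coeffs = np.poly(A)   # coefficients of det(tI - A), highest degree first
    print(coeffs)         # [ 1. -4.  3.]  i.e.  t^2 - 4t + 3

    # For a 2x2 matrix: p(t) = t^2 - trace(A) t + det(A).
    assert np.isclose(coeffs[1], -np.trace(A))
    assert np.isclose(coeffs[2], np.linalg.det(A))

    # The roots of the characteristic polynomial are the eigenvalues.
    print(np.sort(np.roots(coeffs)))      # [1. 3.]
    print(np.sort(np.linalg.eigvals(A)))  # [1. 3.]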
Bisymmetric Matrix
In mathematics, a bisymmetric matrix is a square matrix that is symmetric about both of its main diagonals. More precisely, an n × n matrix A is bisymmetric if it satisfies both A = A^\mathsf{T} (it is its own transpose) and AJ = JA, where J is the n × n exchange matrix. For example, any matrix of the form

\begin{pmatrix} a & b & c & d & e \\ b & f & g & h & d \\ c & g & i & g & c \\ d & h & g & f & b \\ e & d & c & b & a \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} \\ a_{12} & a_{22} & a_{23} & a_{24} & a_{14} \\ a_{13} & a_{23} & a_{33} & a_{23} & a_{13} \\ a_{14} & a_{24} & a_{23} & a_{22} & a_{12} \\ a_{15} & a_{14} & a_{13} & a_{12} & a_{11} \end{pmatrix}

is bisymmetric. The associated 5 \times 5 exchange matrix for this example is

J_5 = \begin{pmatrix} 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 \end{pmatrix}

Properties
* Bisymmetric matrices are precisely those matrices that are both symmetric and centrosymmetric (equivalently, both symmetric and persymmetric).
* The product of two bisymmetric matrices is a centrosymmetric matrix.
* Real-valued bisymmetric matrices are precisely those symmetric matrices whose eigenvalues remain the same aside from possible sign changes following pre- or post-multiplication by the exchange matrix.
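A small numerical check (my own sketch, instantiating the 5 × 5 template above with a, ..., i = 1, ..., 9): a matrix is bisymmetric iff it equals its transpose and commutes with the exchange matrix.

    import numpy as np

    def is_bisymmetric(A: np.ndarray) -> bool:
        n = A.shape[0]
        J = np.flipud(np.eye(n, dtype=A.dtype))  # exchange matrix
        return np.array_equal(A, A.T) and np.array_equal(A @ J, J @ A)

    a, b, c, d, e, f, g, h, i = range(1, 10)
    A = np.array([[a, b, c, d, e],
                  [b, f, g, h, d],
                  [c, g, i, g, c],
                  [d, h, g, f, b],
                  [e, d, c, b, a]])
    print(is_bisymmetric(A))  # True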
Persymmetric Matrix
In mathematics, persymmetric matrix may refer to:
1. a square matrix which is symmetric with respect to the northeast-to-southwest diagonal (anti-diagonal); or
2. a square matrix such that the values on each line perpendicular to the main diagonal are the same for a given line.
The first definition is the most common in the recent literature. The designation "Hankel matrix" is often used for matrices satisfying the property in the second definition.

Definition 1
Let A = (a_{ij}) be an n × n matrix. The first definition of ''persymmetric'' requires that a_{ij} = a_{n-j+1,\,n-i+1} for all i, j; see page 193. For example, 5 × 5 persymmetric matrices are of the form

A = \begin{pmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{14} \\ a_{31} & a_{32} & a_{33} & a_{23} & a_{13} \\ a_{41} & a_{42} & a_{32} & a_{22} & a_{12} \\ a_{51} & a_{41} & a_{31} & a_{21} & a_{11} \end{pmatrix}.

This can be equivalently expressed as AJ = JA^\mathsf{T}, where J is the exchange matrix. A third way to express this is seen by post-multiplying AJ = JA^\mathsf{T} with J on both sides, showing that A rotated 180 degrees is identical to A^\mathsf{T}: JAJ = A^\mathsf{T}.
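Here is a small sketch (my own, with a made-up 3 × 3 example) testing Definition 1 both entrywise and via the matrix identity AJ = JA^T:

    import numpy as np

    def is_persymmetric(A: np.ndarray) -> bool:
        n = A.shape[0]
        J = np.flipud(np.eye(n, dtype=A.dtype))  # exchange matrix
        return np.array_equal(A @ J, J @ A.T)

    # Equivalent entrywise test: a[i, j] == a[n-1-j, n-1-i] (0-based indices).
    def is_persymmetric_entrywise(A: np.ndarray) -> bool:
        n = A.shape[0]
        return all(A[i, j] == A[n - 1 - j, n - 1 - i]
                   for i in range(n) for j in range(n))

    A = np.array([[1, 2, 3],
                  [4, 5, 2],
                  [6, 4, 1]])
    print(is_persymmetric(A), is_persymmetric_entrywise(A))  # True True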
Centrosymmetric Matrix
In mathematics, especially in linear algebra and matrix theory, a centrosymmetric matrix is a matrix which is symmetric about its center.

Formal definition
An n × n matrix A = [A_{ij}] is centrosymmetric when its entries satisfy
A_{ij} = A_{n+1-i,\,n+1-j} \quad \text{for all } i, j \in \{1, \ldots, n\}.
Alternatively, if J denotes the n × n exchange matrix with 1 on the antidiagonal and 0 elsewhere:
J_{ij} = \begin{cases} 1, & i + j = n + 1 \\ 0, & i + j \ne n + 1 \end{cases}
then a matrix A is centrosymmetric if and only if AJ = JA.

Examples
* All 2 × 2 centrosymmetric matrices have the form
\begin{pmatrix} a & b \\ b & a \end{pmatrix}.
* All 3 × 3 centrosymmetric matrices have the form
\begin{pmatrix} a & b & c \\ d & e & d \\ c & b & a \end{pmatrix}.
* Symmetric Toeplitz matrices are centrosymmetric.

Algebraic structure and properties
* If A and B are centrosymmetric matrices over a field F, then so are A + B and cA for any c in F. Moreover, the matrix product AB is centrosymmetric.
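A minimal sketch (my own): symmetry about the center is the same as invariance under a 180-degree rotation, which in turn is the same as commuting with the exchange matrix.

    import numpy as np

    def is_centrosymmetric(A: np.ndarray) -> bool:
        # np.rot90(A, 2) rotates A by 180 degrees, which equals J @ A @ J.
        return np.array_equal(A, np.rot90(A, 2))

    # A 3x3 instance of the template above, with a=1, b=2, c=3, d=4, e=5.
    A = np.array([[1, 2, 3],
                  [4, 5, 4],
                  [3, 2, 1]])
    J = np.flipud(np.eye(3, dtype=int))
    print(is_centrosymmetric(A))         # True
    print(np.array_equal(A @ J, J @ A))  # True, the equivalent condition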
Anti-diagonal Matrix
In mathematics, an anti-diagonal matrix is a square matrix where all the entries are zero except those on the diagonal going from the lower left corner to the upper right corner (↗), known as the anti-diagonal (sometimes Harrison diagonal, secondary diagonal, trailing diagonal, minor diagonal, off diagonal, or bad diagonal).

Formal definition
An n-by-n matrix A is an anti-diagonal matrix if the (i, j)-th element a_{ij} is zero for all rows i and columns j whose indices do not sum to n + 1. Symbolically:
a_{ij} = 0 \quad \forall i, j \in \{1, \ldots, n\},\ i + j \ne n + 1.

Example
An example of an anti-diagonal matrix is
\begin{pmatrix} 0 & 0 & 0 & 0 & 2 \\ 0 & 0 & 0 & 2 & 0 \\ 0 & 0 & 5 & 0 & 0 \\ 0 & 7 & 0 & 0 & 0 \\ -1 & 0 & 0 & 0 & 0 \end{pmatrix}.
Another example is
\begin{pmatrix} 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 \end{pmatrix},
which can be used to reverse the elements of an array (as a column matrix) by multiplying on the left.
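A sketch (my own) of building anti-diagonal matrices and of the reversal property just mentioned; the helper name antidiagonal is mine, not a NumPy function:

    import numpy as np

    def antidiagonal(entries) -> np.ndarray:
        """Place the given entries on the anti-diagonal, zeros elsewhere."""
        return np.fliplr(np.diag(entries))

    D = antidiagonal([2, 2, 5, 7, -1])
    print(D)   # reproduces the first example matrix above

    J = antidiagonal([1, 1, 1, 1, 1])    # all-ones case: the exchange matrix
    x = np.array([10, 20, 30, 40, 50])
    print(J @ x)   # [50 40 30 20 10] -- the entries of x, reversed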
Permutation
In mathematics, a permutation of a set can mean one of two different things:
* an arrangement of its members in a sequence or linear order, or
* the act or process of changing the linear order of an ordered set.
An example of the first meaning is the six permutations (orderings) of the set {1, 2, 3}: written as tuples, they are (1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), and (3, 2, 1). Anagrams of a word whose letters are all different are also permutations: the letters are already ordered in the original word, and the anagram reorders them. The study of permutations of finite sets is an important topic in combinatorics and group theory. Permutations are used in almost every branch of mathematics and in many other fields of science. In computer science, they are used for analyzing sorting algorithms; in quantum physics, for describing states of particles; and in biology, for describing RNA sequences. The number of permutations of n distinct objects is n factorial, usually written as n!.
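A quick illustration (my own): enumerating the six permutations of {1, 2, 3} listed above with the Python standard library, and confirming the n! count.

    from itertools import permutations
    from math import factorial

    perms = list(permutations([1, 2, 3]))
    print(perms)
    # [(1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)]

    # The number of permutations of n distinct objects is n!.
    assert len(perms) == factorial(3) == 6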
Parity Of A Permutation
In mathematics, when ''X'' is a finite set with at least two elements, the permutations of ''X'' (i.e. the bijective functions from ''X'' to ''X'') fall into two classes of equal size: the even permutations and the odd permutations. If any total ordering of ''X'' is fixed, the parity (oddness or evenness) of a permutation \sigma of ''X'' can be defined as the parity of the number of inversions for ''σ'', i.e., of pairs of elements ''x'', ''y'' of ''X'' such that x < y and σ(x) > σ(y). The sign, signature, or signum of a permutation ''σ'' is denoted sgn(''σ'') and defined as +1 if ''σ'' is even and −1 if ''σ'' is odd. The signature defines the alternating character of the symmetric group S''n''. Another notation for the sign of a permutation is given by the more general Levi-Civita symbol (''ε''''σ''), which is defined for all maps from ''X'' to ''X'' and has value zero for non-bijective maps. The sign of a permutation can be explicitly expressed as
sgn(σ) = (−1)^{N(σ)}
where ''N''(''σ'') is the number of inversions in ''σ''.
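A small sketch (mine) computing the sign by counting inversions, exactly as in the definition above; permutations are given in one-line notation as tuples of images of (1, ..., n).

    def sign(perm) -> int:
        """Return +1 for an even permutation, -1 for an odd one."""
        inversions = sum(1
                         for i in range(len(perm))
                         for j in range(i + 1, len(perm))
                         if perm[i] > perm[j])
        return 1 if inversions % 2 == 0 else -1

    print(sign((1, 2, 3)))  # +1: identity, zero inversions
    print(sign((2, 1, 3)))  # -1: one inversion (the pair 2, 1)
    print(sign((3, 1, 2)))  # +1: two inversions (3 before 1, 3 before 2)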
Adjugate Matrix
In linear algebra, the adjugate or classical adjoint of a square matrix A, adj(A), is the transpose of its cofactor matrix. It is occasionally known as the adjunct matrix, or "adjoint", though the latter normally refers to a different concept, the adjoint operator, which for a matrix is the conjugate transpose. The product of a matrix with its adjugate gives a diagonal matrix (entries not on the main diagonal are zero) whose diagonal entries are the determinant of the original matrix:
\mathbf{A} \operatorname{adj}(\mathbf{A}) = \det(\mathbf{A})\, \mathbf{I},
where \mathbf{I} is the identity matrix of the same size as \mathbf{A}. Consequently, the multiplicative inverse of an invertible matrix can be found by dividing its adjugate by its determinant.

Definition
The adjugate of \mathbf{A} is the transpose of the cofactor matrix \mathbf{C} of \mathbf{A}:
\operatorname{adj}(\mathbf{A}) = \mathbf{C}^\mathsf{T}.
In more detail, suppose R is a (unital) commutative ring and \mathbf{A} is an n × n matrix with entries from R. The (i, j)-''minor'' of \mathbf{A}, denoted \mathbf{M}_{ij}, is the determinant of the (n − 1) × (n − 1) matrix that results from deleting row i and column j of \mathbf{A}.
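A hedged sketch (mine; NumPy has no built-in adjugate): the adjugate via direct cofactor expansion, checking the identity A adj(A) = det(A) I on a small example. This is meant only as an illustration, not an efficient implementation.

    import numpy as np

    def adjugate(A: np.ndarray) -> np.ndarray:
        n = A.shape[0]
        C = np.empty_like(A, dtype=float)   # cofactor matrix
        for i in range(n):
            for j in range(n):
                minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
                C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
        return C.T   # adjugate = transpose of the cofactor matrix

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    print(adjugate(A))   # [[ 4. -2.] [-3.  1.]]
    print(np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(2)))  # True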
Eigenvalues And Eigenvectors
In linear algebra, an eigenvector or characteristic vector is a vector that has its direction unchanged (or reversed) by a given linear transformation. More precisely, an eigenvector \mathbf v of a linear transformation T is scaled by a constant factor \lambda when the linear transformation is applied to it: T\mathbf v = \lambda \mathbf v. The corresponding eigenvalue, characteristic value, or characteristic root is the multiplying factor \lambda (possibly a negative or complex number). Geometrically, vectors are multi-dimensional quantities with magnitude and direction, often pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon which it acts. A linear transformation's eigenvectors are those vectors that are only stretched or shrunk, with neither rotation nor shear. The corresponding eigenvalue is the factor by which an eigenvector is stretched or shrunk. If the eigenvalue is negative, the eigenvector's direction is reversed.
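A minimal numerical check (my own example matrix) of the defining relation T v = λ v:

    import numpy as np

    T = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    eigenvalues, eigenvectors = np.linalg.eig(T)
    print(eigenvalues)   # e.g. [3. 1.]; the order is not guaranteed

    for k in range(len(eigenvalues)):
        v = eigenvectors[:, k]    # k-th eigenvector (a column)
        lam = eigenvalues[k]
        # Applying T only scales v by lam; the direction is unchanged.
        assert np.allclose(T @ v, lam * v)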
Modular Arithmetic
In mathematics, modular arithmetic is a system of arithmetic operations for integers, other than the usual ones from elementary arithmetic, where numbers "wrap around" when reaching a certain value, called the modulus. The modern approach to modular arithmetic was developed by Carl Friedrich Gauss in his book ''Disquisitiones Arithmeticae'', published in 1801. A familiar example of modular arithmetic is the hour hand on a 12-hour clock. If the hour hand points to 7 now, then 8 hours later it will point to 3. Ordinary addition would result in 7 + 8 = 15, but 15 reads as 3 on the clock face. This is because the hour hand makes one rotation every 12 hours and the hour number starts over when the hour hand passes 12. We say that 15 is ''congruent'' to 3 modulo 12, written 15 ≡ 3 (mod 12), so that 7 + 8 ≡ 3 (mod 12). Similarly, if one starts at 12 and waits 8 hours, the hour hand will be at 8. If one instead waited twice as long, 16 hours, the hour hand would be on 4. This can be written as 12 + 16 ≡ 4 (mod 12).
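The clock example above, in code (my own sketch): Python's % operator implements the "wrap around" of modular arithmetic directly.

    print((7 + 8) % 12)    # 3 -- 8 hours after 7 o'clock the hand reads 3
    print((12 + 8) % 12)   # 8 -- 8 hours after 12
    print((12 + 16) % 12)  # 4 -- 16 hours after 12

    # Congruence: 15 ≡ 3 (mod 12) means 15 and 3 leave the same remainder.
    assert 15 % 12 == 3 % 12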
Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as
a_1 x_1 + \cdots + a_n x_n = b,
linear maps such as
(x_1, \ldots, x_n) \mapsto a_1 x_1 + \cdots + a_n x_n,
and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes, and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to function spaces. Linear algebra is also used in most sciences and fields of engineering because it allows modeling many natural phenomena, and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, it is often used for dealing with first-order approximations.
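Since this excerpt is mostly a survey, here is only a tiny worked instance (my own) of the kind of linear equation shown above, solved with NumPy:

    import numpy as np

    # The system  2x + y = 5,  x - y = 1  in matrix form A x = b.
    A = np.array([[2.0, 1.0],
                  [1.0, -1.0]])
    b = np.array([5.0, 1.0])

    x = np.linalg.solve(A, b)
    print(x)   # [2. 1.] -- that is, x = 2, y = 1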