Exchange Matrix
In mathematics, especially linear algebra, an exchange matrix (also called the reversal matrix, backward identity, or standard involutory permutation) is a special case of a permutation matrix, where the 1 elements lie on the antidiagonal and all other elements are zero. In other words, exchange matrices are 'row-reversed' or 'column-reversed' versions of the identity matrix:
:J_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}; \quad J_3 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}; \quad J_n = \begin{pmatrix} 0 & 0 & \cdots & 0 & 0 & 1 \\ 0 & 0 & \cdots & 0 & 1 & 0 \\ 0 & 0 & \cdots & 1 & 0 & 0 \\ \vdots & \vdots & & \vdots & \vdots & \vdots \\ 0 & 1 & \cdots & 0 & 0 & 0 \\ 1 & 0 & \cdots & 0 & 0 & 0 \end{pmatrix}.
Definition
If ''J'' is an ''n'' × ''n'' exchange matrix, then the elements of ''J'' are
:J_{ij} = \begin{cases} 1, & j = n + 1 - i, \\ 0, & \text{otherwise.} \end{cases} ...
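To make the reversal behavior concrete, here is a minimal numpy sketch (illustrative only; the helper name exchange_matrix is ours, not from the source):

```python
import numpy as np

def exchange_matrix(n: int) -> np.ndarray:
    """Return the n x n exchange matrix J: ones on the antidiagonal."""
    return np.fliplr(np.eye(n, dtype=int))

J = exchange_matrix(3)
A = np.arange(9).reshape(3, 3)

# Left-multiplying by J reverses the rows of A;
# right-multiplying reverses the columns.
assert np.array_equal(J @ A, A[::-1, :])
assert np.array_equal(A @ J, A[:, ::-1])

# J is a permutation matrix and its own inverse (involutory).
assert np.array_equal(J @ J, np.eye(3, dtype=int))
```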

Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in case of abstraction from nature, some basic properties that are considered true starting points of ...

Determinant
In mathematics, the determinant is a scalar value that is a function of the entries of a square matrix. It characterizes some properties of the matrix and the linear map represented by the matrix. In particular, the determinant is nonzero if and only if the matrix is invertible and the linear map represented by the matrix is an isomorphism. The determinant of a product of matrices is the product of their determinants (the preceding property is a corollary of this one). The determinant of a matrix ''A'' is denoted det(''A''), det ''A'', or |''A''|. The determinant of a 2 × 2 matrix is
:\begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc,
and the determinant of a 3 × 3 matrix is
:\begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix} = aei + bfg + cdh - ceg - bdi - afh.
The determinant of an ''n'' × ''n'' matrix can be defined in several equivalent ways. The Leibniz formula expresses the determinant as a sum of signed products of matrix entries such that each summand is the product of ''n'' different entries, and the number of these summands is n!, the factorial of ''n'' (t ...
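As a quick check of these formulas (an illustrative numpy sketch, not part of the source article):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # the 2 x 2 exchange matrix

# 2 x 2 formula: ad - bc = 1*4 - 2*3 = -2
assert np.isclose(np.linalg.det(A), 1 * 4 - 2 * 3)

# Multiplicativity: det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))

# A matrix is invertible iff its determinant is nonzero.
singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])
print(np.linalg.det(singular))  # ~0.0: not invertible
```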

Bisymmetric Matrix
In mathematics, a bisymmetric matrix is a square matrix that is symmetric about both of its main diagonals. More precisely, an ''n'' × ''n'' matrix ''A'' is bisymmetric if it satisfies both ''A'' = ''A''^T and ''AJ'' = ''JA'', where ''J'' is the ''n'' × ''n'' exchange matrix. For example, any matrix of the form
:\begin{pmatrix} a & b & c & d & e \\ b & f & g & h & d \\ c & g & i & g & c \\ d & h & g & f & b \\ e & d & c & b & a \end{pmatrix}
is bisymmetric.
Properties
* Bisymmetric matrices are both symmetric centrosymmetric and symmetric persymmetric.
* The product of two bisymmetric matrices is a centrosymmetric matrix.
* Real-valued bisymmetric matrices are precisely those symmetric matrices whose eigenvalues remain the same aside from possible sign changes following pre- or post-multiplication by the exchange matrix.
* If ''A'' is a real bisymmetric matrix with distinct eigenvalues, then the matrices that commute ...
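The two defining conditions translate directly into a test; a minimal sketch (the function name is_bisymmetric is ours):

```python
import numpy as np

def is_bisymmetric(A: np.ndarray) -> bool:
    """Check A = A^T (symmetry) together with AJ = JA (centrosymmetry)."""
    n = A.shape[0]
    J = np.fliplr(np.eye(n))
    return np.allclose(A, A.T) and np.allclose(A @ J, J @ A)

A = np.array([[1, 2, 3],
              [2, 5, 2],
              [3, 2, 1]])
print(is_bisymmetric(A))  # True: symmetric about both diagonals
```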

Persymmetric Matrix
In mathematics, persymmetric matrix may refer to:
# a square matrix which is symmetric with respect to the northeast-to-southwest diagonal; or
# a square matrix such that the values on each line perpendicular to the main diagonal are the same for a given line.
The first definition is the most common in the recent literature. The designation "Hankel matrix" is often used for matrices satisfying the property in the second definition.
Definition 1
Let ''A'' = (''a''''ij'') be an ''n'' × ''n'' matrix. The first definition of ''persymmetric'' requires that
:a_{ij} = a_{n-j+1,\,n-i+1} for all ''i'', ''j''. See page 193.
For example, 5 × 5 persymmetric matrices are of the form
:A = \begin{pmatrix} a_1 & a_2 & a_3 & a_4 & a_5 \\ a_6 & a_7 & a_8 & a_9 & a_4 \\ a_{10} & a_{11} & a_{12} & a_8 & a_3 \\ a_{13} & a_{14} & a_{11} & a_7 & a_2 \\ a_{15} & a_{13} & a_{10} & a_6 & a_1 \end{pmatrix}.
This can be equivalently expressed as ''AJ'' = ''JA''^T, where ''J'' is the exchange matrix. A symmetric matrix is a matrix whose values are symmetric in the northwest-to-southeast diag ...
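Definition 1 and its matrix form ''AJ'' = ''JA''^T can be checked directly (illustrative sketch, using the equivalence stated above):

```python
import numpy as np

def is_persymmetric(A: np.ndarray) -> bool:
    """Check symmetry about the antidiagonal via A J == J A^T."""
    n = A.shape[0]
    J = np.fliplr(np.eye(n))
    return np.allclose(A @ J, J @ A.T)

A = np.array([[1, 2, 3],
              [4, 5, 2],
              [6, 4, 1]])
print(is_persymmetric(A))  # True: a[i, j] == a[n-1-j, n-1-i] (0-based)
```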

Centrosymmetric Matrix
In mathematics, especially in linear algebra and matrix theory, a centrosymmetric matrix is a matrix which is symmetric about its center. More precisely, an ''n'' × ''n'' matrix ''A'' = [''A''''i'',''j''] is centrosymmetric when its entries satisfy
:''A''''i'',''j'' = ''A''''n''−''i''+1,''n''−''j''+1 for ''i'', ''j'' ∊ {1, ..., ''n''}.
If ''J'' denotes the ''n'' × ''n'' exchange matrix with 1 on the antidiagonal and 0 elsewhere (that is, ''J''''i'',''n''+1−''i'' = 1; ''J''''i'',''j'' = 0 if ''j'' ≠ ''n''+1−''i''), then a matrix ''A'' is centrosymmetric if and only if ''AJ'' = ''JA''.
Examples
* All 2×2 centrosymmetric matrices have the form \begin{pmatrix} a & b \\ b & a \end{pmatrix}.
* All 3×3 centrosymmetric matrices have the form \begin{pmatrix} a & b & c \\ d & e & d \\ c & b & a \end{pmatrix}.
* Symmetric Toeplitz matrices are centrosymmetric.
Algebraic structure and properties
* If ''A'' and ''B'' are centrosymmetric matrices over a field ''F'', then so are ''A'' + ''B'' and ''cA'' for any ''c'' in '' ...
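In code, the ''AJ'' = ''JA'' criterion amounts to invariance under a 180-degree rotation (an illustrative numpy sketch):

```python
import numpy as np

def is_centrosymmetric(A: np.ndarray) -> bool:
    """Check AJ == JA for the exchange matrix J."""
    n = A.shape[0]
    J = np.fliplr(np.eye(n))
    return np.allclose(A @ J, J @ A)

A = np.array([[1, 2, 3],
              [4, 5, 4],
              [3, 2, 1]])
print(is_centrosymmetric(A))           # True
print(np.allclose(A, np.rot90(A, 2)))  # equivalently: A equals its own 180-degree rotation
```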

Anti-diagonal Matrix
In mathematics, an anti-diagonal matrix is a square matrix where all the entries are zero except those on the diagonal going from the lower left corner to the upper right corner (↗), known as the anti-diagonal (sometimes Harrison diagonal, secondary diagonal, trailing diagonal, minor diagonal, or bad diagonal).
Formal definition
An ''n''-by-''n'' matrix ''A'' is an anti-diagonal matrix if the (''i'', ''j'') element is zero for all ''i'', ''j'' ∈ {1, …, ''n''} with ''i'' + ''j'' ≠ ''n'' + 1.
Example
An example of an anti-diagonal matrix is
:\begin{pmatrix} 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 2 & 0 \\ 0 & 0 & 5 & 0 & 0 \\ 0 & 7 & 0 & 0 & 0 \\ -1 & 0 & 0 & 0 & 0 \end{pmatrix}.
Properties
All anti-diagonal matrices are also persymmetric. The product of two anti-diagonal matrices is a diagonal matrix. Furthermore, the product of an anti-diagonal matrix with a diagonal matrix is anti-diagonal, as is the product of a diagonal matrix with an anti-diagonal matrix. An anti-diagonal matrix is invertible if and only if the entries on ...
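The product property is easy to see numerically (illustrative sketch only):

```python
import numpy as np

# Build anti-diagonal matrices by reversing the columns of diagonal ones.
A = np.fliplr(np.diag([1.0, 2.0, 3.0]))
B = np.fliplr(np.diag([4.0, 5.0, 6.0]))

P = A @ B
print(P)
# [[ 6.  0.  0.]
#  [ 0. 10.  0.]
#  [ 0.  0. 12.]]

# The product of two anti-diagonal matrices is diagonal, as stated above.
assert np.allclose(P, np.diag(np.diag(P)))
```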

Adjugate Matrix
In linear algebra, the adjugate or classical adjoint of a square matrix ''A'' is the transpose of its cofactor matrix and is denoted by adj(''A''). It is also occasionally known as adjunct matrix, or "adjoint", though the latter today normally refers to a different concept, the adjoint operator which is the conjugate transpose of the matrix. The product of a matrix with its adjugate gives a diagonal matrix (entries not on the main diagonal are zero) whose diagonal entries are the determinant of the original matrix:
:\mathbf{A} \operatorname{adj}(\mathbf{A}) = \det(\mathbf{A}) \mathbf{I},
where \mathbf{I} is the identity matrix of the same size as \mathbf{A}. Consequently, the multiplicative inverse of an invertible matrix can be found by dividing its adjugate by its determinant.
Definition
The adjugate of \mathbf{A} is the transpose of the cofactor matrix \mathbf{C} of \mathbf{A},
:\operatorname{adj}(\mathbf{A}) = \mathbf{C}^\mathsf{T}.
In more detail, suppose ''R'' is a unital commutative ring and \mathbf{A} is an ''n'' × ''n'' matrix with entries from ''R''. The (''i'', ''j'')-''minor'' of \mathbf{A}, denoted ''M''''ij'', is the determ ...
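The cofactor construction can be written out directly for small matrices (a naive sketch for illustration, not an efficient method):

```python
import numpy as np

def adjugate(A: np.ndarray) -> np.ndarray:
    """Adjugate as the transpose of the cofactor matrix."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # (i, j)-minor: delete row i and column j, take the determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# A adj(A) = det(A) I, so adj(A)/det(A) is the inverse when det(A) != 0.
assert np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(2))
```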

Characteristic Polynomial
In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis (that is, the characteristic polynomial does not depend on the choice of a basis). The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero. In spectral graph theory, the characteristic polynomial of a graph is the characteristic polynomial of its adjacency matrix.
Motivation
In linear algebra, eigenvalues and eigenvectors play a fundamental role, since, given a linear transformation, an eigenvector is a vector whose direction is not changed by the transformation, and the corresponding eigenva ...
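numpy can recover the characteristic polynomial and confirm that its roots are the eigenvalues (illustrative sketch):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly on a square matrix returns the coefficients of det(tI - A);
# for a 2 x 2 matrix that is t^2 - tr(A) t + det(A).
coeffs = np.poly(A)
print(coeffs)                # [ 1. -4.  3.]  i.e.  t^2 - 4t + 3

# Its roots are exactly the eigenvalues (order may differ).
print(np.roots(coeffs))      # 3 and 1
print(np.linalg.eigvals(A))  # 3 and 1
```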

Modular Arithmetic
In mathematics, modular arithmetic is a system of arithmetic for integers, where numbers "wrap around" when reaching a certain value, called the modulus. The modern approach to modular arithmetic was developed by Carl Friedrich Gauss in his book ''Disquisitiones Arithmeticae'', published in 1801. A familiar use of modular arithmetic is in the 12-hour clock, in which the day is divided into two 12-hour periods. If the time is 7:00 now, then 8 hours later it will be 3:00. Simple addition would result in 7 + 8 = 15, but clocks "wrap around" every 12 hours. Because the hour number starts over at zero when it reaches 12, this is arithmetic ''modulo'' 12. In terms of the definition below, 15 is ''congruent'' to 3 modulo 12, so "15:00" on a 24-hour clock is displayed "3:00" on a 12-hour clock.
Congruence
Given an integer ''n'' > 1, called a modulus, two integers ''a'' and ''b'' are said to be congruent modulo ''n'', if ''n'' is a divisor of their difference (that is, if there is an integer ''k'' such that ''a'' − ''b'' = ''kn''). Congruence modulo ...
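The clock example is exactly the % operator (illustrative sketch):

```python
# 12-hour clock arithmetic: hours wrap around modulo 12.
now = 7
later = (now + 8) % 12
print(later)  # 3 -> 8 hours after 7:00 is 3:00

# Congruence test: a is congruent to b modulo n iff n divides a - b.
a, b, n = 15, 3, 12
print((a - b) % n == 0)  # True: 15 is congruent to 3 modulo 12
```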

Trace (linear Algebra)
In linear algebra, the trace of a square matrix \mathbf{A}, denoted \operatorname{tr}(\mathbf{A}), is defined to be the sum of elements on the main diagonal (from the upper left to the lower right) of \mathbf{A}. The trace is only defined for a square matrix (''n'' × ''n''). It can be proved that the trace of a matrix is the sum of its (complex) eigenvalues (counted with multiplicities). It can also be proved that \operatorname{tr}(\mathbf{AB}) = \operatorname{tr}(\mathbf{BA}) for any two matrices \mathbf{A} and \mathbf{B}. This implies that similar matrices have the same trace. As a consequence one can define the trace of a linear operator mapping a finite-dimensional vector space into itself, since all matrices describing such an operator with respect to a basis are similar. The trace is related to the derivative of the determinant (see Jacobi's formula).
Definition
The trace of an ''n'' × ''n'' square matrix \mathbf{A} is defined as
:\operatorname{tr}(\mathbf{A}) = \sum_{i=1}^n a_{ii} = a_{11} + a_{22} + \dots + a_{nn},
where ''a''''ii'' denotes the entry on the ''i''th row and ''i''th column of \mathbf{A}. The entries of \mathbf{A} can be real numbers or (more generally) complex numbers. The trace is not de ...
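Both the definition and the tr(''AB'') = tr(''BA'') identity are one-liners in numpy (illustrative sketch):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

print(np.trace(A))  # 5.0 = 1 + 4, the sum of the diagonal entries

# tr(AB) = tr(BA); this is the property behind "similar matrices
# have the same trace".
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```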

Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as
:a_1x_1 + \cdots + a_nx_n = b,
linear maps such as
:(x_1, \ldots, x_n) \mapsto a_1x_1 + \cdots + a_nx_n,
and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to spaces of functions. Linear algebra is also used in most sciences and fields of engineering, because it allows modeling many natural phenomena, and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, it is often used for dealing with first-order approximations, using the fact that the differential of a multivariate function at a point is the linear ma ...
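A system of such linear equations is solved in one call (illustrative sketch):

```python
import numpy as np

# Solve  2*x1 + 1*x2 = 3
#        1*x1 + 3*x2 = 5
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)
print(x)                      # [0.8 1.4]
assert np.allclose(A @ x, b)  # the solution satisfies every equation
```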

Involutory Matrix
In mathematics, an involutory matrix is a square matrix that is its own inverse. That is, multiplication by the matrix \mathbf{A} is an involution if and only if \mathbf{A}^2 = \mathbf{I}, where \mathbf{I} is the ''n'' × ''n'' identity matrix. Involutory matrices are all square roots of the identity matrix. This is simply a consequence of the fact that any nonsingular matrix multiplied by its inverse is the identity.
Examples
The 2 × 2 real matrix \begin{pmatrix} a & b \\ c & -a \end{pmatrix} is involutory provided that a^2 + bc = 1. The Pauli matrices in M(2, C) are involutory:
:\begin{align} \sigma_1 = \sigma_x &= \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \\ \sigma_2 = \sigma_y &= \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \\ \sigma_3 = \sigma_z &= \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}. \end{align}
One of the three classes of elementary matrix is involutory, namely the row-interchange elementary matrix. A special case of another class of elementary matrix, that which repre ...
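The a^2 + bc = 1 condition is easy to verify numerically (illustrative sketch; the particular values a = 2, b = 3, c = -1 are ours):

```python
import numpy as np

# [[a, b], [c, -a]] with a^2 + bc = 1:  a = 2, b = 3, c = -1 gives 4 - 3 = 1.
A = np.array([[2.0, 3.0],
              [-1.0, -2.0]])
assert np.allclose(A @ A, np.eye(2))  # A is its own inverse

# The exchange matrix is a row-interchange elementary matrix,
# hence involutory as well.
J = np.fliplr(np.eye(2))
assert np.allclose(J @ J, np.eye(2))
```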