Anti-diagonal Matrix
In mathematics, an anti-diagonal matrix is a square matrix where all the entries are zero except those on the diagonal going from the lower left corner to the upper right corner (↗), known as the anti-diagonal (sometimes Harrison diagonal, secondary diagonal, trailing diagonal, minor diagonal, or bad diagonal).

Formal definition

An ''n''-by-''n'' matrix ''A'' is an anti-diagonal matrix if the (''i'', ''j'') element is zero \forall i, j \in \{1, \dots, n\} with i + j \ne n + 1.

Example

An example of an anti-diagonal matrix is

:\begin{pmatrix} 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 2 & 0 \\ 0 & 0 & 5 & 0 & 0 \\ 0 & 7 & 0 & 0 & 0 \\ -1 & 0 & 0 & 0 & 0 \end{pmatrix}.

Properties

All anti-diagonal matrices are also persymmetric. The product of two anti-diagonal matrices is a diagonal matrix. Furthermore, the product of an anti-diagonal matrix with a diagonal matrix is anti-diagonal, as is the product of a diagonal matrix with an anti-diagonal matrix. An anti-diagonal matrix is invertible if and only if the entries on ...
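To make the example and the product property concrete, here is a minimal Python sketch (assuming NumPy is available; the first matrix uses the values from the example above, the second anti-diagonal matrix is an arbitrary choice for illustration):

```python
# A minimal sketch, assuming NumPy: np.fliplr reverses the column order,
# so flipping a diagonal matrix left-right yields an anti-diagonal one.
import numpy as np

# The 5x5 example from the text.
A = np.fliplr(np.diag([1, 2, 5, 7, -1]))
print(A)

# Property check: the product of two anti-diagonal matrices is diagonal.
B = np.fliplr(np.diag([3, 4, 6, 8, 9]))  # arbitrary second anti-diagonal matrix
P = A @ B
assert np.count_nonzero(P - np.diag(np.diag(P))) == 0  # no off-diagonal entries
```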


Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in the case of abstraction from nature, some basic properties that are considered true starting points of ...


Square Matrix
In mathematics, a square matrix is a matrix with the same number of rows and columns. An ''n''-by-''n'' matrix is known as a square matrix of order ''n''. Any two square matrices of the same order can be added and multiplied. Square matrices are often used to represent simple linear transformations, such as shearing or rotation. For example, if R is a square matrix representing a rotation (rotation matrix) and \mathbf{v} is a column vector describing the position of a point in space, the product R\mathbf{v} yields another column vector describing the position of that point after that rotation. If \mathbf{v} is a row vector, the same transformation can be obtained using \mathbf{v}R^\mathsf{T}, where R^\mathsf{T} is the transpose of R.

Main diagonal

The entries a_{ii} (''i'' = 1, …, ''n'') form the main diagonal of a square matrix. They lie on the imaginary line which runs from the top left corner to the bottom right corner of the matrix. For instance, the main diagonal of a 4×4 matrix contains the elements a_{11}, a_{22}, a_{33}, a_{44}. The d ...
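As a sketch of the rotation example (assuming NumPy; the 90-degree angle and the test vector are arbitrary choices, not from the original):

```python
# A minimal sketch, assuming NumPy: a 2x2 rotation matrix R acting on a
# column vector v, and the equivalent row-vector form v R^T.
import numpy as np

theta = np.pi / 2  # rotate 90 degrees counterclockwise (arbitrary choice)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])   # a point on the x-axis
print(R @ v)               # column-vector form: approximately [0, 1]
print(v @ R.T)             # row-vector form gives the same point
print(np.diag(R))          # the main diagonal entries of R
```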


Persymmetric Matrix
In mathematics, persymmetric matrix may refer to:
# a square matrix which is symmetric with respect to the northeast-to-southwest diagonal; or
# a square matrix such that the values on each line perpendicular to the main diagonal are the same for a given line.

The first definition is the most common in the recent literature. The designation "Hankel matrix" is often used for matrices satisfying the property in the second definition.

Definition 1

Let ''A'' = (a_{ij}) be an ''n'' × ''n'' matrix. The first definition of ''persymmetric'' requires that

:a_{ij} = a_{n-j+1,\,n-i+1} for all ''i'', ''j''. See page 193.

For example, 5 × 5 persymmetric matrices are of the form

:A = \begin{pmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{14} \\ a_{31} & a_{32} & a_{33} & a_{23} & a_{13} \\ a_{41} & a_{42} & a_{32} & a_{22} & a_{12} \\ a_{51} & a_{41} & a_{31} & a_{21} & a_{11} \end{pmatrix}.

This can be equivalently expressed as ''AJ'' = ''JA''^T where ''J'' is the exchange matrix. A symmetric matrix is a matrix whose values are symmetric in the northwest-to-southeast diag ...
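A minimal Python sketch of Definition 1, using the equivalent condition AJ = JA^T (assuming NumPy; the example matrices are arbitrary illustrations):

```python
# A minimal sketch, assuming NumPy: test Definition 1 via the equivalent
# condition A J == J A^T, where J is the exchange matrix.
import numpy as np

def is_persymmetric(A: np.ndarray) -> bool:
    n = A.shape[0]
    J = np.fliplr(np.eye(n))  # exchange matrix: ones on the antidiagonal
    return np.array_equal(A @ J, J @ A.T)

# Example values chosen to be symmetric about the antidiagonal.
A = np.array([[1, 2, 3],
              [4, 5, 2],
              [6, 4, 1]])
print(is_persymmetric(A))                            # True
print(is_persymmetric(np.arange(9).reshape(3, 3)))   # False
```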


Diagonal Matrix
In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is \begin{pmatrix} 3 & 0 \\ 0 & 2 \end{pmatrix}, while an example of a 3×3 diagonal matrix is \begin{pmatrix} 6 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}. An identity matrix of any size, or any multiple of it (a scalar matrix), is a diagonal matrix. A diagonal matrix is sometimes called a scaling matrix, since matrix multiplication with it results in changing scale (size). Its determinant is the product of its diagonal values.

Definition

As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix D = (d_{i,j}) with ''n'' columns and ''n'' rows is diagonal if

:\forall i, j \in \{1, 2, \ldots, n\},\ i \ne j \implies d_{i,j} = 0.

However, the main diagonal entries are unrestricted. The term ''diagonal matrix'' may s ...
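A short sketch of the determinant and scaling claims (assuming NumPy; the diagonal entries are arbitrary illustrative values):

```python
# A minimal sketch, assuming NumPy: the determinant of a diagonal matrix
# equals the product of its diagonal entries, and multiplying by it scales.
import numpy as np

D = np.diag([3.0, 2.0, 4.0])   # arbitrary diagonal entries for illustration
print(np.linalg.det(D))        # approximately 24.0 (floating-point rounding)
print(np.prod(np.diag(D)))     # 24.0

v = np.ones(3)
print(D @ v)                   # each component scaled: [3. 2. 4.]
```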


Invertible Matrix
In linear algebra, an ''n''-by-''n'' square matrix ''A'' is called invertible (also nonsingular or nondegenerate) if there exists an ''n''-by-''n'' square matrix ''B'' such that

:\mathbf{AB} = \mathbf{BA} = \mathbf{I}_n,

where \mathbf{I}_n denotes the ''n''-by-''n'' identity matrix and the multiplication used is ordinary matrix multiplication. If this is the case, then the matrix ''B'' is uniquely determined by ''A'', and is called the (multiplicative) ''inverse'' of ''A'', denoted by A^{-1}. Matrix inversion is the process of finding the matrix that satisfies the prior equation for a given invertible matrix ''A''. A square matrix that is ''not'' invertible is called singular or degenerate. A square matrix is singular if and only if its determinant is zero. Singular matrices are rare in the sense that if a square matrix's entries are randomly selected from any finite region on the number line or complex plane, the probability that the matrix is singular is 0, that is, it will "almost never" be singular. Non-square matrices (''m''-by-''n'' matrices for which ''m'' ≠ ''n'') do not hav ...
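A minimal sketch of the defining equation and the determinant test (assuming NumPy; the example matrices are arbitrary choices):

```python
# A minimal sketch, assuming NumPy: computing an inverse, verifying
# A B = B A = I, and the determinant test for singularity.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])     # arbitrary invertible example
B = np.linalg.inv(A)
I = np.eye(2)
print(np.allclose(A @ B, I) and np.allclose(B @ A, I))  # True

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])     # rows proportional, so det = 0
print(np.isclose(np.linalg.det(S), 0.0))  # True: S is singular
```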


Determinant
In mathematics, the determinant is a scalar value that is a function of the entries of a square matrix. It characterizes some properties of the matrix and the linear map represented by the matrix. In particular, the determinant is nonzero if and only if the matrix is invertible and the linear map represented by the matrix is an isomorphism. The determinant of a product of matrices is the product of their determinants (the preceding property is a corollary of this one). The determinant of a matrix ''A'' is denoted det(''A''), det ''A'', or |''A''|. The determinant of a 2 × 2 matrix is

:\begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc,

and the determinant of a 3 × 3 matrix is

:\begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix} = aei + bfg + cdh - ceg - bdi - afh.

The determinant of an ''n'' × ''n'' matrix can be defined in several equivalent ways. The Leibniz formula expresses the determinant as a sum of signed products of matrix entries such that each summand is the product of ''n'' different entries, and the number of these summands is n!, the factorial of ''n'' (t ...
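The explicit 2 × 2 and 3 × 3 formulas above can be checked directly (a sketch assuming NumPy; the test matrices are arbitrary choices):

```python
# A minimal sketch comparing the explicit 2x2 and 3x3 determinant
# formulas above with numpy.linalg.det.
import numpy as np

def det2(m):
    (a, b), (c, d) = m
    return a * d - b * c

def det3(m):
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

M2 = [[1.0, 2.0], [3.0, 4.0]]
M3 = [[2.0, 0.0, 1.0], [1.0, 3.0, 2.0], [1.0, 1.0, 4.0]]
print(det2(M2), np.linalg.det(np.array(M2)))  # both approximately -2.0
print(det3(M3), np.linalg.det(np.array(M3)))  # both approximately 18.0
```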


Absolute Value
In mathematics, the absolute value or modulus of a real number x, denoted |x|, is the non-negative value of x without regard to its sign. Namely, |x| = x if x is a positive number, and |x| = -x if x is negative (in which case negating x makes -x positive), and |0| = 0. For example, the absolute value of 3 is 3, and the absolute value of −3 is also 3. The absolute value of a number may be thought of as its distance from zero. Generalisations of the absolute value for real numbers occur in a wide variety of mathematical settings. For example, an absolute value is also defined for the complex numbers, the quaternions, ordered rings, fields and vector spaces. The absolute value is closely related to the notions of magnitude, distance, and norm in various mathematical and physical contexts.

Terminology and notation

In 1806, Jean-Robert Argand introduced the term ''module'', meaning ''unit of measure'' in French, specifically for the ''complex'' absolute value (Oxford English Dictionary, Draft Revision, June 2008), an ...
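The piecewise definition translates directly into code (a trivial sketch; the test values are arbitrary):

```python
# A minimal sketch of the piecewise definition, checked against the
# built-in abs.
def absolute_value(x: float) -> float:
    return x if x >= 0 else -x   # -x is positive when x is negative

print(absolute_value(3), absolute_value(-3))  # 3 3: same distance from zero
assert all(absolute_value(x) == abs(x) for x in (-3, 0, 3, -2.5))
```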




Product (mathematics)
In mathematics, a product is the result of multiplication, or an expression that identifies objects (numbers or variables) to be multiplied, called ''factors''. For example, 30 is the product of 6 and 5 (the result of multiplication), and x\cdot (2+x) is the product of x and (2+x) (indicating that the two factors should be multiplied together). The order in which real or complex numbers are multiplied has no bearing on the product; this is known as the ''commutative law'' of multiplication. When matrices or members of various other associative algebras are multiplied, the product usually depends on the order of the factors. Matrix multiplication, for example, is non-commutative, as is multiplication in many other algebras. There are many different kinds of products in mathematics: besides being able to multiply just numbers, polynomials or matrices, one can also define products on many different algebraic structures.

Product of two numbers

Product of a seque ...
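The contrast between commutative and non-commutative products is easy to demonstrate (a sketch assuming NumPy; the matrices are arbitrary examples):

```python
# A minimal sketch, assuming NumPy: numbers commute under multiplication,
# matrices generally do not.
import numpy as np

print(6 * 5 == 5 * 6)                 # True: commutative law for numbers

A = np.array([[1, 2], [0, 1]])
B = np.array([[1, 0], [3, 1]])
print(np.array_equal(A @ B, B @ A))   # False: matrix product depends on order
```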


Permutation
In mathematics, a permutation of a set is, loosely speaking, an arrangement of its members into a sequence or linear order, or if the set is already ordered, a rearrangement of its elements. The word "permutation" also refers to the act or process of changing the linear order of an ordered set. Permutations differ from combinations, which are selections of some members of a set regardless of order. For example, written as tuples, there are six permutations of the set {1, 2, 3}, namely (1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), and (3, 2, 1). These are all the possible orderings of this three-element set. Anagrams of words whose letters are different are also permutations: the letters are already ordered in the original word, and the anagram is a reordering of the letters. The study of permutations of finite sets is an important topic in the fields of combinatorics and group theory. Permutations are used in almost every branch of mathematics, and in many other fields of scie ...
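The six orderings listed above can be reproduced with the standard library (a minimal sketch):

```python
# A minimal sketch: enumerating the six permutations of {1, 2, 3}
# listed in the text.
from itertools import permutations

for p in permutations([1, 2, 3]):
    print(p)
# (1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)
```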


Triangular Number
A triangular number or triangle number counts objects arranged in an equilateral triangle. Triangular numbers are a type of figurate number, other examples being square numbers and cube numbers. The ''n''th triangular number is the number of dots in the triangular arrangement with ''n'' dots on each side, and is equal to the sum of the natural numbers from 1 to ''n''. The sequence of triangular numbers, starting with the 0th triangular number, is 0, 1, 3, 6, 10, 15, 21, 28, 36, 45, 55, … (This sequence is included in the On-Line Encyclopedia of Integer Sequences.)

Formula

The triangular numbers are given by the following explicit formulas:

:T_n = \sum_{k=1}^n k = 1 + 2 + 3 + \dotsb + n = \frac{n(n+1)}{2} = \binom{n+1}{2},

where \binom{n+1}{2} is a binomial coefficient. It represents the number of distinct pairs that can be selected from ''n'' + 1 objects, and it is read aloud as "''n'' plus one choose two". The first equation can be illustrated using a visual proof. For every triangular number T_n, imagine a "half-square" arrangement of objects corresponding to the triangular numb ...
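The three equal expressions in the formula can be verified mechanically (a minimal sketch using only the standard library):

```python
# A minimal sketch checking the closed form n(n+1)/2 against the direct
# sum 1 + 2 + ... + n and the binomial coefficient C(n+1, 2).
from math import comb

def triangular(n: int) -> int:
    return n * (n + 1) // 2

for n in range(10):
    assert triangular(n) == sum(range(1, n + 1)) == comb(n + 1, 2)

print([triangular(n) for n in range(11)])
# [0, 1, 3, 6, 10, 15, 21, 28, 36, 45, 55]
```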


Main Diagonal
In linear algebra, the main diagonal (sometimes principal diagonal, primary diagonal, leading diagonal, major diagonal, or good diagonal) of a matrix A is the list of entries a_{i,j} where i = j. All off-diagonal elements are zero in a diagonal matrix. The following four matrices have their main diagonals indicated by red ones:

:\begin{pmatrix} \color{red}{1} & 0 & 0 \\ 0 & \color{red}{1} & 0 \\ 0 & 0 & \color{red}{1} \end{pmatrix} \qquad \begin{pmatrix} \color{red}{1} & 0 & 0 & 0 \\ 0 & \color{red}{1} & 0 & 0 \\ 0 & 0 & \color{red}{1} & 0 \end{pmatrix} \qquad \begin{pmatrix} \color{red}{1} & 0 & 0 \\ 0 & \color{red}{1} & 0 \\ 0 & 0 & \color{red}{1} \\ 0 & 0 & 0 \end{pmatrix} \qquad \begin{pmatrix} \color{red}{1} & 0 & 0 & 0 \\ 0 & \color{red}{1} & 0 & 0 \\ 0 & 0 & \color{red}{1} & 0 \\ 0 & 0 & 0 & \color{red}{1} \end{pmatrix}

Antidiagonal

The antidiagonal (sometimes counter diagonal, secondary diagonal, trailing diagonal, minor diagonal, off diagonal, or bad diagonal) of an order ''N'' square matrix ''B'' is the collection of entries b_{i,j} such that i + j = N + 1 for all 1 \leq i, j \leq N. That is, it runs from the top right corner to the bottom left corner. ...
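Both diagonals are easy to extract in code (a sketch assuming NumPy; the 3×3 values are an arbitrary example):

```python
# A minimal sketch, assuming NumPy: extracting the main diagonal and the
# antidiagonal of a square matrix.
import numpy as np

B = np.arange(1, 10).reshape(3, 3)
print(B)
print(np.diag(B))             # main diagonal (i == j): [1 5 9]
print(np.diag(np.fliplr(B)))  # antidiagonal (i + j == N + 1): [3 5 7]
```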


Exchange Matrix
In mathematics, especially linear algebra, the exchange matrices (also called the reversal matrix, backward identity, or standard involutory permutation) are special cases of permutation matrices, where the 1 elements reside on the antidiagonal and all other elements are zero. In other words, they are 'row-reversed' or 'column-reversed' versions of the identity matrix.

:J_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}; \quad J_3 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}; \quad J_n = \begin{pmatrix} 0 & 0 & \cdots & 0 & 0 & 1 \\ 0 & 0 & \cdots & 0 & 1 & 0 \\ 0 & 0 & \cdots & 1 & 0 & 0 \\ \vdots & \vdots & & \vdots & \vdots & \vdots \\ 0 & 1 & \cdots & 0 & 0 & 0 \\ 1 & 0 & \cdots & 0 & 0 & 0 \end{pmatrix}.

Definition

If ''J'' is an ''n'' × ''n'' exchange matrix, then the elements of ''J'' are

:J_{i,j} = \begin{cases} 1, ...
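The row-reversal and column-reversal behaviour can be checked directly (a sketch assuming NumPy; n = 4 and the test matrix are arbitrary choices):

```python
# A minimal sketch, assuming NumPy: the exchange matrix J reverses rows
# when applied on the left and columns when applied on the right.
import numpy as np

n = 4
J = np.fliplr(np.eye(n, dtype=int))          # ones on the antidiagonal
A = np.arange(n * n).reshape(n, n)           # arbitrary test matrix

print(np.array_equal(J @ A, A[::-1]))        # True: rows reversed
print(np.array_equal(A @ J, A[:, ::-1]))     # True: columns reversed
assert np.array_equal(J @ J, np.eye(n, dtype=int))  # J is involutory
```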