Secular Equation
In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis (that is, the characteristic polynomial does not depend on the choice of a basis). The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero. In spectral graph theory, the characteristic polynomial of a graph is the characteristic polynomial of its adjacency matrix.
Motivation
In linear algebra, eigenvalues and eigenvectors play a fundamental role, since, given a linear transformation, an eigenvector is a vector whose direction is not changed by the transformation, and the corresponding eigen ...
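As a quick numerical illustration of these facts (the 2-by-2 matrix below is chosen arbitrarily and NumPy is assumed), the coefficients of the characteristic polynomial contain the trace and determinant, and its roots are the eigenvalues:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Coefficients of det(lambda*I - A), highest degree first.
# For a 2x2 matrix these are [1, -tr(A), det(A)].
coeffs = np.poly(A)
print(coeffs)                           # [ 1. -5.  5.]
print(np.trace(A), np.linalg.det(A))    # 5.0  5.0 (up to rounding)

# The eigenvalues of A are exactly the roots of the characteristic polynomial.
print(np.sort(np.roots(coeffs)))
print(np.sort(np.linalg.eigvals(A)))
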
Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as a_1x_1+\cdots+a_nx_n=b, linear maps such as (x_1, \ldots, x_n) \mapsto a_1x_1+\cdots+a_nx_n, and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, it is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to spaces of functions. Linear algebra is also used in most sciences and fields of engineering, because it allows modeling many natural phenomena and computing efficiently with such models. For nonlinear systems, which cannot be modeled directly with linear algebra, it is often used for dealing with first-order approximations, using the fact that the differential of a multivariate function at a point is the linear ma ...
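A minimal NumPy sketch of these objects (the coefficients and the small system below are made up for illustration): a linear map applied to a vector, and a system of linear equations solved through its matrix representation.

import numpy as np

# The linear map (x1, x2, x3) -> a1*x1 + a2*x2 + a3*x3, represented by its coefficients.
a = np.array([2.0, -1.0, 4.0])
x = np.array([1.0, 3.0, 0.5])
print(a @ x)                   # 2*1 - 1*3 + 4*0.5 = 1.0

# A system of linear equations A x = b, written with a matrix and solved directly.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(np.linalg.solve(A, b))   # the unique solution, since det(A) != 0
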
Singular Matrix
In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular or nondegenerate) if there exists an n-by-n square matrix B such that \mathbf{AB} = \mathbf{BA} = \mathbf{I}_n, where \mathbf{I}_n denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication. If this is the case, then the matrix B is uniquely determined by A, and is called the (multiplicative) inverse of A, denoted by A^{-1}. Matrix inversion is the process of finding the matrix that satisfies the prior equation for a given invertible matrix A. A square matrix that is not invertible is called singular or degenerate. A square matrix is singular if and only if its determinant is zero. Singular matrices are rare in the sense that if a square matrix's entries are randomly selected from any finite region on the number line or complex plane, the probability that the matrix is singular is 0, that is, it will "almost never" be singular. Non-square matrices (m-by-n matrices for which m \neq n) do not hav ...
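The determinant criterion is easy to check numerically; in this sketch (matrices chosen for illustration, NumPy assumed) a matrix with proportional rows has determinant zero and no inverse, while a nonsingular one satisfies B B^{-1} = I_2 up to rounding:

import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # rows are proportional, so A is singular
print(np.linalg.det(A))      # 0.0 (up to rounding): no inverse exists

B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B_inv = np.linalg.inv(B)     # exists because det(B) = -2 != 0
print(B @ B_inv)             # approximately the 2x2 identity matrix I_2
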
Characteristic (algebra)
In mathematics, the characteristic of a ring R, often denoted \operatorname{char}(R), is defined to be the smallest number of times one must use the ring's multiplicative identity (1) in a sum to get the additive identity (0). If this sum never reaches the additive identity, the ring is said to have characteristic zero. That is, \operatorname{char}(R) is the smallest positive number n such that \underbrace{1+\cdots+1}_{n \text{ summands}} = 0 if such a number n exists, and 0 otherwise.
Motivation
The special definition of the characteristic zero is motivated by the equivalent definitions characterized in the next section, where the characteristic zero is not required to be considered separately. The characteristic may also be taken to be the exponent of the ring's additive group, that is, the smallest positive integer n such that \underbrace{a+\cdots+a}_{n \text{ summands}} = 0 for every element a of the ring (again, if n exists; otherwise zero). Some authors do not include the multiplicative identity element in their r ...
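A small illustrative sketch (plain Python; the finite rings Z/nZ are chosen as examples, not taken from the source): the characteristic of Z/nZ is found by adding the identity 1 to itself until the sum reaches 0. For a ring such as Z the sum never does, which is exactly the characteristic-zero case, so this loop applies only to finite quotients.

def characteristic_mod(n: int) -> int:
    """Characteristic of the ring Z/nZ: smallest k > 0 with k*1 == 0 (mod n)."""
    total, k = 0, 0
    while True:
        k += 1
        total = (total + 1) % n   # add the multiplicative identity once more
        if total == 0:
            return k

print(characteristic_mod(6))   # 6
print(characteristic_mod(7))   # 7: a field of prime characteristic
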
Faddeev–LeVerrier Algorithm
In mathematics (linear algebra), the Faddeev–LeVerrier algorithm is a recursive method to calculate the coefficients of the characteristic polynomial p_A(\lambda)=\det(\lambda I_n - A) of a square matrix A, named after Dmitry Konstantinovich Faddeev and Urbain Le Verrier. Calculation of this polynomial yields the eigenvalues of A as its roots; as a matrix polynomial in the matrix A itself, it vanishes by the fundamental Cayley–Hamilton theorem. Computing the determinant directly from the definition of the characteristic polynomial, however, is computationally cumbersome, because \lambda is a new symbolic quantity, whereas this algorithm works directly with the coefficients of the matrix A. The algorithm has been independently rediscovered several times, in some form or another. It was first published in 1840 by Urbain Le Verrier, subsequently redeveloped by P. Horst and Jean-Marie Souriau, in its present form here by Faddeev and Sominsky, and further by J. S. Frame, and others. (For historical point ...
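A short Python sketch of one common formulation of the recursion, M_k = A M_{k-1} + c_{n-k+1} I with c_{n-k} = -tr(A M_k)/k (the test matrix is arbitrary; NumPy is used only for the matrix arithmetic and for an independent check via np.poly):

import numpy as np

def faddeev_leverrier(A):
    """Coefficients [1, c_{n-1}, ..., c_0] of det(lambda*I - A) via Faddeev-LeVerrier."""
    n = A.shape[0]
    I = np.eye(n)
    M = np.zeros_like(A, dtype=float)   # M_0 = 0
    coeffs = [1.0]                      # leading coefficient of the monic polynomial
    c = 1.0
    for k in range(1, n + 1):
        M = A @ M + c * I               # M_k = A M_{k-1} + c_{n-k+1} I
        c = -np.trace(A @ M) / k        # c_{n-k} = -tr(A M_k) / k
        coeffs.append(c)
    return np.array(coeffs)

A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
print(faddeev_leverrier(A))
print(np.poly(A))   # NumPy's own characteristic-polynomial coefficients agree
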
Principal Minor
In linear algebra, a minor of a matrix A is the determinant of some smaller square matrix, cut down from A by removing one or more of its rows and columns. Minors obtained by removing just one row and one column from square matrices (first minors) are required for calculating matrix cofactors, which in turn are useful for computing both the determinant and inverse of square matrices.
Definition and illustration
First minors
If A is a square matrix, then the minor of the entry in the i-th row and j-th column (also called the (i, j) minor, or a first minor) is the determinant of the submatrix formed by deleting the i-th row and j-th column. This number is often denoted M_{i,j}. The (i, j) cofactor is obtained by multiplying the minor by (-1)^{i+j}. To illustrate these definitions, consider the following 3 by 3 matrix, \begin{pmatrix} 1 & 4 & 7 \\ 3 & 0 & 5 \\ -1 & 9 & 11 \end{pmatrix}. To compute the minor M_{2,3} and the cofactor C_{2,3}, we fin ...
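The computation started above can be finished numerically; in this sketch (NumPy assumed, 0-based indices in code) M_{2,3} is the determinant of the submatrix left after deleting the second row and third column, and C_{2,3} carries the extra sign (-1)^{2+3}:

import numpy as np

A = np.array([[ 1, 4,  7],
              [ 3, 0,  5],
              [-1, 9, 11]], dtype=float)

def minor(A, i, j):
    """Determinant of A with row i and column j removed (0-based indices)."""
    sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return np.linalg.det(sub)

M23 = minor(A, 1, 2)            # delete the 2nd row and 3rd column
C23 = (-1) ** (2 + 3) * M23     # the cofactor carries the sign (-1)^(i+j)
print(M23, C23)                 # 13.0  -13.0 (up to floating-point rounding)
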
Exterior Algebra
In mathematics, the exterior algebra, or Grassmann algebra, named after Hermann Grassmann, is an algebra that uses the exterior product or wedge product as its multiplication. The exterior product or wedge product of vectors is an algebraic construction used in geometry to study areas, volumes, and their higher-dimensional analogues. The exterior product of two vectors u and v, denoted by u \wedge v, is called a bivector and lives in a space called the exterior square, a vector space that is distinct from the original space of vectors. The magnitude of u \wedge v can be interpreted as the area of the parallelogram with sides u and v, which in three dimensions can also be computed using the cross product of the two vectors. More generally, all parallel plane surfaces with the same orientation and area have the same bivector as a measure of their oriented area. Like the cross product, the exterior product is anticommutative, meaning t ...
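A small numerical illustration of the area interpretation (vectors chosen arbitrarily, NumPy assumed): in three dimensions the magnitude of u \wedge v matches the length of the cross product, and more generally it can be obtained from the Gram determinant |u|^2 |v|^2 - (u \cdot v)^2.

import numpy as np

u = np.array([2.0, 0.0, 0.0])
v = np.array([1.0, 3.0, 0.0])

# In R^3 the magnitude of the bivector u ^ v equals the length of the cross product,
# which is the area of the parallelogram spanned by u and v.
area_cross = np.linalg.norm(np.cross(u, v))

# The same area from the Gram determinant, which works in any dimension.
area_gram = np.sqrt(np.dot(u, u) * np.dot(v, v) - np.dot(u, v) ** 2)

print(area_cross, area_gram)   # 6.0  6.0
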
Trace (matrix)
In linear algebra, the trace of a square matrix A, denoted \operatorname{tr}(\mathbf{A}), is defined to be the sum of elements on the main diagonal (from the upper left to the lower right) of A. The trace is only defined for a square matrix (n \times n). It can be proved that the trace of a matrix is the sum of its (complex) eigenvalues (counted with multiplicities). It can also be proved that \operatorname{tr}(\mathbf{AB}) = \operatorname{tr}(\mathbf{BA}) for any two matrices \mathbf{A} and \mathbf{B}. This implies that similar matrices have the same trace. As a consequence one can define the trace of a linear operator mapping a finite-dimensional vector space into itself, since all matrices describing such an operator with respect to a basis are similar. The trace is related to the derivative of the determinant (see Jacobi's formula).
Definition
The trace of an n \times n square matrix \mathbf{A} is defined as \operatorname{tr}(\mathbf{A}) = \sum_{i=1}^{n} a_{ii} = a_{11} + a_{22} + \dots + a_{nn}, where a_{ii} denotes the entry on the i-th row and i-th column of \mathbf{A}. The entries of \mathbf{A} can be real numbers or (more generally) complex numbers. The trace is not def ...
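These properties are easy to check numerically; the sketch below (matrices chosen for illustration, NumPy assumed) verifies the diagonal-sum definition, tr(AB) = tr(BA), invariance under similarity, and the eigenvalue sum:

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 4.0],
              [2.0, 5.0]])

print(np.trace(A))                          # 2 + 3 = 5: sum of the diagonal entries
print(np.trace(A @ B), np.trace(B @ A))     # equal, illustrating tr(AB) = tr(BA)

# Similar matrices P^{-1} A P have the same trace.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.trace(np.linalg.inv(P) @ A @ P))   # 5.0 again

# The trace equals the sum of the (complex) eigenvalues, counted with multiplicities.
print(np.sum(np.linalg.eigvals(A)))         # 5.0
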
Polynomial Expression
In mathematics, especially in the field of algebra, a polynomial ring or polynomial algebra is a ring (which is also a commutative algebra) formed from the set of polynomials in one or more indeterminates (traditionally also called variables) with coefficients in another ring, often a field. Often, the term "polynomial ring" refers implicitly to the special case of a polynomial ring in one indeterminate over a field. The importance of such polynomial rings relies on the high number of properties that they have in common with the ring of the integers. Polynomial rings occur and are often fundamental in many parts of mathematics such as number theory, commutative algebra, and algebraic geometry. In ring theory, many classes of rings, such as unique factorization domains, regular rings, group rings, rings of formal power series, Ore polynomials, graded rings, have been introduced for generalizing some properties of polynomial rings. A closely related notion is that of the ring ...
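As a concrete illustration of a polynomial ring in one indeterminate (SymPy assumed; the two polynomials are arbitrary examples, not from the source), elements of Z[x] can be added and multiplied and the results stay in the same ring:

from sympy import symbols, Poly

x = symbols('x')

# Two elements of the polynomial ring Z[x]: polynomials in one indeterminate x
# with integer coefficients.
p = Poly(x**2 + 2*x + 1, x, domain='ZZ')
q = Poly(3*x - 5, x, domain='ZZ')

# The ring operations: sums and products of polynomials over Z are again polynomials over Z.
print(p + q)    # Poly(x**2 + 5*x - 4, x, domain='ZZ')
print(p * q)    # Poly(3*x**3 + x**2 - 7*x - 5, x, domain='ZZ')
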
Minimal Polynomial (Linear Algebra)
In linear algebra, the minimal polynomial \mu_A of an n \times n matrix A over a field F is the monic polynomial P over F of least degree such that P(A) = 0. Any other polynomial Q with Q(A) = 0 is a (polynomial) multiple of \mu_A. The following three statements are equivalent: (1) \lambda is a root of \mu_A; (2) \lambda is a root of the characteristic polynomial \chi_A of A; (3) \lambda is an eigenvalue of the matrix A. The multiplicity of a root \lambda of \mu_A is the largest power m such that \ker((A-\lambda I_n)^m) strictly contains \ker((A-\lambda I_n)^{m-1}). In other words, increasing the exponent up to m will give ever larger kernels, but further increasing the exponent beyond m will just give the same kernel. If the field F is not algebraically closed, then the minimal and characteristic polynomials need not factor according to their roots (in F) alone; in other words, they may have irreducible polynomial factors of degree greater than 1. For irreducible polynomials f one has similar equivalences: (1) f divides \mu_A; (2) f divides \chi_A; (3) the kernel of f(A) has dimension at least 1; (4) the kernel of f(A) has dimension at least \deg(f). Like the c ...
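A small numerical illustration (the diagonal matrix is chosen for simplicity, NumPy assumed): for A = diag(2, 2, 3) the characteristic polynomial is (\lambda-2)^2(\lambda-3) while the minimal polynomial is (\lambda-2)(\lambda-3), and one can check directly which polynomials annihilate A.

import numpy as np

# A diagonalizable matrix with a repeated eigenvalue: characteristic polynomial
# (lambda - 2)^2 (lambda - 3), minimal polynomial (lambda - 2)(lambda - 3).
A = np.diag([2.0, 2.0, 3.0])
I = np.eye(3)

# The candidate minimal polynomial already annihilates A ...
print((A - 2 * I) @ (A - 3 * I))                 # zero matrix

# ... while no proper divisor does, e.g. (lambda - 2) alone:
print(np.allclose(A - 2 * I, 0))                 # False

# Cayley-Hamilton: the characteristic polynomial annihilates A as well.
print((A - 2 * I) @ (A - 2 * I) @ (A - 3 * I))   # zero matrix
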
Root Of A Function
In mathematics, a zero (also sometimes called a root) of a real-, complex-, or generally vector-valued function f is a member x of the domain of f such that f vanishes at x; that is, the function f attains the value 0 at x, or equivalently, x is a solution to the equation f(x) = 0. A "zero" of a function is thus an input value that produces an output of 0. A root of a polynomial is a zero of the corresponding polynomial function. The fundamental theorem of algebra shows that any non-zero polynomial has a number of roots at most equal to its degree, and that the number of roots and the degree are equal when one considers the complex roots (or more generally, the roots in an algebraically closed extension) counted with their multiplicities. For example, the polynomial f of degree two, defined by f(x)=x^2-5x+6, has the two roots (or zeros) 2 and 3, since f(2)=2^2-5\times 2+6=0 and f(3)=3^2-5\times 3+6=0. If the function maps real numbers to real numbers, then it ...
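The roots in this example can also be recovered numerically (NumPy assumed): np.roots returns the zeros of a polynomial given its coefficients, and evaluating f at them gives 0.

import numpy as np

# f(x) = x^2 - 5x + 6 has the roots 2 and 3.
coeffs = [1, -5, 6]
print(np.sort(np.roots(coeffs)))   # [2. 3.]

f = lambda x: x**2 - 5*x + 6
print(f(2.0), f(3.0))              # 0.0 0.0: both inputs are zeros of f
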
Hyperbolic Angle
In geometry, hyperbolic angle is a real number determined by the area of the corresponding hyperbolic sector of xy = 1 in Quadrant I of the Cartesian plane. The hyperbolic angle parametrises the unit hyperbola, which has hyperbolic functions as coordinates. In mathematics, hyperbolic angle is an invariant measure as it is preserved under hyperbolic rotation. The hyperbola xy = 1 is rectangular with a semi-major axis of \sqrt{2}, analogous to the magnitude of a circular angle corresponding to the area of a circular sector in a circle with radius \sqrt{2}. Hyperbolic angle is used as the independent variable for the hyperbolic functions sinh, cosh, and tanh, because these functions may be premised on hyperbolic analogies to the corresponding circular trigonometric functions by regarding a hyperbolic angle as defining a hyperbolic triangle. The parameter thus becomes one of the most useful in the calculus of real variables.
Definition
Consider the rectangular hyperbola ...
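A rough numerical check of this definition (plain Python with NumPy; the values of a are chosen arbitrarily): decomposing the hyperbolic sector of xy = 1 between the rays to (1, 1) and (a, 1/a) into triangles and the region under the hyperbola gives an area of ln(a), which is the hyperbolic angle of the point (a, 1/a).

import numpy as np

def sector_area(a, n=200_000):
    """Area of the hyperbolic sector of xy = 1 between the rays to (1, 1) and (a, 1/a)."""
    x = np.linspace(1.0, a, n)
    y = 1.0 / x
    # Trapezoidal rule for the area under the hyperbola between x = 1 and x = a.
    under_hyperbola = np.sum((y[1:] + y[:-1]) / 2 * np.diff(x))
    # Add the triangle under the ray to (1, 1), subtract the one under the ray to (a, 1/a).
    return 0.5 + under_hyperbola - 0.5 * a * (1.0 / a)

for a in (2.0, np.e, 10.0):
    print(sector_area(a), np.log(a))   # the two columns agree
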