Bendixson's Inequality

In mathematics, Bendixson's inequality is a quantitative result in the field of matrices derived by Ivar Bendixson in 1902. The inequality puts limits on the imaginary and real parts of characteristic roots (eigenvalues) of real matrices. A special case of this inequality leads to the result that the characteristic roots of a real symmetric matrix are always real.

The inequality relating to the imaginary parts of characteristic roots of real matrices (Theorem I) is stated as: Let A = \left( a_{ij} \right) be a real n \times n matrix and \alpha = \max_{i,j} \tfrac{1}{2} \left| a_{ij} - a_{ji} \right|. If \lambda is any characteristic root of A, then

: \left| \operatorname{Im}(\lambda) \right| \le \alpha \sqrt{\frac{n(n-1)}{2}}.

If A is symmetric, then \alpha = 0 and consequently the inequality implies that \lambda must be real.

The inequality relating to the real parts of characteristic roots of real matrices (Theorem II) is stated as: Let m and M be the smallest and largest characteristic roots of \tfrac{1}{2}\left(A + A^\mathsf{T}\right), ...
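As a numerical illustration (a sketch, not part of the source; it assumes NumPy and an arbitrary randomly generated real matrix), both bounds can be checked directly against the eigenvalues returned by a standard solver:

    import numpy as np

    # Illustrative random real matrix (not from the source).
    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n, n))

    # Theorem I: alpha = max over i, j of (1/2) |a_ij - a_ji|.
    alpha = 0.5 * np.max(np.abs(A - A.T))
    imag_bound = alpha * np.sqrt(n * (n - 1) / 2)

    # Theorem II: m and M are the extreme eigenvalues of (A + A^T)/2,
    # which is symmetric, so eigvalsh applies.
    sym_eigs = np.linalg.eigvalsh((A + A.T) / 2)
    m, M = sym_eigs.min(), sym_eigs.max()

    for lam in np.linalg.eigvals(A):
        assert abs(lam.imag) <= imag_bound + 1e-12   # |Im(lambda)| bound
        assert m - 1e-12 <= lam.real <= M + 1e-12    # m <= Re(lambda) <= M

Note that for a symmetric test matrix alpha evaluates to 0, so the first assertion forces every eigenvalue to be real, matching the special case stated above.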



Matrix (mathematics)
In mathematics, a matrix (plural: matrices) is a rectangular array or table of numbers, symbols, or expressions, with elements or entries arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example,

: \begin{bmatrix} 1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix}

is a matrix with two rows and three columns. This is often referred to as a "two-by-three matrix", a "2 \times 3 matrix", or a matrix of dimension 2 \times 3. Matrices are commonly used in linear algebra, where they represent linear maps. In geometry, matrices are widely used for specifying and representing geometric transformations (for example rotations) and coordinate changes. In numerical analysis, many computational problems are solved by reducing them to a matrix computation, and this often involves computing with matrices of huge dimensions. Matrices are used in most areas of mathematics and scientific fields, either directly ...
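A minimal sketch (assuming NumPy conventions; illustrative, not from the source) of the example matrix above as a row-and-column array:

    import numpy as np

    # The 2-by-3 example matrix from the text.
    A = np.array([[1, 9, -13],
                  [20, 5, -6]])

    print(A.shape)   # (2, 3): two rows, three columns
    print(A[0, 2])   # -13: the entry in the first row, third column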


Ivar Bendixson
Ivar Otto Bendixson (1 August 1861 – 29 November 1935) was a Swedish mathematician.

Biography

Bendixson was born on 1 August 1861 at Villa Bergshyddan, Djurgården, Oscar Parish, Stockholm, Sweden, to a middle-class family. His father, Vilhelm Emanuel Bendixson, was a merchant, and his mother was Tony Amelia Warburg. On completing secondary education in Stockholm, he obtained his school certificate on 25 May 1878. On 13 September 1878 he enrolled at the Royal Institute of Technology in Stockholm. In 1879 Bendixson went to Uppsala University and graduated with the equivalent of a master's degree on 27 January 1881. After graduating from Uppsala, he went on to study at the newly opened Stockholm University College, after which he was awarded a doctorate by Uppsala University on 29 May 1890. On 10 June 1890 Bendixson was appointed as a docent at Stockholm University College. He then worked as an assistant to the professor of mathematical analysis from 5 March 1891 until 31 May 1892. Fro ...



Eigenvalues And Eigenvectors
In linear algebra, an eigenvector or characteristic vector is a vector that has its direction unchanged (or reversed) by a given linear transformation. More precisely, an eigenvector \mathbf{v} of a linear transformation T is scaled by a constant factor \lambda when the linear transformation is applied to it: T\mathbf{v} = \lambda \mathbf{v}. The corresponding eigenvalue, characteristic value, or characteristic root is the multiplying factor \lambda (possibly a negative or complex number). Geometrically, vectors are multi-dimensional quantities with magnitude and direction, often pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon which it acts. A linear transformation's eigenvectors are those vectors that are only stretched or shrunk, with neither rotation nor shear. The corresponding eigenvalue is the factor by which an eigenvector is stretched or shrunk. If the eigenvalue is negative, the eigenvector's direction is reversed. Th ...
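The defining relation T\mathbf{v} = \lambda \mathbf{v} can be verified numerically; the following sketch (an illustration assuming NumPy, with an arbitrarily chosen matrix) does so for each eigenpair:

    import numpy as np

    # Illustrative 2x2 transformation (not from the source).
    T = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    eigenvalues, eigenvectors = np.linalg.eig(T)
    for k in range(len(eigenvalues)):
        lam = eigenvalues[k]        # eigenvalue (scaling factor)
        v = eigenvectors[:, k]      # matching eigenvector (a column)
        assert np.allclose(T @ v, lam * v)   # T only scales v by lam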



Symmetric Matrix
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, A is symmetric if and only if A = A^\mathsf{T}. Because equal matrices have equal dimensions, only square matrices can be symmetric. The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if a_{ij} denotes the entry in the i-th row and j-th column, then a_{ij} = a_{ji} for all indices i and j. Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative. In linear algebra, a real symmetric matrix represents a self-adjoint operator represented in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric ...
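A short sketch (illustrative only, assuming NumPy; the matrix itself is an arbitrary example) checking the defining property A = A^\mathsf{T} and the realness of the eigenvalues of a real symmetric matrix:

    import numpy as np

    # Illustrative real symmetric matrix (not from the source).
    A = np.array([[ 4.0, 1.0, -2.0],
                  [ 1.0, 3.0,  0.5],
                  [-2.0, 0.5,  1.0]])

    assert np.allclose(A, A.T)          # a_ij == a_ji for all i, j
    eigs = np.linalg.eigvals(A)
    assert np.allclose(eigs.imag, 0.0)  # all eigenvalues are real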



Gershgorin Circle Theorem
In mathematics, the Gershgorin circle theorem may be used to bound the spectrum of a square matrix. It was first published by the Soviet mathematician Semyon Aronovich Gershgorin in 1931. Gershgorin's name has been transliterated in several different ways, including Geršgorin, Gerschgorin, Gershgorin, Hershhorn, and Hirschhorn.

Statement and proof

Let A be a complex n \times n matrix, with entries a_{ij}. For i \in \{1, \dots, n\}, let R_i be the sum of the absolute values of the non-diagonal entries in the i-th row:

: R_i = \sum_{j \neq i} \left| a_{ij} \right|.

Let D(a_{ii}, R_i) \subseteq \Complex be a closed disc centered at a_{ii} with radius R_i. Such a disc is called a Gershgorin disc.

:Theorem. Every eigenvalue of A lies within at least one of the Gershgorin discs D(a_{ii}, R_i).

''Proof.'' Let \lambda be an eigenvalue of A with corresponding eigenvector x = (x_j). Find ''i'' such that the element of ''x'' with the largest absolute value is x_i. Since Ax = \lambda x, in particular we take the ''i''th component of that eq ...
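The discs are cheap to compute. This sketch (an illustration assuming NumPy, with an arbitrary test matrix) builds the centers a_{ii} and radii R_i and confirms the theorem numerically:

    import numpy as np

    # Illustrative test matrix (not from the source).
    A = np.array([[10.0, 1.0,  0.5],
                  [ 0.2, 5.0,  1.0],
                  [ 1.0, 0.3, -4.0]])

    centers = np.diag(A)                              # a_ii
    radii = np.abs(A).sum(axis=1) - np.abs(centers)   # R_i = sum_{j != i} |a_ij|

    # Every eigenvalue must lie in at least one closed disc D(a_ii, R_i).
    for lam in np.linalg.eigvals(A):
        assert np.any(np.abs(lam - centers) <= radii + 1e-12)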



Abstract Algebra
In mathematics, more specifically algebra, abstract algebra or modern algebra is the study of algebraic structures, which are sets with specific operations acting on their elements. Algebraic structures include groups, rings, fields, modules, vector spaces, lattices, and algebras over a field. The term ''abstract algebra'' was coined in the early 20th century to distinguish it from older parts of algebra, and more specifically from elementary algebra, the use of variables to represent numbers in computation and reasoning. The abstract perspective on algebra has become so fundamental to advanced mathematics that it is simply called "algebra", while the term "abstract algebra" is seldom used except in pedagogy. Algebraic structures, with their associated homomorphisms, ...


Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as

: a_1 x_1 + \cdots + a_n x_n = b,

linear maps such as

: (x_1, \ldots, x_n) \mapsto a_1 x_1 + \cdots + a_n x_n,

and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to function spaces. Linear algebra is also used in most sciences and fields of engineering because it allows modeling many natural phenomena, and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, it is often used for dealing with first-order a ...
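A minimal sketch (assuming NumPy; the coefficients are arbitrary examples, not from the source) of both objects above: a linear map evaluated as a dot product, and a square linear system solved for its unknowns:

    import numpy as np

    # The linear map (x_1, ..., x_n) -> a_1 x_1 + ... + a_n x_n is a dot product.
    a = np.array([2.0, -1.0, 3.0])
    x = np.array([1.0,  4.0, 2.0])
    print(a @ x)                  # 2*1 - 1*4 + 3*2 = 4.0

    # Solving the square linear system A x = b.
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([9.0, 8.0])
    print(np.linalg.solve(A, b))  # [2. 3.], since 3*2 + 3 = 9 and 2 + 2*3 = 8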