Matrix Determinant Lemma

In mathematics, in particular linear algebra, the matrix determinant lemma computes the determinant of the sum of an invertible matrix A and the dyadic product, uv^T, of a column vector u and a row vector v^T.

Statement

Suppose \mathbf{A} is an invertible square matrix and \mathbf{u}, \mathbf{v} are column vectors. Then the matrix determinant lemma states that

:\det(\mathbf{A} + \mathbf{u}\mathbf{v}^\textsf{T}) = (1 + \mathbf{v}^\textsf{T}\mathbf{A}^{-1}\mathbf{u})\,\det(\mathbf{A})\,.

Here, \mathbf{u}\mathbf{v}^\textsf{T} is the outer product of two vectors \mathbf{u} and \mathbf{v}. The theorem can also be stated in terms of the adjugate matrix of \mathbf{A}:

:\det(\mathbf{A} + \mathbf{u}\mathbf{v}^\textsf{T}) = \det(\mathbf{A}) + \mathbf{v}^\textsf{T}\operatorname{adj}(\mathbf{A})\mathbf{u}\,,

in which case it applies whether or not the matrix \mathbf{A} is invertible.

Proof

First the proof of the special case \mathbf{A} = \mathbf{I} follows from the equality:

:\begin{pmatrix} \mathbf{I} & 0 \\ \mathbf{v}^\textsf{T} & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} + \mathbf{u}\mathbf{v}^\textsf{T} & \mathbf{u} \\ 0 & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} & 0 \\ -\mathbf{v}^\textsf{T} & 1 \end{pmatrix} = \begin{pmatrix} \mathbf{I} & \mathbf{u} \\ 0 & 1 + \mathbf{v}^\textsf{T}\mathbf{u} \end{pmatrix} ...
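As a quick numerical sanity check of the lemma, here is a minimal NumPy sketch; the matrix and vectors are arbitrary illustrative values, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned, invertible
u = rng.standard_normal(n)
v = rng.standard_normal(n)

# Left-hand side: determinant of the rank-1 update, computed directly.
lhs = np.linalg.det(A + np.outer(u, v))

# Right-hand side: the lemma, reusing det(A) and a single solve with A.
rhs = (1.0 + v @ np.linalg.solve(A, u)) * np.linalg.det(A)

assert np.isclose(lhs, rhs)  # the two values agree to floating-point accuracy
```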



Mathematics
Mathematics is a field of study that discovers and organizes methods, theories and theorems that are developed and proved for the needs of empirical sciences and mathematics itself. There are many areas of mathematics, which include number theory (the study of numbers), algebra (the study of formulas and related structures), geometry (the study of shapes and spaces that contain them), analysis (the study of continuous changes), and set theory (presently used as a foundation for all mathematics). Mathematics involves the description and manipulation of abstract objects that consist of either abstractions from nature or, in modern mathematics, purely abstract entities that are stipulated to have certain properties, called axioms. Mathematics uses pure reason to prove properties of objects, a ''proof'' consisting of a succession of applications of in ...


Adjugate Matrix
In linear algebra, the adjugate or classical adjoint of a square matrix \mathbf{A}, \operatorname{adj}(\mathbf{A}), is the transpose of its cofactor matrix. It is occasionally known as adjunct matrix, or "adjoint", though that normally refers to a different concept, the adjoint operator which for a matrix is the conjugate transpose. The product of a matrix with its adjugate gives a diagonal matrix (entries not on the main diagonal are zero) whose diagonal entries are the determinant of the original matrix:

:\mathbf{A} \operatorname{adj}(\mathbf{A}) = \det(\mathbf{A}) \mathbf{I},

where \mathbf{I} is the identity matrix of the same size as \mathbf{A}. Consequently, the multiplicative inverse of an invertible matrix can be found by dividing its adjugate by its determinant.

Definition

The adjugate of \mathbf{A} is the transpose of the cofactor matrix \mathbf{C} of \mathbf{A},

:\operatorname{adj}(\mathbf{A}) = \mathbf{C}^\mathsf{T}.

In more detail, suppose R is a (unital) commutative ring and \mathbf{A} is an n \times n matrix with entries from R. The (i, j)-''minor'' of \mathbf{A}, denoted M_{ij}, is the determinant of the (n-1) \times (n-1) matrix that ...
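The definition can be spelled out directly in code. The helper `adjugate` below is hypothetical (NumPy has no built-in adjugate); it is a naive cofactor-based sketch, fine for small matrices but not for serious numerical work:

```python
import numpy as np

def adjugate(A: np.ndarray) -> np.ndarray:
    """Adjugate as the transpose of the cofactor matrix C, C[i, j] = (-1)**(i+j) * M_ij."""
    n = A.shape[0]
    C = np.empty((n, n), dtype=float)
    for i in range(n):
        for j in range(n):
            # M_ij: determinant of A with row i and column j removed.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

# The defining identity A adj(A) = det(A) I holds whether or not A is invertible.
assert np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(3))
```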


Binomial Inverse Theorem
In mathematics, specifically linear algebra, the Woodbury matrix identity – named after Max A. Woodbury – says that the inverse of a rank-''k'' correction of some matrix can be computed by doing a rank-''k'' correction to the inverse of the original matrix. Alternative names for this formula are the matrix inversion lemma, Sherman–Morrison–Woodbury formula or just Woodbury formula. However, the identity appeared in several papers before the Woodbury report. The Woodbury matrix identity is

:\left(A + UCV\right)^{-1} = A^{-1} - A^{-1}U \left(C^{-1} + VA^{-1}U\right)^{-1} VA^{-1},

where ''A'', ''U'', ''C'' and ''V'' are conformable matrices: ''A'' is ''n''×''n'', ''C'' is ''k''×''k'', ''U'' is ''n''×''k'', and ''V'' is ''k''×''n''. This can be derived using blockwise matrix inversion. While the identity is primarily used on matrices, it holds in a general ring or in an Ab-category. The Woodbury matrix identity allows cheap computation of inverses and solutions to linear equations. However ...
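An illustrative numerical check of the identity, as a sketch with arbitrary random matrices (assuming A and C are invertible; the point is that only a small k×k matrix is inverted on the right-hand side):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)   # n x n, invertible
C = rng.standard_normal((k, k)) + k * np.eye(k)   # k x k, invertible
U = rng.standard_normal((n, k))                   # n x k
V = rng.standard_normal((k, n))                   # k x n

A_inv = np.linalg.inv(A)
# Woodbury: only the small k x k matrix C^-1 + V A^-1 U is inverted afresh.
woodbury = A_inv - A_inv @ U @ np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U) @ V @ A_inv

assert np.allclose(woodbury, np.linalg.inv(A + U @ C @ V))
```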




Sherman–Morrison Formula
In linear algebra, the Sherman–Morrison formula, named after Jack Sherman and Winifred J. Morrison, computes the inverse of a "rank-1 update" to a matrix whose inverse has previously been computed. That is, given an invertible matrix A and the outer product uv^\textsf{T} of vectors u and v, the formula cheaply computes an updated matrix inverse \left(A + uv^\textsf{T}\right)^{-1}. The Sherman–Morrison formula is a special case of the Woodbury formula. Though named after Sherman and Morrison, it appeared already in earlier publications.

Statement

Suppose A \in \mathbb{R}^{n \times n} is an invertible square matrix and u, v \in \mathbb{R}^n are column vectors. Then A + uv^\textsf{T} is invertible if and only if 1 + v^\textsf{T} A^{-1} u \neq 0. In this case,

:\left(A + uv^\textsf{T}\right)^{-1} = A^{-1} - \frac{A^{-1}uv^\textsf{T}A^{-1}}{1 + v^\textsf{T}A^{-1}u}.

Here, uv^\textsf{T} is the outer product of two vectors u and v. The general form shown here is the one published by Bartlett.

Proof

(\Leftarrow) To prove the backward direction 1 + v^\textsf{T}A^{-1}u \neq 0 \Rightarrow ...
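A minimal sketch of the update in NumPy (illustrative values; it checks the formula against a direct inverse, and updates in O(n^2) instead of a fresh O(n^3) inversion):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)
u = rng.standard_normal(n)
v = rng.standard_normal(n)

A_inv = np.linalg.inv(A)          # assumed already computed
denom = 1.0 + v @ A_inv @ u
assert denom != 0                 # the invertibility condition of the theorem

# Sherman-Morrison: (A + u v^T)^-1 = A^-1 - (A^-1 u)(v^T A^-1) / denom
updated = A_inv - np.outer(A_inv @ u, v @ A_inv) / denom

assert np.allclose(updated, np.linalg.inv(A + np.outer(u, v)))
```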



Unit Vector
In mathematics, a unit vector in a normed vector space is a vector (often a spatial vector) of length 1. A unit vector is often denoted by a lowercase letter with a circumflex, or "hat", as in \hat{\mathbf{v}} (pronounced "v-hat"). The term ''normalized vector'' is sometimes used as a synonym for ''unit vector''. The normalized vector û of a non-zero vector u is the unit vector in the direction of u, i.e.,

:\mathbf{\hat{u}} = \frac{\mathbf{u}}{\|\mathbf{u}\|} = \left(\frac{u_1}{\|\mathbf{u}\|}, \frac{u_2}{\|\mathbf{u}\|}, \ldots, \frac{u_n}{\|\mathbf{u}\|}\right)

where ‖u‖ is the norm (or length) of u and \mathbf{u} = (u_1, u_2, \ldots, u_n). The proof is the following:

:\|\mathbf{\hat{u}}\| = \sqrt{\frac{u_1^2}{\|\mathbf{u}\|^2} + \frac{u_2^2}{\|\mathbf{u}\|^2} + \cdots + \frac{u_n^2}{\|\mathbf{u}\|^2}} = \sqrt{\frac{u_1^2 + u_2^2 + \cdots + u_n^2}{\|\mathbf{u}\|^2}} = \sqrt{\frac{\|\mathbf{u}\|^2}{\|\mathbf{u}\|^2}} = 1

A unit vector is often used to represent directions, such as normal directions. Unit vectors are often chosen to form the basis of a vector space, and every vector in the space may be written as a linear combination of unit vectors. Orthogonal coordinates ...
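A minimal normalization sketch (the helper `normalized` is written here for illustration; it guards against the zero vector, which has no unit vector):

```python
import numpy as np

def normalized(u: np.ndarray) -> np.ndarray:
    """Return the unit vector u / ||u||; reject the zero vector."""
    norm = np.linalg.norm(u)
    if norm == 0:
        raise ValueError("the zero vector cannot be normalized")
    return u / norm

u = np.array([3.0, 4.0])
u_hat = normalized(u)                        # array([0.6, 0.8])
assert np.isclose(np.linalg.norm(u_hat), 1)  # length 1, same direction as u
```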


Triangular Matrices
In mathematics, a triangular matrix is a special kind of square matrix. A square matrix is called lower triangular if all the entries ''above'' the main diagonal are zero. Similarly, a square matrix is called upper triangular if all the entries ''below'' the main diagonal are zero. Because matrix equations with triangular matrices are easier to solve, they are very important in numerical analysis. By the LU decomposition algorithm, an invertible matrix may be written as the product of a lower triangular matrix ''L'' and an upper triangular matrix ''U'' if and only if all its leading principal minors are non-zero.

Description

A matrix of the form

:L = \begin{pmatrix} \ell_{1,1} & & & & 0 \\ \ell_{2,1} & \ell_{2,2} & & & \\ \ell_{3,1} & \ell_{3,2} & \ddots & & \\ \vdots & \vdots & \ddots & \ddots & \\ \ell_{n,1} & \ell_{n,2} & \ldots & \ell_{n,n-1} & \ell_{n,n} \end{pmatrix}

is called a lower triangular matrix or left triangular matrix, and an ...
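Why triangular systems are easier to solve: forward substitution solves Lx = b in O(n^2) operations. A sketch of the idea (in practice one would call a library routine such as scipy.linalg.solve_triangular):

```python
import numpy as np

def forward_substitution(L: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Solve L x = b for a lower-triangular L with nonzero diagonal."""
    n = L.shape[0]
    x = np.zeros(n)
    for i in range(n):
        # Entries right of the diagonal are zero, so only x[:i] contributes.
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])
b = np.array([2.0, 5.0, 32.0])

x = forward_substitution(L, b)
assert np.allclose(L @ x, b)
```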


Outer Product
In linear algebra, the outer product of two coordinate vectors is the matrix whose entries are all products of an element in the first vector with an element in the second vector. If the two coordinate vectors have dimensions ''n'' and ''m'', then their outer product is an ''n'' × ''m'' matrix. More generally, given two tensors (multidimensional arrays of numbers), their outer product is a tensor. The outer product of tensors is also referred to as their tensor product, and can be used to define the tensor algebra. The outer product contrasts with:

* The dot product (a special case of "inner product"), which takes a pair of coordinate vectors as input and produces a scalar
* The Kronecker product, which takes a pair of matrices as input and produces a block matrix
* Standard matrix multiplication

Definition

Given two vectors of size m \times 1 and n \times 1 respectively

:\mathbf{u} = \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{pmatrix}, \quad \mathbf{v} = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} ...
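For instance, with NumPy (illustrative values), the outer product of a length-3 and a length-2 vector is a 3 × 2 matrix of pairwise products:

```python
import numpy as np

u = np.array([1, 2, 3])   # m = 3
v = np.array([10, 20])    # n = 2

# Outer product: entry (i, j) is u[i] * v[j].
print(np.outer(u, v))
# [[10 20]
#  [20 40]
#  [30 60]]

# Contrast with the dot product, which requires equal lengths and yields a scalar.
```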



Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as

:a_1x_1 + \cdots + a_nx_n = b,

linear maps such as

:(x_1, \ldots, x_n) \mapsto a_1x_1 + \cdots + a_nx_n,

and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to function spaces. Linear algebra is also used in most sciences and fields of engineering because it allows modeling many natural phenomena, and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, it is often used for dealing with first-order approximations ...
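For example, a system of such linear equations can be written in matrix form Ax = b and solved numerically (a small illustrative system, not from the article):

```python
import numpy as np

# The system:  2x + y = 5
#               x - y = 1
A = np.array([[2.0,  1.0],
              [1.0, -1.0]])
b = np.array([5.0, 1.0])

x = np.linalg.solve(A, b)   # array([2., 1.])
assert np.allclose(A @ x, b)
```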



Square Matrix
In mathematics, a square matrix is a matrix with the same number of rows and columns. An ''n''-by-''n'' matrix is known as a square matrix of order ''n''. Any two square matrices of the same order can be added and multiplied. Square matrices are often used to represent simple linear transformations, such as shearing or rotation. For example, if R is a square matrix representing a rotation (rotation matrix) and \mathbf{v} is a column vector describing the position of a point in space, the product R\mathbf{v} yields another column vector describing the position of that point after that rotation. If \mathbf{v} is a row vector, the same transformation can be obtained using \mathbf{v}R^\textsf{T}, where R^\textsf{T} is the transpose of R.

Main diagonal

The entries a_{ii} (i = 1, \ldots, n) form the main diagonal of a square matrix. They lie on the imaginary line which runs from the top left corner to the bottom right corner of the matrix. For instance, the main diagonal of ...
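A quick illustration of a rotation matrix acting on a column vector (a 90° planar rotation, with arbitrary illustrative coordinates):

```python
import numpy as np

theta = np.pi / 2                       # rotate 90 degrees counter-clockwise
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])                # a point on the x-axis
print(R @ v)                            # ~[0., 1.]: moved onto the y-axis

# The row-vector form v R^T gives the same rotated coordinates.
assert np.allclose(R @ v, v @ R.T)
```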


Vector (mathematics)
In mathematics and physics, vector is a term that refers to quantities that cannot be expressed by a single number (a scalar), or to elements of some vector spaces. Historically, vectors were introduced in geometry and physics (typically in mechanics) for quantities that have both a magnitude and a direction, such as displacements, forces and velocity. Such quantities are represented by geometric vectors in the same way as distances, masses and time are represented by real numbers. The term ''vector'' is also used, in some contexts, for tuples, which are finite sequences (of numbers or other objects) of a fixed length. Both geometric vectors and tuples can be added and scaled, and these vector operations led to the concept of a vector space, which is a set equipped with a vector addition and a scalar multiplication that satisfy some axioms generalizing the main properties of operations on the above sorts of vectors. A vector space formed by geometric vectors is called a Euclidean vector space ...