Bunch–Nielsen–Sorensen Formula

In mathematics, in particular linear algebra, the Bunch–Nielsen–Sorensen formula, named after James R. Bunch, Christopher P. Nielsen and Danny C. Sorensen, expresses the eigenvectors of the sum of a symmetric matrix A and the outer product, v v^T, of a vector v with itself.

Statement

Let \lambda_i denote the eigenvalues of A and \tilde\lambda_i denote the eigenvalues of the updated matrix \tilde A = A + v v^T. In the special case when A is diagonal, the eigenvectors \tilde q_i of \tilde A can be written

(\tilde q_i)_k = \frac{v_k}{N_i (\lambda_k - \tilde\lambda_i)}

where N_i is a number that makes the vector \tilde q_i normalized.

Derivation

This formula can be derived from the Sherman–Morrison formula by examining the poles of (A - \tilde\lambda I + v v^T)^{-1}.

Remarks

The eigenvalues of \tilde A were studied by Golub. Numerical stability of the computation was studied by Gu and Eisenstat.

See also

* Sherman–Morrison formula

External links

* Rank-One Modification of the Symme ...
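A concrete numerical check of the statement above, as a minimal Python/NumPy sketch: the data are random, the variable names (lam, lam_tilde, Q) are illustrative rather than from the original paper, and a dense eigenvalue solver stands in for the secular-equation solve a real implementation would use.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    lam = np.sort(rng.normal(size=n))        # eigenvalues of the diagonal matrix A
    v = rng.normal(size=n)
    A_tilde = np.diag(lam) + np.outer(v, v)  # updated matrix A + v v^T

    # Updated eigenvalues; a dense solver stands in for the secular equation here.
    lam_tilde = np.linalg.eigvalsh(A_tilde)

    # Bunch-Nielsen-Sorensen formula: (q_i)_k = v_k / (N_i (lam_k - lam_tilde_i)).
    Q = v[:, None] / (lam[:, None] - lam_tilde[None, :])
    Q /= np.linalg.norm(Q, axis=0)           # N_i normalizes each column

    # Each column of Q should be an eigenvector of A_tilde.
    assert np.allclose(A_tilde @ Q, Q * lam_tilde)

The sketch relies on the generic case: all v_k nonzero and \lambda_k distinct, so no denominator vanishes; random data satisfies this almost surely.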

Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in case of abstraction from nature, some basic properties that are considered true starting points of ...

Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as a_1 x_1 + \cdots + a_n x_n = b, linear maps such as (x_1, \ldots, x_n) \mapsto a_1 x_1 + \cdots + a_n x_n, and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to spaces of functions. Linear algebra is also used in most sciences and fields of engineering, because it allows modeling many natural phenomena, and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, it is often used for dealing with first-order approximations, using the fact that the differential of a multivariate function at a point is the linear ma ...
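A minimal sketch of the representation described above, with invented example values: a linear map from R^3 to R^2 is stored as a 2 × 3 matrix, and applying the map is a matrix-vector product.

    import numpy as np

    # A linear map R^3 -> R^2, (x1, x2, x3) |-> (x1 + 2*x2, 3*x3), as a matrix.
    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 0.0, 3.0]])
    x = np.array([1.0, 1.0, 1.0])
    y = np.array([0.5, -1.0, 2.0])

    print(A @ x)  # applying the map: [3. 3.]

    # Linearity: A(2x + 3y) = 2*Ax + 3*Ay.
    assert np.allclose(A @ (2*x + 3*y), 2*(A @ x) + 3*(A @ y))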

Symmetric Matrix
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, A is symmetric \iff A = A^\textsf{T}. Because equal matrices have equal dimensions, only square matrices can be symmetric. The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if a_{ij} denotes the entry in the ith row and jth column, then a_{ij} = a_{ji} for all indices i and j. Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative. In linear algebra, a real symmetric matrix represents a self-adjoint operator represented in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refe ...
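A small illustration of these definitions (the matrix entries are invented for the example): symmetry means the array equals its transpose, and NumPy's eigh, which assumes a symmetric input, returns real eigenvalues and orthonormal eigenvectors.

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 4.0],
                  [0.0, 4.0, 5.0]])

    # Symmetry: a_ij == a_ji for all i, j, i.e. A equals its transpose.
    assert np.allclose(A, A.T)

    # eigh exploits symmetry: real eigenvalues w, orthonormal eigenvectors Q.
    w, Q = np.linalg.eigh(A)
    assert np.allclose(Q @ np.diag(w) @ Q.T, A)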


Outer Product
In linear algebra, the outer product of two coordinate vectors is a matrix. If the two vectors have dimensions ''n'' and ''m'', then their outer product is an ''n'' × ''m'' matrix. More generally, given two tensors (multidimensional arrays of numbers), their outer product is a tensor. The outer product of tensors is also referred to as their tensor product, and can be used to define the tensor algebra. The outer product contrasts with:
* The dot product (a special case of "inner product"), which takes a pair of coordinate vectors as input and produces a scalar
* The Kronecker product, which takes a pair of matrices as input and produces a block matrix
* Standard mat ...
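A short sketch of the dimension statement above, with invented values: the outer product of vectors of lengths 3 and 2 is a 3 × 2 matrix, while the dot product of two equal-length vectors is a scalar.

    import numpy as np

    u = np.array([1.0, 2.0, 3.0])  # dimension n = 3
    v = np.array([4.0, 5.0])       # dimension m = 2

    # Outer product: (u v^T)_ij = u_i * v_j, an n x m matrix.
    print(np.outer(u, v))          # shape (3, 2)

    # Dot product requires equal lengths and yields a scalar instead.
    print(np.dot(u, u))            # 14.0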


Vector (mathematics)
In mathematics and physics, vector is a term that refers colloquially to some quantities that cannot be expressed by a single number (a scalar), or to elements of some vector spaces. Historically, vectors were introduced in geometry and physics (typically in mechanics) for quantities that have both a magnitude and a direction, such as displacements, forces and velocity. Such quantities are represented by geometric vectors in the same way as distances, masses and time are represented by real numbers. The term ''vector'' is also used, in some contexts, for tuples, which are finite sequences of numbers of a fixed length. Both geometric vectors and tuples can be added and scaled, and these vector operations led to the concept of a vector space, which is a set equipped with a vector addition and a scalar multiplication that satisfy some axioms generalizing the main properties of operations on the above sorts of vectors. A vector space formed by geometric vectors is called a Euclidean ...


Sherman–Morrison Formula
In mathematics, in particular linear algebra, the Sherman–Morrison formula, named after Jack Sherman and Winifred J. Morrison, computes the inverse of the sum of an invertible matrix A and the outer product, u v^\textsf{T}, of vectors u and v. The Sherman–Morrison formula is a special case of the Woodbury formula. Though named after Sherman and Morrison, it appeared already in earlier publications.

Statement

Suppose A \in \mathbb{R}^{n \times n} is an invertible square matrix and u, v \in \mathbb{R}^n are column vectors. Then A + uv^\textsf{T} is invertible if and only if 1 + v^\textsf{T} A^{-1} u \neq 0. In this case,

\left(A + uv^\textsf{T}\right)^{-1} = A^{-1} - \frac{A^{-1} uv^\textsf{T} A^{-1}}{1 + v^\textsf{T} A^{-1} u}.

Here, uv^\textsf{T} is the outer product of two vectors u and v. The general form shown here is the one published by Bartlett.

Proof

(\Leftarrow) To prove that the backward direction (1 + v^\textsf{T} A^{-1} u \neq 0 \Rightarrow A + uv^\textsf{T} is invertible with inverse given as above) is true, we verify the properties of the inverse. A matrix Y (in this case the right-ha ...
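A quick numerical check of the statement above, as a NumPy sketch with random data; the shift by n on the diagonal is just an assumption made here to keep A invertible and 1 + v^T A^{-1} u away from zero for the example.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 4
    A = rng.normal(size=(n, n)) + n * np.eye(n)  # well-conditioned, invertible
    u = rng.normal(size=n)
    v = rng.normal(size=n)

    A_inv = np.linalg.inv(A)
    denom = 1.0 + v @ A_inv @ u                  # must be nonzero for invertibility
    assert abs(denom) > 1e-12

    # Sherman-Morrison: (A + u v^T)^{-1} = A^{-1} - (A^{-1} u v^T A^{-1}) / denom
    update_inv = A_inv - np.outer(A_inv @ u, v @ A_inv) / denom
    assert np.allclose(update_inv, np.linalg.inv(A + np.outer(u, v)))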