List Of Functional Analysis Topics
This is a list of functional analysis topics, by Wikipedia page. See also: Glossary of functional analysis.

Hilbert space
Functional analysis, classic results
Operator theory

Banach space examples
* Lp space
* Hardy space
* Sobolev space
* Tsirelson space
* ba space

Real and complex algebras
Topological vector spaces

Amenability
* Amenable group
* Von Neumann conjecture

Wavelets

Quantum theory (''see also list of mathematical topics in quantum theory'')
* Mathematical formulation of quantum mechanics
* Observable
* Operator (physics)
* Quantum state
** Pure state
** Fock state, Fock space
** Density state
** Coherent state
* Heisenberg picture
* Density matrix
* Quantum logic
* Quantum operation
* Wightman axioms

Probability
* Free probability
* Bernstein's theorem

Non-linear
* Fixed-point theorems in infinite-dimensional spaces

History
* Stefan Banach (1892–1945)
* Hugo Steinhaus (1887–1972)
* John von Neumann (1903–1957)
* Alain Connes (bor ...


Functional Analysis
Functional analysis is a branch of mathematical analysis, the core of which is formed by the study of vector spaces endowed with some kind of limit-related structure (e.g. inner product, norm, topology, etc.) and the linear functions defined on these spaces and respecting these structures in a suitable sense. The historical roots of functional analysis lie in the study of spaces of functions and the formulation of properties of transformations of functions such as the Fourier transform as transformations defining continuous, unitary, etc. operators between function spaces. This point of view turned out to be particularly useful for the study of differential and integral equations. The usage of the word ''functional'' as a noun goes back to the calculus of variati ...


Orthogonal Complement
In the mathematical fields of linear algebra and functional analysis, the orthogonal complement of a subspace ''W'' of a vector space ''V'' equipped with a bilinear form ''B'' is the set ''W''⊥ of all vectors in ''V'' that are orthogonal to every vector in ''W''. Informally, it is called the perp, short for perpendicular complement. It is a subspace of ''V''.

Example

Let V = (\R^5, \langle \cdot, \cdot \rangle) be the vector space equipped with the usual dot product \langle \cdot, \cdot \rangle (thus making it an inner product space), and let ''W'' be the span of the columns of

A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 2 & 6 \\ 3 & 9 \\ 5 & 3 \end{pmatrix}.

Then its orthogonal complement W^\perp = \{ \mathbf{v} \in V : \langle \mathbf{u}, \mathbf{v} \rangle = 0 \text{ for all } \mathbf{u} \in W \} can also be described as the span of the columns of

\tilde{A} = \begin{pmatrix} -2 & -3 & -5 \\ -6 & -9 & -3 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.

The fact that every column vector in A is orthogonal to every column vector in \tilde{A} can be checked by direct computation. The fact that the spans of these vectors are orthogonal then follows by b ...
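The direct computation mentioned above is easy to carry out numerically. A minimal NumPy sketch, with the matrices copied from the example:

```python
import numpy as np

# Columns of A span W; columns of A_tilde span the claimed complement W-perp.
A = np.array([[1, 0],
              [0, 1],
              [2, 6],
              [3, 9],
              [5, 3]])

A_tilde = np.array([[-2, -3, -5],
                    [-6, -9, -3],
                    [ 1,  0,  0],
                    [ 0,  1,  0],
                    [ 0,  0,  1]])

# Every column of A is orthogonal to every column of A_tilde:
# the matrix of cross inner products is identically zero.
print(A.T @ A_tilde)          # 2x3 zero matrix

# dim W + dim W-perp = 5, as expected for a subspace of R^5.
print(np.linalg.matrix_rank(A) + np.linalg.matrix_rank(A_tilde))  # 5
```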


Self-adjoint Operator
In mathematics, a self-adjoint operator on an infinite-dimensional complex vector space ''V'' with inner product \langle\cdot,\cdot\rangle (equivalently, a Hermitian operator in the finite-dimensional case) is a linear map ''A'' (from ''V'' to itself) that is its own adjoint. If ''V'' is finite-dimensional with a given orthonormal basis, this is equivalent to the condition that the matrix of ''A'' is a Hermitian matrix, i.e., equal to its conjugate transpose ''A''*. By the finite-dimensional spectral theorem, ''V'' has an orthonormal basis such that the matrix of ''A'' relative to this basis is a diagonal matrix with entries in the real numbers. In this article, we consider generalizations of this concept to operators on Hilbert spaces of arbitrary dimension. Self-adjoint operators are used in functional analysis and quantum mechanics. In quantum mechanics their importance lies in the Dirac–von Neumann formulation of quantum mechanics, in which physical observables such as positi ...
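The finite-dimensional statement can be illustrated directly. A minimal NumPy sketch, with an arbitrary example matrix of my own choosing:

```python
import numpy as np

# A Hermitian matrix: equal to its own conjugate transpose.
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
assert np.allclose(A, A.conj().T)

# Finite-dimensional spectral theorem: eigh returns real eigenvalues
# and an orthonormal eigenbasis (the columns of V).
eigvals, V = np.linalg.eigh(A)
print(eigvals)                                        # real numbers
assert np.allclose(V.conj().T @ A @ V, np.diag(eigvals))
```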




Hermitian Operator
See the entry ''Self-adjoint Operator'' above: ''Hermitian operator'' is another name for the same concept, and the two entries are identical.



Eigenfunction
In mathematics, an eigenfunction of a linear operator ''D'' defined on some function space is any non-zero function f in that space that, when acted upon by ''D'', is only multiplied by some scaling factor called an eigenvalue. As an equation, this condition can be written as Df = \lambda f for some scalar eigenvalue \lambda. The solutions to this equation may also be subject to boundary conditions that limit the allowable eigenvalues and eigenfunctions. An eigenfunction is a type of eigenvector.

Eigenfunctions

In general, an eigenvector of a linear operator ''D'' defined on some vector space is a nonzero vector in the domain of ''D'' that, when ''D'' acts upon it, is simply scaled by some scalar value called an eigenvalue. In the special case where ''D'' is defined on a function space, the eigenvectors are referred to as eigenfunctions. That is, a function ''f'' is an eigenfunction of ''D'' if it satisfies the equation Df = \lambda f, where \lambda is a scalar. The solutions to this equation may also ...
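A standard example: every exponential e^{\lambda x} is an eigenfunction of the derivative operator D = d/dx with eigenvalue \lambda. A symbolic check of this (a sketch using SymPy; this particular example is not from the text above):

```python
import sympy as sp

x, lam = sp.symbols('x lambda')

# Apply the operator D = d/dx to f(x) = exp(lambda * x).
f = sp.exp(lam * x)
Df = sp.diff(f, x)

# D f = lambda * f, so f is an eigenfunction with eigenvalue lambda.
assert sp.simplify(Df - lam * f) == 0
print(Df)  # lambda*exp(lambda*x)
```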



Eigenvalue
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by \lambda, is the factor by which the eigenvector is scaled. Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation, and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.

Formal definition

If T is a linear transformation from a vector space V over a field F into itself and \mathbf{v} is a nonzero vector in V, then \mathbf{v} is an eigenvector of T if T(\mathbf{v}) is a scalar multiple of \mathbf{v}. This can be written as T(\mathbf{v}) = \lambda \mathbf{v}, where \lambda is a scalar in F, known as the eigenvalue, characteristic value, or characteristic root ass ...
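A small numerical illustration (a NumPy sketch, with an arbitrary 2×2 matrix of my own choosing):

```python
import numpy as np

# A symmetric example with a known eigenstructure.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(T)

# Each column v of eigvecs satisfies T v = lambda v for its eigenvalue:
# v is only scaled by T, not rotated.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(T @ v, lam * v)

print(eigvals)  # 3 and 1: (1,1) is stretched by 3, (1,-1) by 1
```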



Eigenvector
See the entry ''Eigenvalue'' above: eigenvectors and eigenvalues are treated together in the same article, and the two entries are identical.


Diagonal Matrix
In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is \begin{pmatrix} 3 & 0 \\ 0 & 2 \end{pmatrix}, while an example of a 3×3 diagonal matrix is \begin{pmatrix} 6 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}. An identity matrix of any size, or any multiple of it (a scalar matrix), is a diagonal matrix. A diagonal matrix is sometimes called a scaling matrix, since matrix multiplication with it results in changing scale (size). Its determinant is the product of its diagonal values.

Definition

As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix D = (d_{i,j}) with ''n'' columns and ''n'' rows is diagonal if

\forall i, j \in \{1, 2, \ldots, n\},\ i \ne j \implies d_{i,j} = 0.

However, the main diagonal entries are unrestricted. The term ''diagonal matrix'' may s ...
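The scaling and determinant claims are easy to verify numerically. A minimal NumPy sketch, using the 2×2 example from the text:

```python
import numpy as np

# Build a diagonal matrix from its diagonal entries.
D = np.diag([3.0, 2.0])
print(D)                      # [[3. 0.] [0. 2.]]

# Multiplication by a diagonal matrix scales each coordinate separately.
v = np.array([1.0, 1.0])
print(D @ v)                  # [3. 2.]

# The determinant is the product of the diagonal values.
assert np.isclose(np.linalg.det(D), 3.0 * 2.0)
```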


Semi-Hilbert Space
In mathematics, a semi-Hilbert space is a generalization of a Hilbert space in functional analysis, in which, roughly speaking, the inner product is required only to be positive semi-definite rather than positive definite, so that it gives rise to a seminorm (a seminorm satisfies the axioms of a norm except that it need not be positive definite; every seminorm is the Minkowski functional of some absorbing disk) rather than a vector space norm. The quotient of this space by the kernel of this seminorm is also required to be a Hilbert space in the usual sense.
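A concrete finite-dimensional toy model (a sketch, assuming the semi-inner product \langle x, y \rangle = x^T G x for a positive semi-definite matrix G of my own choosing):

```python
import numpy as np

# A positive semi-definite Gram matrix defines a semi-inner product on R^2:
# <x, y> = x^T G y. This G has a nontrivial kernel, so the form is not definite.
G = np.array([[1.0, 0.0],
              [0.0, 0.0]])

def seminorm(x):
    return np.sqrt(x @ G @ x)

# (0, 1) lies in the kernel of the seminorm: "length" zero without being
# the zero vector, so this is a seminorm rather than a norm.
print(seminorm(np.array([0.0, 1.0])))   # 0.0
print(seminorm(np.array([3.0, 4.0])))   # 3.0 (only the first coordinate counts)

# Quotienting by the kernel (here, forgetting the second coordinate)
# recovers an honest inner product space: R^1 with the usual norm.
```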


Unitary Matrix
In linear algebra, a complex square matrix ''U'' is unitary if its conjugate transpose ''U''* is also its inverse, that is, if

U^* U = U U^* = I,

where ''I'' is the identity matrix. In physics, especially in quantum mechanics, the conjugate transpose is referred to as the Hermitian adjoint of a matrix and is denoted by a dagger (†), so the equation above is written U^\dagger U = U U^\dagger = I. The real analogue of a unitary matrix is an orthogonal matrix. Unitary matrices have significant importance in quantum mechanics because they preserve norms, and thus, probability amplitudes.

Properties

For any unitary matrix ''U'' of finite size, the following hold:
* Given two complex vectors ''x'' and ''y'', multiplication by ''U'' preserves their inner product; that is, \langle Ux, Uy \rangle = \langle x, y \rangle.
* ''U'' is normal (U^* U = U U^*).
* ''U'' is diagonalizable; that is, ''U'' is unitarily similar to a diagonal matrix, as a consequence of the spectral theorem. Thus, ''U'' has a decomposition of the form U = V D V^*, where ''V'' is unitary, and ''D'' is diagonal and uni ...
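A quick numerical check of the defining property and of norm preservation (a NumPy sketch; the matrix, a plane rotation times a global phase, is an arbitrary example of my own choosing):

```python
import numpy as np

# A rotation composed with a complex phase: a 2x2 unitary matrix.
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]]) * np.exp(1j * 0.7)

# The conjugate transpose is the inverse: U* U = I.
assert np.allclose(U.conj().T @ U, np.eye(2))

# Unitary matrices preserve inner products, hence norms.
x = np.array([1.0 + 2j, -0.5j])
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
```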




Orthogonal Matrix
In linear algebra, an orthogonal matrix, or orthonormal matrix, is a real square matrix whose columns and rows are orthonormal vectors. One way to express this is

Q^\mathrm{T} Q = Q Q^\mathrm{T} = I,

where Q^\mathrm{T} is the transpose of ''Q'' and ''I'' is the identity matrix. This leads to the equivalent characterization: a matrix ''Q'' is orthogonal if its transpose is equal to its inverse: Q^\mathrm{T} = Q^{-1}, where Q^{-1} is the inverse of ''Q''. An orthogonal matrix is necessarily invertible (with inverse Q^{-1} = Q^\mathrm{T}), unitary (Q^{-1} = Q^*), where Q^* is the Hermitian adjoint (conjugate transpose) of ''Q'', and therefore normal (Q^* Q = Q Q^*) over the real numbers. The determinant of any orthogonal matrix is either +1 or −1. As a linear transformation, an orthogonal matrix preserves the inner product of vectors, and therefore acts as an isometry of Euclidean space, such as a rotation, reflection or rotoreflection. In other words, it is a unitary transformation. The set of orthogonal matrices, under multiplication, forms the group O(''n''), known as the orthogonal gr ...
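The standard example is a plane rotation. A minimal NumPy sketch verifying the properties listed above:

```python
import numpy as np

# A planar rotation: the standard example of an orthogonal matrix.
theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Transpose equals inverse, and the determinant is +1 (a rotation);
# a reflection would instead have determinant -1.
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.isclose(np.linalg.det(Q), 1.0)

# Orthogonal matrices preserve dot products: they act as isometries.
u, v = np.array([1.0, 2.0]), np.array([-3.0, 0.5])
assert np.isclose((Q @ u) @ (Q @ v), u @ v)
```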


Normal Operator
In mathematics, especially functional analysis, a normal operator on a complex Hilbert space ''H'' is a continuous linear operator ''N'' : ''H'' → ''H'' that commutes with its Hermitian adjoint ''N*'', that is: ''NN*'' = ''N*N''. Normal operators are important because the spectral theorem holds for them. The class of normal operators is well understood. Examples of normal operators are
* unitary operators: ''N*'' = ''N''^{−1}
* Hermitian operators (i.e., self-adjoint operators): ''N*'' = ''N''
* skew-Hermitian operators: ''N*'' = −''N''
* positive operators: ''N'' = ''MM*'' for some ''M'' (so ''N'' is self-adjoint).
A normal matrix is the matrix expression of a normal operator on the Hilbert space \Complex^n.

Properties

Normal operators are characterized by the spectral theorem. A compact normal operator (in particular, a normal operator on a finite-dimensional linear space) is unitarily diagonalizable. Let T be a bounded operator. The following are equivalent.
* T is normal. ...
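The finite-dimensional case of unitary diagonalizability can be checked numerically. A NumPy sketch with a matrix of my own choosing that is normal yet neither Hermitian nor unitary; its eigenvalues are distinct, so eig returns an orthonormal eigenbasis:

```python
import numpy as np

# A matrix that commutes with its adjoint (normal), but is
# neither Hermitian nor unitary.
N = np.array([[1 + 1j, -1],
              [1, 1 + 1j]])
assert np.allclose(N @ N.conj().T, N.conj().T @ N)

# Spectral theorem: a normal matrix is unitarily diagonalizable.
eigvals, V = np.linalg.eig(N)
assert np.allclose(V.conj().T @ V, np.eye(2))             # V is unitary
assert np.allclose(V @ np.diag(eigvals) @ V.conj().T, N)  # N = V D V*
```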