Fundamental Theorem Of Noncommutative Algebra

In mathematics, an invariant subspace of a linear mapping ''T'' : ''V'' → ''V'', i.e. from some vector space ''V'' to itself, is a subspace ''W'' of ''V'' that is preserved by ''T''; that is, ''T''(''W'') ⊆ ''W''.


General description

Consider a linear mapping T : V \to V. An invariant subspace W of T has the property that all vectors \mathbf{v} \in W are transformed by T into vectors also contained in W. This can be stated as
:\mathbf{v} \in W \implies T(\mathbf{v}) \in W.
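As a concrete sketch, this condition can be checked numerically; in the following Python snippet the matrix T (a rotation of \mathbb{R}^3 about the ''z''-axis) and the plane W are illustrative choices, not canonical ones.

```python
import numpy as np

# Rotation of R^3 by 30 degrees about the z-axis (illustrative choice of T).
theta = np.pi / 6
T = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# W = xy-plane, spanned by the columns e1 and e2.
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Check v in W  =>  T(v) in W for the spanning vectors of W:
for v in W.T:
    image = T @ v
    coeffs, *_ = np.linalg.lstsq(W, image, rcond=None)   # best representation in W
    print(np.allclose(W @ coeffs, image))                # True, True
```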


Trivial examples of invariant subspaces

* \mathbb{R}^n: since T maps every vector in \mathbb{R}^n into \mathbb{R}^n.
* \{\mathbf{0}\}: since a linear map must map \mathbf{0} \mapsto \mathbf{0}.


1-dimensional invariant subspace ''U''

A basis of a 1-dimensional space is simply a non-zero vector \mathbf{v}. Consequently, any vector \mathbf{x} \in U can be represented as \lambda \mathbf{v} where \lambda is a scalar. If we represent T by a matrix A then, for U to be an invariant subspace, it must satisfy
: \forall \mathbf{x} \in U \; \exists \alpha \in \mathbb{R}: \quad A\mathbf{x} = \alpha \mathbf{x}.
We know that \mathbf{x} \in U \Rightarrow \mathbf{x} = \beta \mathbf{v} with \beta \in \mathbb{R}. Therefore, the condition for the existence of a 1-dimensional invariant subspace reduces to
:A\mathbf{v} = \lambda \mathbf{v},
where \lambda is a scalar (in the base field of the vector space). Note that this is the typical formulation of an eigenvalue problem, which means that the span of any eigenvector of A is a 1-dimensional invariant subspace of T.
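For instance, a minimal numpy sketch (with an arbitrarily chosen matrix A) confirms that the span of an eigenvector is mapped into itself:

```python
import numpy as np

# Illustrative matrix A; any eigenvector spans a 1-dimensional invariant subspace.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
lam, v = eigvals[0], eigvecs[:, 0]

# A v = lambda v, so A maps span{v} into span{v}.
print(np.allclose(A @ v, lam * v))   # True
```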


Formal description

An invariant subspace of a linear mapping
:T : V \to V
from some vector space ''V'' to itself is a subspace ''W'' of ''V'' such that ''T''(''W'') is contained in ''W''. An invariant subspace of ''T'' is also said to be ''T''-invariant. If ''W'' is ''T''-invariant, we can restrict ''T'' to ''W'' to arrive at a new linear mapping
: T|_W : W \to W.
This linear mapping is called the restriction of ''T'' on ''W'' and is defined by
: T|_W(\mathbf{w}) = T(\mathbf{w}) \text{ for all } \mathbf{w} \in W.

Next, we give a few immediate examples of invariant subspaces. Certainly ''V'' itself, and the subspace \{0\}, are trivially invariant subspaces for every linear operator ''T'' : ''V'' → ''V''. For certain linear operators there is no ''non-trivial'' invariant subspace; consider for instance a rotation of a two-dimensional real vector space.

Let \mathbf{v} be an eigenvector of ''T'', i.e. T\mathbf{v} = \lambda\mathbf{v}. Then ''W'' = span\{\mathbf{v}\} is ''T''-invariant. As a consequence of the fundamental theorem of algebra, every linear operator on a nonzero finite-dimensional complex vector space has an eigenvector. Therefore, every such linear operator has a non-trivial invariant subspace. The fact that the complex numbers are an algebraically closed field is required here. Comparing with the previous example, one can see that the invariant subspaces of a linear transformation depend on the base field of ''V''.

An invariant vector (i.e. a fixed point of ''T''), other than 0, spans an invariant subspace of dimension 1. An invariant subspace of dimension 1 is acted on by ''T'' by a scalar, and consists of invariant vectors if and only if that scalar is 1.

As the above examples indicate, the invariant subspaces of a given linear transformation ''T'' shed light on the structure of ''T''. When ''V'' is a finite-dimensional vector space over an algebraically closed field, linear transformations acting on ''V'' are characterized (up to similarity) by the Jordan canonical form, which decomposes ''V'' into invariant subspaces of ''T''. Many fundamental questions regarding ''T'' can be translated to questions about invariant subspaces of ''T''.

More generally, invariant subspaces are defined for sets of operators as subspaces invariant for each operator in the set. Let ''L''(''V'') denote the algebra of linear transformations on ''V'', and Lat(''T'') be the family of subspaces invariant under ''T'' ∈ ''L''(''V''). (The "Lat" notation refers to the fact that Lat(''T'') forms a lattice; see discussion below.) Given a nonempty set Σ ⊂ ''L''(''V''), one considers the subspaces invariant under each ''T'' ∈ Σ. In symbols,
:\operatorname{Lat}(\Sigma) = \bigcap_{T \in \Sigma} \operatorname{Lat}(T).
For instance, it is clear that if Σ = ''L''(''V''), then Lat(Σ) = \{\{0\}, V\}.

Given a representation of a group ''G'' on a vector space ''V'', we have a linear transformation ''T''(''g'') : ''V'' → ''V'' for every element ''g'' of ''G''. If a subspace ''W'' of ''V'' is invariant with respect to all these transformations, then it is a subrepresentation and the group ''G'' acts on ''W'' in a natural way.

As another example, let ''T'' ∈ ''L''(''V'') and Σ be the algebra generated by \{1, T\}, where 1 is the identity operator. Then Lat(''T'') = Lat(Σ). Because ''T'' lies in Σ trivially, Lat(Σ) ⊂ Lat(''T''). On the other hand, Σ consists of polynomials in 1 and ''T'', each of which preserves every ''T''-invariant subspace, and therefore the reverse inclusion holds as well.
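This last point can be checked numerically on a small example; in the sketch below, the block upper-triangular matrix T and the polynomial p are illustrative choices, and the point is that a ''T''-invariant subspace is automatically invariant under p(T):

```python
import numpy as np

# Illustrative block upper-triangular T: W = span{e1, e2} is T-invariant.
T = np.array([[1.0, 2.0, 5.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])
W = np.eye(3)[:, :2]                       # columns e1, e2

def lies_in(W, x):
    """True if x lies in the column space of W (up to round-off)."""
    coeffs, *_ = np.linalg.lstsq(W, x, rcond=None)
    return np.allclose(W @ coeffs, x)

# Elements of the algebra generated by {1, T} are polynomials in T,
# e.g. p(T) = 2 T^2 + 3 T + 1; they preserve every T-invariant subspace.
pT = 2 * T @ T + 3 * T + np.eye(3)
print(all(lies_in(W, T @ w)  for w in W.T))   # True: W is in Lat(T)
print(all(lies_in(W, pT @ w) for w in W.T))   # True: W is in Lat(Sigma) as well
```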


Matrix representation

Over a finite-dimensional vector space, every linear transformation ''T'' : ''V'' → ''V'' can be represented by a matrix once a basis of ''V'' has been chosen.

Suppose now ''W'' is a ''T''-invariant subspace. Pick a basis ''C'' = \{\mathbf{v}_1, \ldots, \mathbf{v}_k\} of ''W'' and complete it to a basis ''B'' of ''V''. Then, with respect to this basis, the matrix representation of ''T'' takes the form:
: T = \begin{bmatrix} T_{11} & T_{12} \\ 0 & T_{22} \end{bmatrix}
where the upper-left block T_{11} is the restriction of ''T'' to ''W''. In other words, given an invariant subspace ''W'' of ''T'', ''V'' can be decomposed into the direct sum
:V = W \oplus W'.
Viewing ''T'' as an operator matrix
: T = \begin{bmatrix} T_{11} & T_{12} \\ T_{21} & T_{22} \end{bmatrix} : \begin{matrix} W \\ \oplus \\ W' \end{matrix} \rightarrow \begin{matrix} W \\ \oplus \\ W' \end{matrix},
it is clear that T_{21} : ''W'' → ''W''′ must be zero.

Determining whether a given subspace ''W'' is invariant under ''T'' is ostensibly a problem of geometric nature. Matrix representation allows one to phrase this problem algebraically. The projection operator ''P'' onto ''W'' is defined by ''P''(''w'' + ''w''′) = ''w'', where ''w'' ∈ ''W'' and ''w''′ ∈ ''W''′. The projection ''P'' has matrix representation
: P = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{matrix} W \\ \oplus \\ W' \end{matrix} \rightarrow \begin{matrix} W \\ \oplus \\ W' \end{matrix}.
A straightforward calculation shows that ''W'' = ran ''P'', the range of ''P'', is invariant under ''T'' if and only if ''PTP'' = ''TP''. In other words, a subspace ''W'' being an element of Lat(''T'') is equivalent to the corresponding projection satisfying the relation ''PTP'' = ''TP''.

If ''P'' is a projection (i.e. P^2 = P), then so is 1 − ''P'', where 1 is the identity operator. It follows from the above that ''TP'' = ''PT'' if and only if both ran ''P'' and ran(1 − ''P'') are invariant under ''T''. In that case, ''T'' has matrix representation
: T = \begin{bmatrix} T_{11} & 0 \\ 0 & T_{22} \end{bmatrix} : \begin{matrix} \operatorname{ran} P \\ \oplus \\ \operatorname{ran}(1-P) \end{matrix} \rightarrow \begin{matrix} \operatorname{ran} P \\ \oplus \\ \operatorname{ran}(1-P) \end{matrix} \;.
Colloquially, a projection that commutes with ''T'' "diagonalizes" ''T''.
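The criterion can be illustrated numerically; the sketch below reuses the illustrative T from above, with P projecting onto W = span\{e_1, e_2\} along span\{e_3\}:

```python
import numpy as np

# Same illustrative T as above; P projects onto W = span{e1, e2} along span{e3}.
T = np.array([[1.0, 2.0, 5.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])
P = np.diag([1.0, 1.0, 0.0])

# W = ran P is T-invariant  <=>  P T P = T P.
print(np.allclose(P @ T @ P, T @ P))   # True

# T P = P T would additionally require ran(1 - P) = span{e3} to be invariant,
# which fails here since T e3 = (5, 1, 4) leaves span{e3}.
print(np.allclose(T @ P, P @ T))       # False
```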


Invariant subspace problem

The invariant subspace problem concerns the case where ''V'' is a separable Hilbert space over the complex numbers, of dimension > 1, and ''T'' is a bounded operator. The problem is to decide whether every such ''T'' has a non-trivial, closed, invariant subspace. This problem remains unsolved.

In the more general case where ''V'' is assumed to be a Banach space, there is an example of an operator without an invariant subspace due to Per Enflo (1976). A concrete example of an operator without an invariant subspace was produced in 1985 by Charles Read.


Invariant-subspace lattice

Given a nonempty set Σ ⊂ ''L''(''V''), the subspaces invariant under each element of Σ form a lattice, sometimes called the invariant-subspace lattice of Σ and denoted by Lat(Σ). The lattice operations are defined in a natural way: for a subfamily Σ′ ⊂ Lat(Σ), the ''meet'' operation is defined by
:\bigwedge_{W \in \Sigma'} W = \bigcap_{W \in \Sigma'} W
while the ''join'' operation is defined by
:\bigvee_{W \in \Sigma'} W = \operatorname{span} \bigcup_{W \in \Sigma'} W.
A minimal element of Lat(Σ) is said to be a minimal invariant subspace.
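For subspaces described by spanning columns, the meet and join can be computed explicitly; the helper functions below are an illustrative sketch using scipy, applied to two planes in \mathbb{R}^3:

```python
import numpy as np
from scipy.linalg import null_space, orth

def join(W1, W2):
    """Join: the span of the union of the two subspaces (columns concatenated)."""
    return orth(np.hstack([W1, W2]))

def meet(W1, W2):
    """Meet: the intersection, read off from the null space of [W1 | -W2]."""
    ns = null_space(np.hstack([W1, -W2]))
    k = W1.shape[1]
    return orth(W1 @ ns[:k, :]) if ns.size else np.zeros((W1.shape[0], 0))

# Two planes in R^3 (illustrative): the join is all of R^3, the meet is a line.
W1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])   # xy-plane
W2 = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])   # xz-plane
print(join(W1, W2).shape[1])   # 3
print(meet(W1, W2).shape[1])   # 1  (the x-axis)
```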


Fundamental theorem of noncommutative algebra

Just as the fundamental theorem of algebra ensures that every linear transformation acting on a finite-dimensional complex vector space has a nontrivial invariant subspace, the ''fundamental theorem of noncommutative algebra'' asserts that Lat(Σ) contains nontrivial elements for certain Σ.

Theorem (Burnside): Assume ''V'' is a complex vector space of finite dimension. For every proper subalgebra Σ of ''L''(''V''), Lat(Σ) contains a nontrivial element.

Burnside's theorem is of fundamental importance in linear algebra. One consequence is that every commuting family in ''L''(''V'') can be simultaneously upper-triangularized. A nonempty set Σ ⊂ ''L''(''V'') is said to be triangularizable if there exists a basis \{e_1, \ldots, e_n\} of ''V'' such that
:\operatorname{span}\{e_1, \ldots, e_k\} \in \operatorname{Lat}(\Sigma) \ \text{ for all } k \geq 1 \;.
In other words, Σ is triangularizable if there exists a basis such that every element of Σ has an upper-triangular matrix representation in that basis. It follows from Burnside's theorem that every commutative algebra Σ in ''L''(''V'') is triangularizable. Hence every commuting family in ''L''(''V'') can be simultaneously upper-triangularized.
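A numerical sketch of simultaneous triangularization (illustrative: A is arbitrary and B is chosen as a polynomial in A, so that the two commute) uses a complex Schur decomposition of A; both operators become upper triangular in the resulting orthonormal basis, i.e. the nested spans of the Schur vectors all lie in Lat(\{A, B\}):

```python
import numpy as np
from scipy.linalg import schur

# Illustrative commuting pair: A arbitrary, B a polynomial in A.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = A @ A - 2.0 * A + 3.0 * np.eye(4)      # B = A^2 - 2A + 3*1 commutes with A

# A complex Schur decomposition A = Q T_A Q* provides an orthonormal basis
# (the columns of Q) in which A is upper triangular; B, being a polynomial
# in A, becomes upper triangular in the same basis.
T_A, Q = schur(A, output='complex')
T_B = Q.conj().T @ B @ Q

print(np.allclose(np.tril(T_A, -1), 0))    # True
print(np.allclose(np.tril(T_B, -1), 0))    # True
```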


Left ideals

If ''A'' is an algebra, one can define a ''left regular representation'' Φ on ''A'': Φ(''a'')''b'' = ''ab'' is a homomorphism from ''A'' to ''L''(''A''), the algebra of linear transformations on ''A''. The invariant subspaces of Φ are precisely the left ideals of ''A''. A left ideal ''M'' of ''A'' gives a subrepresentation of ''A'' on ''M''.

If ''M'' is a left ideal of ''A'', then the left regular representation Φ on ''M'' descends to a representation Φ′ on the quotient vector space ''A''/''M''. If [''b''] denotes an equivalence class in ''A''/''M'', then Φ′(''a'')[''b''] = [''ab'']. The kernel of the representation Φ′ is the set \{a \in A \mid aA \subset M\}.

The representation Φ′ is irreducible if and only if ''M'' is a maximal left ideal, since a subspace ''V'' ⊂ ''A''/''M'' is invariant under \{\Phi'(a) \mid a \in A\} if and only if its preimage under the quotient map, ''V'' + ''M'', is a left ideal in ''A''.
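As a small illustration, take ''A'' to be the algebra of 2×2 upper-triangular real matrices; the elements whose first column vanishes form a left ideal ''M'', hence an invariant subspace of the left regular representation Φ (the choice of algebra and ideal here is purely illustrative):

```python
import numpy as np

# A = algebra of 2x2 upper-triangular real matrices, with basis E11, E12, E22.
E11 = np.array([[1.0, 0.0], [0.0, 0.0]])
E12 = np.array([[0.0, 1.0], [0.0, 0.0]])
E22 = np.array([[0.0, 0.0], [0.0, 1.0]])
basis_A = [E11, E12, E22]

# M = span{E12, E22}: the upper-triangular matrices whose first column vanishes.
basis_M = [E12, E22]

def in_M(x):
    """Membership test for M: the first column must be zero."""
    return np.allclose(x[:, 0], 0)

# Phi(a)b = a @ b is the left regular representation; M is a left ideal,
# so it is an invariant subspace of Phi.
print(all(in_M(a @ m) for a in basis_A for m in basis_M))   # True
```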


Almost-invariant halfspaces

Related to invariant subspaces are so-called almost-invariant halfspaces (AIHS's). A closed subspace Y of a Banach space X is said to be almost-invariant under an operator T \in \mathcal{B}(X) if TY \subseteq Y + E for some finite-dimensional subspace E; equivalently, Y is almost-invariant under T if there is a finite-rank operator F \in \mathcal{B}(X) such that (T+F)Y \subseteq Y, i.e. if Y is invariant (in the usual sense) under T+F. In this case, the minimum possible dimension of E (or rank of F) is called the defect.

Clearly, every finite-dimensional and every finite-codimensional subspace is almost-invariant under every operator. Thus, to make things nontrivial, we say that Y is a halfspace whenever it is a closed subspace of infinite dimension and infinite codimension.

The AIHS problem asks whether every operator admits an AIHS. In the complex setting it has already been solved; that is, if X is a complex infinite-dimensional Banach space and T \in \mathcal{B}(X), then T admits an AIHS of defect at most 1. It is not currently known whether the same holds if X is a real Banach space. However, some partial results have been established: for instance, any self-adjoint operator on an infinite-dimensional real Hilbert space admits an AIHS, as does any strictly singular (or compact) operator acting on a real infinite-dimensional reflexive space.


See also

* Invariant manifold

