Triple System
In algebra, a triple system (or ternar) is a vector space ''V'' over a field F together with an F-trilinear map
: (\cdot,\cdot,\cdot) \colon V\times V \times V\to V.
The most important examples are Lie triple systems and Jordan triple systems. They were introduced by Nathan Jacobson in 1949 to study subspaces of associative algebras closed under triple commutators [[''u'', ''v''], ''w''] and triple anticommutators {{''u'', ''v''}, ''w''}. In particular, any Lie algebra defines a Lie triple system and any Jordan algebra defines a Jordan triple system. They are important in the theories of symmetric spaces, particularly Hermitian symmetric spaces and their generalizations (symmetric R-spaces and their noncompact duals).

Lie triple systems
A triple system is said to be a ''Lie triple system'' if the trilinear map, denoted [\cdot,\cdot,\cdot], satisfies the following identities:
: [u,v,w] = -[v,u,w]
: [u,v,w] + [v,w,u] + [w,u,v] = 0
: [u,v,[w,x,y]] = [[u,v,w],x,y] + [w,[u,v,x],y] + [w,x,[u,v,y]].
The first two identities abstract the skew symmetry and Jacobi identity of the triple commutator, while the third identity means that the linear map L_{u,v} \colon V \to V, defined by L_{u,v}(w) = [u,v,w], is a derivation of the triple product.
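As a quick illustration of the first example above, the following Python sketch (an illustrative check, not part of the article; the helper names are chosen here) builds the triple bracket [u,v,w] := [[u,v],w] from matrix commutators and numerically verifies the three Lie triple system identities on random 3×3 matrices.

import numpy as np

def comm(a, b):
    # Matrix commutator [a, b] = ab - ba (the Lie bracket of gl(3)).
    return a @ b - b @ a

def triple(u, v, w):
    # Triple bracket [u, v, w] := [[u, v], w] induced by a Lie algebra.
    return comm(comm(u, v), w)

rng = np.random.default_rng(0)
u, v, w, x, y = (rng.standard_normal((3, 3)) for _ in range(5))

# Identity 1: skew symmetry in the first two arguments.
assert np.allclose(triple(u, v, w), -triple(v, u, w))

# Identity 2: the Jacobi-type cyclic identity.
assert np.allclose(triple(u, v, w) + triple(v, w, u) + triple(w, u, v), 0)

# Identity 3: L_{u,v} = [u, v, .] is a derivation of the triple product.
lhs = triple(u, v, triple(w, x, y))
rhs = (triple(triple(u, v, w), x, y)
       + triple(w, triple(u, v, x), y)
       + triple(w, x, triple(u, v, y)))
assert np.allclose(lhs, rhs)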



Algebra
Algebra () is one of the broad areas of mathematics. Roughly speaking, algebra is the study of mathematical symbols and the rules for manipulating these symbols in formulas; it is a unifying thread of almost all of mathematics. Elementary algebra deals with the manipulation of variables (commonly represented by Roman letters) as if they were numbers and is therefore essential in all applications of mathematics. Abstract algebra is the name given, mostly in education, to the study of algebraic structures such as groups, rings, and fields (the term is no longer in common use outside educational contexts). Linear algebra, which deals with linear equations and linear mappings, is used for modern presentations of geometry, and has many practical applications (in weather forecasting, for example). There are many areas of mathematics that belong to algebra, some having "algebra" in their name, such as commutative algebra, and some not, such as Galois theory. The word ''alge ...



Vector Space
In mathematics and physics, a vector space (also called a linear space) is a set whose elements, often called '' vectors'', may be added together and multiplied ("scaled") by numbers called ''scalars''. Scalars are often real numbers, but can be complex numbers or, more generally, elements of any field. The operations of vector addition and scalar multiplication must satisfy certain requirements, called ''vector axioms''. The terms real vector space and complex vector space are often used to specify the nature of the scalars: real coordinate space or complex coordinate space. Vector spaces generalize Euclidean vectors, which allow modeling of physical quantities, such as forces and velocity, that have not only a magnitude, but also a direction. The concept of vector spaces is fundamental for linear algebra, together with the concept of matrix, which allows computing in vector spaces. This provides a concise and synthetic way for manipulating and studying systems of linea ...
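As a small concrete check (an illustrative sketch using NumPy arrays, which are not mentioned in the excerpt), the vectors of R^3 with the usual addition and real scalar multiplication satisfy the vector axioms; a few of them are verified below.

import numpy as np

u = np.array([1.0, -2.0, 0.5])
v = np.array([3.0, 0.0, 4.0])
a, b = 2.0, -1.5

assert np.allclose(u + v, v + u)                 # addition is commutative
assert np.allclose(a * (u + v), a * u + a * v)   # scalars distribute over vector addition
assert np.allclose((a + b) * u, a * u + b * u)   # scalars distribute over field addition
assert np.allclose((a * b) * u, a * (b * u))     # compatibility of scalar multiplication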


Multilinear Map
In linear algebra, a multilinear map is a function of several variables that is linear separately in each variable. More precisely, a multilinear map is a function
: f\colon V_1 \times \cdots \times V_n \to W,
where V_1,\ldots,V_n and W are vector spaces (or modules over a commutative ring), with the following property: for each i, if all of the variables but v_i are held constant, then f(v_1, \ldots, v_i, \ldots, v_n) is a linear function of v_i. A multilinear map of one variable is a linear map, and of two variables is a bilinear map. More generally, a multilinear map of ''k'' variables is called a ''k''-linear map. If the codomain of a multilinear map is the field of scalars, it is called a multilinear form. Multilinear maps and multilinear forms are fundamental objects of study in multilinear algebra. If all variables belong to the same space, one can consider symmetric, antisymmetric and alternating ''k''-linear maps. The latter coincide if the underlying rin ...
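A concrete 3-linear (trilinear) form, relevant to the triple systems above, is the determinant of a 3×3 matrix viewed as a function of its three columns; this standard example is not named in the excerpt, and the short Python sketch below only checks linearity in the first slot (the other two slots behave the same way).

import numpy as np

def det3(v1, v2, v3):
    # Determinant of the matrix with columns v1, v2, v3: a trilinear form on R^3.
    return float(np.linalg.det(np.column_stack([v1, v2, v3])))

rng = np.random.default_rng(1)
u1, u2, v, w = (rng.standard_normal(3) for _ in range(4))
a, b = 2.0, -0.75

# Holding v and w constant, det3 is linear in its first argument.
assert np.isclose(det3(a * u1 + b * u2, v, w),
                  a * det3(u1, v, w) + b * det3(u2, v, w))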


Nathan Jacobson
Nathan Jacobson (October 5, 1910 – December 5, 1999) was an American mathematician. Biography Born Nachman Arbiser in Warsaw, Jacobson emigrated to America with his family in 1918. He graduated from the University of Alabama in 1930 and was awarded a doctorate in mathematics from Princeton University in 1934. While working on his thesis, ''Non-commutative polynomials and cyclic algebras'', he was advised by Joseph Wedderburn. Jacobson taught and researched at Bryn Mawr College (1935–1936), the University of Chicago (1936–1937), the University of North Carolina at Chapel Hill (1937–1943), and Johns Hopkins University (1943–1947) before joining Yale University in 1947. He remained at Yale until his retirement. He was a member of the National Academy of Sciences and the American Academy of Arts and Sciences. He served as president of the American Mathematical Society from 1971 to 1973, and was awarded their highest honour, the Leroy P. Steele prize for lifetime achievemen ...


Associative Algebra
In mathematics, an associative algebra ''A'' is an algebraic structure with compatible operations of addition, multiplication (assumed to be associative), and a scalar multiplication by elements in some field ''K''. The addition and multiplication operations together give ''A'' the structure of a ring; the addition and scalar multiplication operations together give ''A'' the structure of a vector space over ''K''. In this article we will also use the term ''K''-algebra to mean an associative algebra over the field ''K''. A standard first example of a ''K''-algebra is a ring of square matrices over a field ''K'', with the usual matrix multiplication. A commutative algebra is an associative algebra that has a commutative multiplication, or, equivalently, an associative algebra that is also a commutative ring. In this article associative algebras are assumed to have a multiplicative identity, denoted 1; they are sometimes called unital associative algebras for clarification. ...
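The ring of square matrices mentioned above can be checked directly; the following sketch (illustrative only, using 2×2 real matrices) verifies associativity, distributivity, the compatibility of scalar multiplication with the product, and the multiplicative identity.

import numpy as np

rng = np.random.default_rng(2)
A, B, C = (rng.standard_normal((2, 2)) for _ in range(3))
k = 3.0
I = np.eye(2)

assert np.allclose((A @ B) @ C, A @ (B @ C))      # multiplication is associative
assert np.allclose(A @ (B + C), A @ B + A @ C)    # and distributes over addition
assert np.allclose(k * (A @ B), (k * A) @ B)      # scalars move through the product ...
assert np.allclose(k * (A @ B), A @ (k * B))      # ... on either side
assert np.allclose(I @ A, A) and np.allclose(A @ I, A)   # unital: 1 = identity matrix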


Skew Symmetry
In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition A^\textsf{T} = -A. In terms of the entries of the matrix, if a_{ij} denotes the entry in the ''i''-th row and ''j''-th column, then the skew-symmetric condition is equivalent to a_{ji} = -a_{ij}.
Example
The matrix
: A = \begin{bmatrix} 0 & 2 & -45 \\ -2 & 0 & -4 \\ 45 & 4 & 0 \end{bmatrix}
is skew-symmetric because
: -A = \begin{bmatrix} 0 & -2 & 45 \\ 2 & 0 & 4 \\ -45 & -4 & 0 \end{bmatrix} = A^\textsf{T}.
Properties
Throughout, we assume that all matrix entries belong to a field \mathbb{F} whose characteristic is not equal to 2. That is, we assume that 1 + 1 \neq 0, where 1 denotes the multiplicative identity and 0 the additive identity of the given field. If the characteristic of the field is 2, then a skew-symmetric matrix is the same thing as a symmetric matrix.
* The sum of two skew-symmetric matrices is skew-symmetric.
* A sca ...
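The defining condition and the closure property quoted above can be verified numerically; the sketch below (illustrative, with a second matrix B chosen here for the sum) uses the example matrix from the excerpt.

import numpy as np

# The example matrix A from the excerpt.
A = np.array([[  0.0,  2.0, -45.0],
              [ -2.0,  0.0,  -4.0],
              [ 45.0,  4.0,   0.0]])

assert np.allclose(A.T, -A)   # the transpose equals the negative

# The sum of two skew-symmetric matrices is again skew-symmetric.
B = np.array([[ 0.0,  1.0, -3.0],
              [-1.0,  0.0,  7.0],
              [ 3.0, -7.0,  0.0]])
S = A + B
assert np.allclose(S.T, -S)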


Jacobi Identity
In mathematics, the Jacobi identity is a property of a binary operation that describes how the order of evaluation, the placement of parentheses in a multiple product, affects the result of the operation. By contrast, for operations with the associative property, any order of evaluation gives the same result (parentheses in a multiple product are not needed). The identity is named after the German mathematician Carl Gustav Jacob Jacobi. The cross product a\times b and the Lie bracket operation [a, b] both satisfy the Jacobi identity. In analytical mechanics, the Jacobi identity is satisfied by the Poisson brackets. In quantum mechanics, it is satisfied by operator commutators on a Hilbert space and equivalently in the phase space formulation of quantum mechanics by the Moyal bracket.
Definition
Let + and \times be two binary operations, and let 0 be the neutral element for +. The Jacobi identity is
: x \times (y \times z) \ +\ y \times (z \times x) \ +\ z \times (x \times y)\ =\ 0.
Notic ...
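Both examples mentioned in the excerpt can be checked numerically; the sketch below (illustrative only) verifies the Jacobi identity for the cross product on R^3 and for the matrix commutator [a, b] = ab - ba.

import numpy as np

rng = np.random.default_rng(3)

# Cross product on R^3.
x, y, z = (rng.standard_normal(3) for _ in range(3))
jac_cross = (np.cross(x, np.cross(y, z))
             + np.cross(y, np.cross(z, x))
             + np.cross(z, np.cross(x, y)))
assert np.allclose(jac_cross, 0)

# Commutator bracket on 3x3 matrices.
def comm(a, b):
    return a @ b - b @ a

X, Y, Z = (rng.standard_normal((3, 3)) for _ in range(3))
jac_comm = comm(X, comm(Y, Z)) + comm(Y, comm(Z, X)) + comm(Z, comm(X, Y))
assert np.allclose(jac_comm, 0)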




Derivation (algebra)
In mathematics, a derivation is a function on an algebra which generalizes certain features of the derivative operator. Specifically, given an algebra ''A'' over a ring or a field ''K'', a ''K''-derivation is a ''K''-linear map D \colon A \to A that satisfies Leibniz's law:
: D(ab) = a D(b) + D(a) b.
More generally, if ''M'' is an ''A''-bimodule, a ''K''-linear map D \colon A \to M that satisfies the Leibniz law is also called a derivation. The collection of all ''K''-derivations of ''A'' to itself is denoted by Der''K''(''A''). The collection of ''K''-derivations of ''A'' into an ''A''-module ''M'' is denoted by Der''K''(''A'', ''M''). Derivations occur in many different contexts in diverse areas of mathematics. The partial derivative with respect to a variable is an R-derivation on the algebra of real-valued differentiable functions on R''n''. The Lie derivative with respect to a vector field is an R-derivation on the algebra of differentiable functions on a differentiable manifold; more generally it is a derivation on the ...
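The first example in the excerpt, the partial derivative as an R-derivation, can be checked symbolically; the SymPy sketch below (illustrative, with sample functions chosen here) verifies linearity and the Leibniz law for D = ∂/∂x on smooth functions of x and y.

import sympy as sp

x, y, c1, c2 = sp.symbols('x y c1 c2')
a = sp.sin(x) * y        # two sample smooth functions on R^2
b = x**2 + sp.exp(y)

def D(f):
    # D = partial derivative with respect to x.
    return sp.diff(f, x)

# Leibniz law: D(ab) = a D(b) + D(a) b
assert sp.simplify(D(a * b) - (a * D(b) + D(a) * b)) == 0

# Linearity (c1, c2 are constants with respect to x).
assert sp.simplify(D(c1 * a + c2 * b) - (c1 * D(a) + c2 * D(b))) == 0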


Symmetric Space
In mathematics, a symmetric space is a Riemannian manifold (or more generally, a pseudo-Riemannian manifold) whose group of symmetries contains an inversion symmetry about every point. This can be studied with the tools of Riemannian geometry, leading to consequences in the theory of holonomy; or algebraically through Lie theory, which allowed Cartan to give a complete classification. Symmetric spaces commonly occur in differential geometry, representation theory and harmonic analysis. In geometric terms, a complete, simply connected Riemannian manifold is a symmetric space if and only if its curvature tensor is invariant under parallel transport. More generally, a Riemannian manifold (''M'', ''g'') is said to be symmetric if and only if, for each point ''p'' of ''M'', there exists an isometry of ''M'' fixing ''p'' and acting on the tangent space T_pM as minus the identity (every symmetric space is complete, since any geodesic can be extended indefinitely via symmetries about ...
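As a concrete instance of the definition just quoted (the round sphere is a standard symmetric space, although it is not named in the excerpt), the Python sketch below checks that the linear map s_p(x) = 2(x·p)p - x restricts to an isometry of the unit sphere S^2 that fixes the point p and acts as minus the identity on the tangent plane at p.

import numpy as np

p = np.array([0.0, 0.0, 1.0])             # base point on the unit sphere S^2
A = 2.0 * np.outer(p, p) - np.eye(3)      # matrix of s_p(x) = 2(x.p)p - x

assert np.allclose(A.T @ A, np.eye(3))    # orthogonal, hence an isometry preserving S^2
assert np.allclose(A @ p, p)              # fixes the base point p

# Acts as minus the identity on T_p S^2 = {v : v.p = 0}.
for v in (np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])):
    assert np.allclose(A @ v, -v)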