Irreducible (representation theory)
In mathematics, specifically in the representation theory of groups and algebras, an irreducible representation (\rho, V), or irrep, of an algebraic structure A is a nonzero representation that has no proper nontrivial subrepresentation (\rho|_W, W), with W \subset V closed under the action of \{\rho(a) : a \in A\}. Every finite-dimensional unitary representation on a Hilbert space V is the direct sum of irreducible representations. Irreducible representations are always indecomposable (i.e. they cannot be decomposed further into a direct sum of representations), but the converse need not hold: for example, the two-dimensional representation of the real numbers acting by upper triangular unipotent matrices is indecomposable but reducible. History Group representation theory was generalized by Richard Brauer from the 1940s to give modular representation theory, in which the matrix operators act on a vector space over a field K of arbitrary characteristic, rather than a vector space over the field of real numbers ...
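
A minimal numerical sketch of the example above (assuming NumPy; the function name rho is illustrative): the representation \rho(t) = \begin{bmatrix} 1 & t \\ 0 & 1 \end{bmatrix} of (\mathbb{R}, +) has the line spanned by e_1 as an invariant subspace, so it is reducible, yet it admits no complementary invariant line, so it is indecomposable.

import numpy as np

def rho(t):
    # Upper triangular unipotent matrix acting on R^2.
    return np.array([[1.0, t],
                     [0.0, 1.0]])

s, t = 1.7, -0.4
# Homomorphism property: rho(s) rho(t) = rho(s + t).
assert np.allclose(rho(s) @ rho(t), rho(s + t))

# The line spanned by e1 = (1, 0) is a nontrivial invariant subspace,
# so the representation is reducible:
e1 = np.array([1.0, 0.0])
assert np.allclose(rho(t) @ e1, e1)   # e1 is fixed, hence its span is invariant

# But rho(t) applied to any vector with nonzero second component picks up
# an e1 component, so no complementary invariant line exists and the
# representation does not split as a direct sum (it is indecomposable).
v = np.array([0.0, 1.0])
print(rho(t) @ v)   # [t, 1] -- not a multiple of v for t != 0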



Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in the case of abstraction from nature, some basic properties that are considered true starting points of ...



Vector Space
In mathematics and physics, a vector space (also called a linear space) is a set whose elements, often called ''vectors'', may be added together and multiplied ("scaled") by numbers called ''scalars''. Scalars are often real numbers, but can be complex numbers or, more generally, elements of any field. The operations of vector addition and scalar multiplication must satisfy certain requirements, called ''vector axioms''. The terms real vector space and complex vector space are often used to specify the nature of the scalars: real coordinate space or complex coordinate space. Vector spaces generalize Euclidean vectors, which allow modeling of physical quantities, such as forces and velocity, that have not only a magnitude, but also a direction. The concept of vector spaces is fundamental for linear algebra, together with the concept of matrix, which allows computing in vector spaces. This provides a concise and synthetic way for manipulating and studying systems of linear eq ...
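
As a small illustrative sketch (assuming NumPy; the vectors and scalars are arbitrary), the plane \mathbb{R}^2 with componentwise addition and real scalar multiplication satisfies the vector-space operations described above:

import numpy as np

# Two vectors in R^2 and two real scalars (arbitrary choices for illustration).
u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
a, b = 2.0, -1.5

# A few of the vector axioms, checked numerically:
assert np.allclose(u + v, v + u)                 # commutativity of addition
assert np.allclose(a * (u + v), a * u + a * v)   # distributivity over vector addition
assert np.allclose((a + b) * u, a * u + b * u)   # distributivity over scalar addition
assert np.allclose(1.0 * u, u)                   # identity element of scalar multiplication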



Block Matrix
In mathematics, a block matrix or a partitioned matrix is a matrix that is ''interpreted'' as having been broken into sections called blocks or submatrices. Intuitively, a matrix interpreted as a block matrix can be visualized as the original matrix with a collection of horizontal and vertical lines, which break it up, or partition it, into a collection of smaller matrices. Any matrix may be interpreted as a block matrix in one or more ways, with each interpretation defined by how its rows and columns are partitioned. This notion can be made more precise for an n by m matrix M by partitioning n into a collection \text{rowgroups}, and then partitioning m into a collection \text{colgroups}. The original matrix is then considered as the "total" of these groups, in the sense that the (i, j) entry of the original matrix corresponds in a 1-to-1 way with some (s, t) offset entry of some (x, y), where x \in \text{rowgroups} and y \in \text{colgroups}. Block matrix algebra arises in general from biproducts in categories of matrices ...
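
A brief sketch of the partitioning idea (assuming NumPy; the block sizes are arbitrary): a 4×4 matrix is cut by one horizontal and one vertical line into four 2×2 blocks, then reassembled with np.block.

import numpy as np

# A 4x4 matrix, viewed as a block matrix with four 2x2 blocks.
M = np.arange(16).reshape(4, 4)

# Partition rows into {0,1} and {2,3}, and columns likewise.
A = M[:2, :2]   # top-left block
B = M[:2, 2:]   # top-right block
C = M[2:, :2]   # bottom-left block
D = M[2:, 2:]   # bottom-right block

# Reassembling the blocks recovers the original matrix.
assert np.array_equal(np.block([[A, B], [C, D]]), M)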


Diagonal Matrix
In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is \left[\begin{smallmatrix} 3 & 0 \\ 0 & 2 \end{smallmatrix}\right], while an example of a 3×3 diagonal matrix is \left[\begin{smallmatrix} 6 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{smallmatrix}\right]. An identity matrix of any size, or any multiple of it (a scalar matrix), is a diagonal matrix. A diagonal matrix is sometimes called a scaling matrix, since matrix multiplication with it results in changing scale (size). Its determinant is the product of its diagonal values. Definition As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix D = (d_{i,j}) with ''n'' columns and ''n'' rows is diagonal if \forall i, j \in \{1, 2, \ldots, n\}, i \ne j \implies d_{i,j} = 0. However, the main diagonal entries are unrestricted. The term ''diagonal matrix'' may s ...
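
A short numerical sketch (assuming NumPy) of the 2×2 example above, checking that the determinant is the product of the diagonal entries and that multiplication by a diagonal matrix rescales:

import numpy as np

# The 2x2 diagonal matrix from the example above.
D = np.diag([3.0, 2.0])           # [[3, 0], [0, 2]]

# The determinant is the product of the diagonal values: 3 * 2 = 6.
assert np.allclose(np.linalg.det(D), 3.0 * 2.0)

# Multiplying on the left by a diagonal matrix scales the rows.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
print(D @ A)   # [[3, 3], [2, 2]] -- first row scaled by 3, second by 2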



Matrix Diagonalization
In linear algebra, a square matrix A is called diagonalizable or non-defective if it is similar to a diagonal matrix, i.e., if there exists an invertible matrix P and a diagonal matrix D such that P^{-1}AP = D, or equivalently A = PDP^{-1}. (Such D are not unique.) For a finite-dimensional vector space V, a linear map T:V\to V is called diagonalizable if there exists an ordered basis of V consisting of eigenvectors of T. These definitions are equivalent: if T has a matrix representation T = PDP^{-1} as above, then the column vectors of P form a basis consisting of eigenvectors of T, and the diagonal entries of D are the corresponding eigenvalues of T; with respect to this eigenvector basis, A is represented by D. Diagonalization is the process of finding the above P and D. Diagonalizable matrices and maps are especially easy for computations, once their eigenvalues and eigenvectors are known. One can raise a diagonal matrix D to a power by simply raising the diagonal entries to that power, and the determina ...
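
A minimal sketch (assuming NumPy; the matrix is a diagonalizable example chosen for illustration) of computing P and D and using them to raise A to a power:

import numpy as np

# A small diagonalizable matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors; D holds the corresponding eigenvalues.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Raising A to a power reduces to raising the diagonal entries to that power.
k = 5
A_pow = P @ np.diag(eigvals ** k) @ np.linalg.inv(P)
assert np.allclose(A_pow, np.linalg.matrix_power(A, k))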


Matrix Similarity
In linear algebra, two ''n''-by-''n'' matrices A and B are called similar if there exists an invertible ''n''-by-''n'' matrix P such that B = P^{-1} A P . Similar matrices represent the same linear map under two (possibly) different bases, with P being the change of basis matrix. A transformation A \mapsto P^{-1} A P is called a similarity transformation or conjugation of the matrix A. In the general linear group, similarity is therefore the same as conjugacy, and similar matrices are also called conjugate; however, in a given subgroup H of the general linear group, the notion of conjugacy may be more restrictive than similarity, since it requires that P be chosen to lie in H. Motivating example When defining a linear transformation, it can be the case that a change of basis can result in a simpler form of the same transformation. For example, the matrix representing a rotation in \mathbb{R}^3 when the axis of rotation is not aligned with the coordinate axis can be complicated to compute. If the axis of rotation were ...
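
A small numerical sketch (assuming NumPy; the matrices are arbitrary examples) showing that B = P^{-1} A P shares the eigenvalues, trace, and determinant of A:

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])          # any invertible change-of-basis matrix

B = np.linalg.inv(P) @ A @ P        # B is similar to A

# Similar matrices represent the same linear map in different bases,
# so they share eigenvalues, trace, and determinant.
assert np.allclose(np.sort(np.linalg.eigvals(A)), np.sort(np.linalg.eigvals(B)))
assert np.allclose(np.trace(A), np.trace(B))
assert np.allclose(np.linalg.det(A), np.linalg.det(B))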


Group Homomorphism
In mathematics, given two groups, (''G'', ∗) and (''H'', ·), a group homomorphism from (''G'', ∗) to (''H'', ·) is a function ''h'' : ''G'' → ''H'' such that for all ''u'' and ''v'' in ''G'' it holds that : h(u*v) = h(u) \cdot h(v) where the group operation on the left side of the equation is that of ''G'' and on the right side that of ''H''. From this property, one can deduce that ''h'' maps the identity element ''eG'' of ''G'' to the identity element ''eH'' of ''H'', : h(e_G) = e_H , and it also maps inverses to inverses in the sense that : h\left(u^{-1}\right) = h(u)^{-1}. Hence one can say that ''h'' "is compatible with the group structure". Older notations for the homomorphism ''h''(''x'') may be x_h or x^h, though this may be confused as an index or a general subscript. In automata theory, sometimes homomorphisms are written to the right of their arguments without parentheses, so that ''h''(''x'') becomes simply xh. In areas of mathematics where one ...
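
A tiny sketch in Python (the exponential map is chosen purely as an illustration): exp is a group homomorphism from (R, +) to the positive reals under multiplication, since exp(u + v) = exp(u) · exp(v).

import math

# h = exp carries the group operation + of (R, +) to the operation * of (R_{>0}, *).
def h(x):
    return math.exp(x)

u, v = 1.3, -0.7
assert math.isclose(h(u + v), h(u) * h(v))   # h(u * v) = h(u) . h(v), with * being + here

# It maps the identity to the identity and inverses to inverses:
assert math.isclose(h(0.0), 1.0)             # e_G = 0 maps to e_H = 1
assert math.isclose(h(-u), 1.0 / h(u))       # h(u^{-1}) = h(u)^{-1}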




Identity Matrix
In linear algebra, the identity matrix of size n is the n\times n square matrix with ones on the main diagonal and zeros elsewhere. Terminology and notation The identity matrix is often denoted by I_n, or simply by I if the size is immaterial or can be trivially determined by the context. I_1 = \begin{bmatrix} 1 \end{bmatrix} ,\ I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} ,\ I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} ,\ \dots ,\ I_n = \begin{bmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{bmatrix}. The term unit matrix has also been widely used, but the term ''identity matrix'' is now standard. The term ''unit matrix'' is ambiguous, because it is also used for a matrix of ones and for any unit of the ring of all n\times n matrices. In some fields, such as group theory or quantum mechanics, the identity matrix is sometimes denoted by a boldface one, \mathbf{1}, or called "id" (short for identity). ...
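
A quick check (assuming NumPy) that the identity matrix leaves matrices unchanged under multiplication:

import numpy as np

I3 = np.eye(3)                      # the 3x3 identity matrix I_3
A = np.arange(9.0).reshape(3, 3)    # an arbitrary 3x3 matrix

# Multiplying by the identity matrix on either side changes nothing.
assert np.allclose(I3 @ A, A)
assert np.allclose(A @ I3, A)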


Identity Element
In mathematics, an identity element, or neutral element, of a binary operation operating on a set is an element of the set that leaves unchanged every element of the set when the operation is applied. This concept is used in algebraic structures such as groups and rings. The term ''identity element'' is often shortened to ''identity'' (as in the case of additive identity and multiplicative identity) when there is no possibility of confusion, but the identity implicitly depends on the binary operation it is associated with. Definitions Let ''S'' be a set equipped with a binary operation ∗. Then an element ''e'' of ''S'' is called a ''left identity'' if e ∗ s = s for all ''s'' in ''S'', and a ''right identity'' if s ∗ e = s for all ''s'' in ''S''. If ''e'' is both a left identity and a right identity, then it is called a ''two-sided identity'', or simply an ''identity''. An identity with respect to addition is called an ''additive identity'' (often denoted as 0) and an identity with respect to multiplication is called a ''multiplicative identity'' (often denoted as 1). These need not be ordinary additi ...
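
A small sketch in Python (string concatenation is chosen here purely as an illustration): the empty string is a two-sided identity for concatenation, just as 0 is for addition and 1 for multiplication.

# The empty string is both a left and a right identity for concatenation.
e = ""
for s in ["abc", "", "identity"]:
    assert e + s == s     # left identity:  e * s = s
    assert s + e == s     # right identity: s * e = s

# Familiar numeric identities: 0 for addition, 1 for multiplication.
for x in [0, 1, -7, 3.5]:
    assert 0 + x == x and x + 0 == x
    assert 1 * x == x and x * 1 == x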



Matrix Multiplication
In mathematics, particularly in linear algebra, matrix multiplication is a binary operation that produces a matrix from two matrices. For matrix multiplication, the number of columns in the first matrix must be equal to the number of rows in the second matrix. The resulting matrix, known as the matrix product, has the number of rows of the first and the number of columns of the second matrix. The product of matrices ''A'' and ''B'' is denoted as ''AB''. Matrix multiplication was first described by the French mathematician Jacques Philippe Marie Binet in 1812, to represent the composition of linear maps that are represented by matrices. Matrix multiplication is thus a basic tool of linear algebra, and as such has numerous applications in many areas of mathematics, as well as in applied mathematics, statistics, physics, economics, and engineering. Computing matrix products is a central operation in all computational applications of linear algebra. Notation This article will use the following notati ...
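
A brief sketch (assuming NumPy; the shapes are arbitrary) of the shape rule above: a 2×3 matrix times a 3×4 matrix yields a 2×4 product, and each entry is a row-by-column dot product.

import numpy as np

A = np.arange(6.0).reshape(2, 3)    # 2x3: columns of A ...
B = np.arange(12.0).reshape(3, 4)   # 3x4: ... must match rows of B

C = A @ B                           # the matrix product AB
assert C.shape == (2, 4)            # rows of A by columns of B

# Entry (i, j) is the dot product of row i of A with column j of B.
i, j = 1, 2
assert np.isclose(C[i, j], A[i, :] @ B[:, j])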



General Linear Group
In mathematics, the general linear group of degree ''n'' is the set of ''n''×''n'' invertible matrices, together with the operation of ordinary matrix multiplication. This forms a group, because the product of two invertible matrices is again invertible, and the inverse of an invertible matrix is invertible, with the identity matrix as the identity element of the group. The group is so named because the columns (and also the rows) of an invertible matrix are linearly independent, hence the vectors/points they define are in general linear position, and matrices in the general linear group take points in general linear position to points in general linear position. To be more precise, it is necessary to specify what kind of objects may appear in the entries of the matrix. For example, the general linear group over R (the set of real numbers) is the group of invertible matrices of real numbers, and is denoted by GL''n''(R) or GL(''n'', R). More generally, the general linear group of degree ''n'' over any ...
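
A quick numerical sketch (assuming NumPy; the matrices are arbitrary invertible examples) of the group properties mentioned above for GL_2(R):

import numpy as np

# Two invertible 2x2 real matrices (nonzero determinant).
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.array([[3.0, 0.0],
              [1.0, 1.0]])
assert np.linalg.det(A) != 0 and np.linalg.det(B) != 0

# The product of invertible matrices is invertible (det(AB) = det(A) det(B) != 0),
# the inverse of an invertible matrix is invertible, and I_2 is the identity element.
assert not np.isclose(np.linalg.det(A @ B), 0.0)
assert np.allclose(A @ np.linalg.inv(A), np.eye(2))
assert np.allclose(np.eye(2) @ A, A)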



Matrix (mathematics)
In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example, \begin{bmatrix} 1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix} is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "2×3 matrix", or a matrix of dimension 2×3. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra. Therefore, the study of matrices is a large part of linear algebra, and most properties and operations of abstract linear algebra can be expressed in terms of matrices. For example, matrix multiplication represents composition of linear maps. Not all matrices are related to linear algebra. This is, in particular, the case in graph theory, of incidence matrices, and adjacency matrices. ''This article focuses on matrices related to linear algebra, and, unle ...