Specht's Theorem
In mathematics, Specht's theorem gives a necessary and sufficient condition for two complex matrices to be unitarily equivalent. It is named after Wilhelm Specht, who proved the theorem in 1940. Two matrices ''A'' and ''B'' with complex entries are said to be ''unitarily equivalent'' if there exists a unitary matrix ''U'' such that ''B'' = ''U''*''AU''. Two matrices which are unitarily equivalent are also similar. Two similar matrices represent the same linear map, but with respect to a different basis; unitary equivalence corresponds to a change from one orthonormal basis to another. If ''A'' and ''B'' are unitarily equivalent, then tr ''AA''* = tr ''BB''*, where tr denotes the trace (in other words, the Frobenius norm is a unitary invariant). This follows from the cyclic invariance of the trace: if ''B'' = ''U''*''AU'', then tr ''BB''* = tr ''U''*''AA''*''U'' = tr ''AA''*''UU''* = tr ''AA''*. Specht's theorem strengthens this necessary condition into a characterization: ''A'' and ''B'' are unitarily equivalent if and only if tr ''w''(''A'', ''A''*) = tr ''w''(''B'', ''B''*) for every word ''w'' in two variables.
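This invariance is easy to check numerically. The sketch below (a minimal NumPy illustration; the helper word_trace and the sample words are ad hoc choices, not part of the theorem's statement) conjugates a random matrix by a random unitary and confirms that the traces of several words in (''A'', ''A''*) and (''B'', ''B''*) agree:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# A random complex matrix and a random unitary U (the Q factor of a QR decomposition).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
B = U.conj().T @ A @ U  # B = U*AU, so A and B are unitarily equivalent

def word_trace(X, word):
    """Trace of a word in X and X*, where "x" stands for X and "y" for X*."""
    letters = {"x": X, "y": X.conj().T}
    P = np.eye(X.shape[0], dtype=complex)
    for c in word:
        P = P @ letters[c]
    return np.trace(P)

# Traces of words agree, as Specht's theorem requires of unitarily equivalent matrices.
for w in ["x", "xy", "xxy", "xyxy", "xxyy"]:
    assert np.isclose(word_trace(A, w), word_trace(B, w))
```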
Necessary And Sufficient Condition
In logic and mathematics, necessity and sufficiency are terms used to describe a conditional or implicational relationship between two statements. For example, in the conditional statement "If ''P'' then ''Q''", ''Q'' is necessary for ''P'', because the truth of ''Q'' is guaranteed by the truth of ''P'' (equivalently, it is impossible to have ''P'' without ''Q''). Similarly, ''P'' is sufficient for ''Q'', because ''P'' being true always implies that ''Q'' is true, but ''P'' not being true does not always imply that ''Q'' is not true. In general, a necessary condition is one that must be present in order for another condition to occur, while a sufficient condition is one that produces the said condition. The assertion that a statement is a "necessary ''and'' sufficient" condition of another means that the former statement is true if and only if the latter is true. That is, the two statements must be either simultaneously true, or simultaneously false. In ordinary English (also natural language), "necessary" and "sufficient" indicate relations between conditions or states of affairs, not statements.
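As a small illustration (an informal Python sketch; the helper implies is my own naming), "''P'' is sufficient for ''Q''" and "''Q'' is necessary for ''P''" both express the same implication, which fails only when ''P'' is true and ''Q'' is false:

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: "if P then Q" is false only when P is true and Q is false."""
    return (not p) or q

# "P is sufficient for Q" and "Q is necessary for P" both assert implies(P, Q).
for p, q in product([False, True], repeat=2):
    print(f"P={p!s:<5} Q={q!s:<5} P->Q={implies(p, q)}")
```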
Orthonormal Basis
In mathematics, particularly linear algebra, an orthonormal basis for an inner product space ''V'' with finite dimension is a basis for ''V'' whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other. For example, the standard basis for a Euclidean space \R^n is an orthonormal basis, where the relevant inner product is the dot product of vectors. The image of the standard basis under a rotation or reflection (or any orthogonal transformation) is also orthonormal, and every orthonormal basis for \R^n arises in this fashion. For a general inner product space ''V'', an orthonormal basis can be used to define normalized orthogonal coordinates on ''V''. Under these coordinates, the inner product becomes a dot product of vectors. Thus the presence of an orthonormal basis reduces the study of a finite-dimensional inner product space to the study of \R^n under the dot product. Every finite-dimensional inner product space has an orthonormal basis, which may be obtained from an arbitrary basis using the Gram–Schmidt process.
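The Gram–Schmidt process mentioned above is short enough to sketch directly (a minimal NumPy version, assuming the input vectors are linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, b) * b for b in basis)  # remove components along earlier vectors
        basis.append(w / np.linalg.norm(w))           # normalize to unit length
    return np.array(basis)

B = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])
# Rows of B are orthonormal: B @ B.T is the identity (up to rounding).
print(np.round(B @ B.T, 10))
```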
Matrix Theory
In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example,
\begin{bmatrix} 1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix}
is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "2 × 3 matrix", or a matrix of dimension 2 × 3. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra. Therefore, the study of matrices is a large part of linear algebra, and most properties and operations of abstract linear algebra can be expressed in terms of matrices. For example, matrix multiplication represents composition of linear maps. Not all matrices are related to linear algebra. This is, in particular, the case in graph theory, of incidence matrices and adjacency matrices. ''This article focuses on matrices related to linear algebra, and, unless otherwise specified, all matrices represent linear maps or may be viewed as such.''
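To make "matrix multiplication represents composition of linear maps" concrete, here is a small NumPy sketch (the rotation and scaling matrices are arbitrary examples of my own choosing):

```python
import numpy as np

# Two linear maps on R^2: a 90-degree rotation and an axis-wise scaling.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotate by 90 degrees
S = np.array([[2.0, 0.0],
              [0.0, 3.0]])    # scale x by 2, y by 3

v = np.array([1.0, 1.0])

# Composition of maps corresponds to matrix multiplication:
# applying R and then S equals applying the single matrix S @ R.
assert np.allclose(S @ (R @ v), (S @ R) @ v)
```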
Cambridge University Press
Cambridge University Press is the university press of the University of Cambridge. Granted letters patent by King Henry VIII in 1534, it is the oldest university press in the world. It is also the King's Printer. Cambridge University Press is a department of the University of Cambridge and is both an academic and educational publisher. It became part of Cambridge University Press & Assessment, following a merger with Cambridge Assessment in 2021. With a global sales presence, publishing hubs, and offices in more than 40 countries, it publishes over 50,000 titles by authors from over 100 countries. Its publishing includes more than 380 academic journals, monographs, reference works, school and university textbooks, and English language teaching and learning publications. It also publishes Bibles, runs a bookshop in Cambridge, sells through Amazon, and has a conference venues business in Cambridge at the Pitt Building and the Sir Geoffrey Cass Sports and Social Centre.
Pacific Journal Of Mathematics
The Pacific Journal of Mathematics is a mathematics research journal supported by several universities and research institutes, and currently published on their behalf by Mathematical Sciences Publishers, a non-profit academic publishing organisation, and the University of California, Berkeley. It was founded in 1951 by František Wolf and Edwin F. Beckenbach and has been published continuously since, with five two-issue volumes per year. Full-text PDF versions of all journal articles are available on-line via the journal's website with a subscription. The journal is incorporated as a 501(c)(3) organization, a type of United States corporation, trust, unincorporated association or other organization exempt from federal income tax under section 501(c)(3) of Title 26 of the United States Code.
Word (mathematics)
In group theory, a word is any written product of group elements and their inverses. For example, if ''x'', ''y'' and ''z'' are elements of a group ''G'', then ''xy'', ''z''⁻¹''xzz'' and ''y''⁻¹''zxx''⁻¹''yz''⁻¹ are words in the set {''x'', ''y'', ''z''}. Two different words may evaluate to the same value in ''G'', or even in every group. Words play an important role in the theory of free groups and presentations, and are central objects of study in combinatorial group theory.

Definitions

Let ''G'' be a group, and let ''S'' be a subset of ''G''. A word in ''S'' is any expression of the form
:s_1^{\varepsilon_1} s_2^{\varepsilon_2} \cdots s_n^{\varepsilon_n}
where s_1, ..., s_n are elements of ''S'', called generators, and each \varepsilon_i is ±1. The number ''n'' is known as the length of the word. Each word in ''S'' represents an element of ''G'', namely the product of the expression. By convention, the identity element can be represented by the empty word, which is the unique word of length zero.
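Words can be manipulated mechanically; for instance, free reduction (cancelling adjacent pairs ''ss''⁻¹) is a one-pass stack algorithm. Below is a minimal Python sketch, with a word encoded as a list of (generator, exponent) pairs of my own devising:

```python
def free_reduce(word):
    """Freely reduce a word given as (generator, exponent) pairs with exponents +1/-1."""
    out = []
    for g, e in word:
        if out and out[-1] == (g, -e):
            out.pop()           # cancel an adjacent s s^{-1} pair
        else:
            out.append((g, e))  # the stack handles cascading cancellations
    return out

# y^{-1} z x x^{-1} y z^{-1} reduces to y^{-1} z y z^{-1}
w = [("y", -1), ("z", 1), ("x", 1), ("x", -1), ("y", 1), ("z", -1)]
print(free_reduce(w))
```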
Frobenius Norm
In mathematics, a matrix norm is a vector norm in a vector space whose elements (vectors) are matrices (of given dimensions).

Preliminaries

Given a field K of either real or complex numbers, let K^{m \times n} be the K-vector space of matrices with m rows and n columns and entries in the field K. A matrix norm is a norm on K^{m \times n}. This article will always write such norms with double vertical bars (like so: \|A\|). Thus, the matrix norm is a function \|\cdot\| : K^{m \times n} \to \R that must satisfy the following properties: for all scalars \alpha \in K and matrices A, B \in K^{m \times n},
*\|A\| \ge 0 (''positive-valued'')
*\|A\| = 0 \iff A = 0_{m,n} (''definite'')
*\|\alpha A\| = |\alpha| \, \|A\| (''absolutely homogeneous'')
*\|A + B\| \le \|A\| + \|B\| (''sub-additive'' or satisfying the ''triangle inequality'')
The only feature distinguishing matrices from rearranged vectors is multiplication. Matrix norms are particularly useful if they are also sub-multiplicative:
*\|AB\| \le \|A\| \, \|B\|
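For the Frobenius norm in particular, these properties are easy to check numerically. The sketch below (NumPy; the random matrices are arbitrary test data) also verifies the identity \|A\|_F = \sqrt{\operatorname{tr}(AA^*)} mentioned in the main article:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))

# The Frobenius norm is the square root of tr(A A*), i.e. the Euclidean
# norm of A viewed as one long vector of its entries.
fro = np.sqrt(np.trace(A @ A.conj().T).real)
assert np.isclose(fro, np.linalg.norm(A, "fro"))

# Sub-multiplicativity: ||AB|| <= ||A|| ||B||.
B = rng.standard_normal((4, 2))
assert np.linalg.norm(A @ B, "fro") <= np.linalg.norm(A, "fro") * np.linalg.norm(B, "fro")
```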
Trace (Linear Algebra)
In linear algebra, the trace of a square matrix ''A'', denoted tr(''A''), is defined to be the sum of the elements on the main diagonal (from the upper left to the lower right) of ''A''. The trace is only defined for a square matrix (''n'' × ''n''). It can be proved that the trace of a matrix is the sum of its (complex) eigenvalues (counted with multiplicities). It can also be proved that tr(''AB'') = tr(''BA'') for any two matrices ''A'' and ''B'' of compatible dimensions. This implies that similar matrices have the same trace. As a consequence one can define the trace of a linear operator mapping a finite-dimensional vector space into itself, since all matrices describing such an operator with respect to a basis are similar. The trace is related to the derivative of the determinant (see Jacobi's formula).

Definition

The trace of an ''n'' × ''n'' square matrix ''A'' is defined as
\operatorname{tr}(\mathbf{A}) = \sum_{i=1}^n a_{ii} = a_{11} + a_{22} + \dots + a_{nn}
where a_{ii} denotes the entry on the ''i''th row and ''i''th column of ''A''. The entries of ''A'' can be real numbers or (more generally) complex numbers. The trace is not defined for non-square matrices.
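These trace identities can be verified numerically (a small NumPy sketch on random matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Cyclic property: tr(AB) = tr(BA).
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# The trace equals the sum of the (complex) eigenvalues.
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real)

# Hence similar matrices share a trace: tr(P^{-1} A P) = tr(A).
P = rng.standard_normal((3, 3))
assert np.isclose(np.trace(np.linalg.inv(P) @ A @ P), np.trace(A))
```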
Basis Of A Vector Space
In mathematics, a set ''B'' of vectors in a vector space ''V'' is called a basis if every element of ''V'' may be written in a unique way as a finite linear combination of elements of ''B''. The coefficients of this linear combination are referred to as components or coordinates of the vector with respect to ''B''. The elements of a basis are called ''basis vectors''. Equivalently, a set ''B'' is a basis if its elements are linearly independent and every element of ''V'' is a linear combination of elements of ''B''. In other words, a basis is a linearly independent spanning set. A vector space can have several bases; however all the bases have the same number of elements, called the ''dimension'' of the vector space. This article deals mainly with finite-dimensional vector spaces. However, many of the principles are also valid for infinite-dimensional vector spaces.

Definition

A basis ''B'' of a vector space ''V'' over a field ''F'' (such as the real numbers \R or the complex numbers \mathbb C) is a linearly independent subset of ''V'' that spans ''V''.
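Concretely, finding the coordinates of a vector with respect to a basis of \R^n amounts to solving a linear system (a NumPy sketch with an arbitrarily chosen basis):

```python
import numpy as np

# A basis of R^3, stored as the columns of M.
M = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]]).T

v = np.array([2.0, 3.0, 4.0])

# The columns are linearly independent iff M is invertible (nonzero determinant),
# and the coordinates of v with respect to the basis solve M x = v uniquely.
assert not np.isclose(np.linalg.det(M), 0.0)
coords = np.linalg.solve(M, v)
assert np.allclose(M @ coords, v)
print(coords)
```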
Complex Number
In mathematics, a complex number is an element of a number system that extends the real numbers with a specific element denoted ''i'', called the imaginary unit and satisfying the equation i^2 = -1; every complex number can be expressed in the form a + bi, where ''a'' and ''b'' are real numbers. Because no real number satisfies the above equation, ''i'' was called an imaginary number by René Descartes. For the complex number a + bi, ''a'' is called the ''real part'', and ''b'' is called the ''imaginary part''. The set of complex numbers is denoted by either of the symbols \mathbb C or '''C'''. Despite the historical nomenclature "imaginary", complex numbers are regarded in the mathematical sciences as just as "real" as the real numbers and are fundamental in many aspects of the scientific description of the natural world. Complex numbers allow solutions to all polynomial equations, even those that have no solutions in real numbers. More precisely, the fundamental theorem of algebra asserts that every non-constant polynomial equation with real or complex coefficients has a solution which is a complex number.
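For example, x^2 + 1 = 0 has no real solutions but two complex ones, which a numerical root finder reports directly (a minimal NumPy sketch):

```python
import numpy as np

# x^2 + 1 = 0 has no real solution, but two complex ones: i and -i.
roots = np.roots([1, 0, 1])   # coefficients of x^2 + 0x + 1
print(roots)

# The imaginary unit satisfies i^2 = -1 (Python writes i as 1j).
assert (1j) ** 2 == -1
```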
Linear Map
In mathematics, and more specifically in linear algebra, a linear map (also called a linear mapping, linear transformation, vector space homomorphism, or in some contexts linear function) is a mapping V \to W between two vector spaces that preserves the operations of vector addition and scalar multiplication. The same names and the same definition are also used for the more general case of modules over a ring; see Module homomorphism. If a linear map is a bijection then it is called a ''linear isomorphism''. In the case where V = W, a linear map is called a (linear) ''endomorphism''. Sometimes the term ''linear operator'' refers to this case, but the term "linear operator" can have different meanings for different conventions: for example, it can be used to emphasize that V and W are real vector spaces (not necessarily with V = W), or it can be used to emphasize that V is a function space, which is a common convention in functional analysis. Sometimes the term ''linear function'' has the same meaning as ''linear map'', while in analysis it does not.
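The two defining properties, additivity and homogeneity, can be checked directly for the map v \mapsto Mv given by a matrix (a small NumPy sketch with an arbitrary M of my own choosing):

```python
import numpy as np

# Every matrix M defines a linear map f(v) = M v: it preserves
# vector addition and scalar multiplication.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def f(v):
    return M @ v

u = np.array([1.0, -1.0])
v = np.array([0.5, 2.0])
c = 3.0

assert np.allclose(f(u + v), f(u) + f(v))   # additivity
assert np.allclose(f(c * u), c * f(u))      # homogeneity
```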
Similar Matrices
In linear algebra, two ''n''-by-''n'' matrices ''A'' and ''B'' are called similar if there exists an invertible ''n''-by-''n'' matrix ''P'' such that B = P^{-1} A P. Similar matrices represent the same linear map under two (possibly) different bases, with ''P'' being the change of basis matrix. A transformation A \mapsto P^{-1} A P is called a similarity transformation or conjugation of the matrix ''A''. In the general linear group, similarity is therefore the same as conjugacy, and similar matrices are also called conjugate; however, in a given subgroup ''H'' of the general linear group, the notion of conjugacy may be more restrictive than similarity, since it requires that ''P'' be chosen to lie in ''H''.

Motivating example

When defining a linear transformation, it can be the case that a change of basis can result in a simpler form of the same transformation. For example, the matrix representing a rotation in \R^3 when the axis of rotation is not aligned with a coordinate axis can be complicated to compute. If the axis of rotation were aligned with the positive ''z''-axis, then it would simply be the standard rotation matrix about the ''z''-axis.
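Similarity preserves trace, determinant, and eigenvalues, which is easy to confirm numerically (a NumPy sketch on random matrices; a random ''P'' is almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))          # almost surely invertible

B = np.linalg.inv(P) @ A @ P             # B is similar to A

# Similar matrices share trace, determinant, and eigenvalues.
assert np.isclose(np.trace(B), np.trace(A))
assert np.isclose(np.linalg.det(B), np.linalg.det(A))
assert np.allclose(np.sort_complex(np.linalg.eigvals(B)),
                   np.sort_complex(np.linalg.eigvals(A)))
```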