In linear algebra, a coordinate vector is a representation of a vector as an ordered list of numbers (a tuple) that describes the vector in terms of a particular ordered basis. An easy example may be a position such as (5, 2, 1) in a 3-dimensional Cartesian coordinate system with the basis as the axes of this system. Coordinates are always specified relative to an ordered basis. Bases and their associated coordinate representations let one realize vector spaces and linear transformations concretely as column vectors, row vectors, and matrices; hence, they are useful in calculations. The idea of a coordinate vector can also be used for infinite-dimensional vector spaces, as addressed below.


Definition

Let ''V'' be a vector space of dimension ''n'' over a field ''F'' and let
: B = (b_1, b_2, \ldots, b_n)
be an ordered basis for ''V''. Then for every v \in V there is a unique linear combination of the basis vectors that equals '' v '':
: v = \alpha_1 b_1 + \alpha_2 b_2 + \cdots + \alpha_n b_n .
The coordinate vector of '' v '' relative to ''B'' is the sequence of coordinates
: [v]_B = (\alpha_1, \alpha_2, \ldots, \alpha_n) .
This is also called the ''representation of v with respect to B'', or the ''B representation of v ''. The \alpha_1, \alpha_2, \ldots, \alpha_n are called the ''coordinates of v ''. The order of the basis is important here, since it determines the order in which the coefficients are listed in the coordinate vector.

Coordinate vectors of finite-dimensional vector spaces can be represented by matrices as column or row vectors. In the above notation, one can write
: [v]_B = \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{pmatrix}
and
: [v]_B^T = \begin{pmatrix} \alpha_1 & \alpha_2 & \cdots & \alpha_n \end{pmatrix}
where [v]_B^T is the transpose of the matrix [v]_B.
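As a sketch of how the definition plays out numerically (the basis below is an assumed example, not from the article): finding [v]_B amounts to solving a linear system whose coefficient matrix has the basis vectors as columns.

```python
import numpy as np

# An assumed ordered basis of R^3, stored as the columns of a matrix.
B = np.column_stack([[1, 0, 0],
                     [1, 1, 0],
                     [1, 1, 1]])

v = np.array([5.0, 2.0, 1.0])

# Solving B @ alpha = v yields the unique coordinate vector [v]_B.
alpha = np.linalg.solve(B, v)
print(alpha)  # [3. 1. 1.], since v = 3*b1 + 1*b2 + 1*b3
```

Uniqueness of the coordinates corresponds to the basis matrix being invertible, so `np.linalg.solve` always succeeds for a genuine basis.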


The standard representation

We can mechanize the above transformation by defining a function \phi_B, called the ''standard representation of V with respect to B'', that takes every vector to its coordinate representation: \phi_B(v) = [v]_B. Then \phi_B is a linear transformation from ''V'' to ''F''^''n''. In fact, it is an isomorphism, and its inverse \phi_B^{-1} : F^n \to V is simply
:\phi_B^{-1}(\alpha_1, \ldots, \alpha_n) = \alpha_1 b_1 + \cdots + \alpha_n b_n.
Alternatively, we could have defined \phi_B^{-1} to be the above function from the beginning, realized that \phi_B^{-1} is an isomorphism, and defined \phi_B to be its inverse.
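A minimal sketch of \phi_B and its inverse for V = R^2, with a hypothetical basis chosen for illustration:

```python
import numpy as np

# Hypothetical ordered basis of R^2, stored as columns.
B = np.column_stack([[2, 1],
                     [1, 1]])

def phi(v):
    """Standard representation: v -> [v]_B."""
    return np.linalg.solve(B, v)

def phi_inv(alpha):
    """Inverse map: coordinates -> alpha_1*b_1 + ... + alpha_n*b_n."""
    return B @ alpha

v = np.array([4.0, 3.0])
alpha = phi(v)                        # [1. 2.]
assert np.allclose(phi_inv(alpha), v)  # phi_inv undoes phi, as an isomorphism must
```

Note that `phi_inv` is just a matrix-vector product: the isomorphism direction "coordinates to vector" is the cheap one.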


Examples


Example 1

Let P_3 be the space of all algebraic polynomials of degree at most 3 (i.e. the highest exponent of ''x'' can be 3). This space is linear and spanned by the following polynomials:
:B_P = \left\{ 1, x, x^2, x^3 \right\}
matching
: 1 := \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix} ; \quad x := \begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix} ; \quad x^2 := \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix} ; \quad x^3 := \begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}
then the coordinate vector corresponding to the polynomial
:p(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3
is
:\begin{pmatrix} a_0 \\ a_1 \\ a_2 \\ a_3 \end{pmatrix}.
In this representation, the differentiation operator ''d''/''dx'', which we shall denote ''D'', is represented by the following matrix:
:Dp(x) = p'(x) ; \quad [D] = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix}
Using this method it is easy to explore properties of the operator, such as invertibility, whether it is Hermitian, anti-Hermitian, or neither, and its spectrum and eigenvalues.
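The matrix form of ''D'' above can be checked directly: applying it to the coordinate vector of a sample polynomial (the polynomial itself is an assumed example) produces the coordinates of its derivative.

```python
import numpy as np

# Matrix of d/dx on P_3 in the ordered basis (1, x, x^2, x^3).
D = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3],
              [0, 0, 0, 0]])

# p(x) = 2 + 5x - x^2 + 4x^3, as a coordinate vector.
p = np.array([2, 5, -1, 4])

dp = D @ p
print(dp)  # [ 5 -2 12  0], i.e. p'(x) = 5 - 2x + 12x^2
```

The last row of zeros makes the matrix singular, which matches the fact that differentiation on P_3 is not invertible (all constants map to zero).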


Example 2

The Pauli matrices represent the spin operator when the spin eigenstates are transformed into vector coordinates.


Basis transformation matrix

Let ''B'' and ''C'' be two different bases of a vector space ''V'', and let us mark with \lbrack M \rbrack_C^B the matrix which has columns consisting of the ''C'' representations of the basis vectors ''b''_1, ''b''_2, …, ''b''_''n'':
:\lbrack M\rbrack_C^B = \begin{pmatrix} \lbrack b_1\rbrack_C & \cdots & \lbrack b_n\rbrack_C \end{pmatrix}
This matrix is referred to as the basis transformation matrix from ''B'' to ''C''. It can be regarded as an automorphism over F^n. Any vector ''v'' represented in ''B'' can be transformed to a representation in ''C'' as follows:
:\lbrack v\rbrack_C = \lbrack M\rbrack_C^B \lbrack v\rbrack_B.
Under the transformation of basis, notice that the superscript on the transformation matrix, ''M'', and the subscript on the coordinate vector, ''v'', are the same, and seemingly cancel, leaving the remaining subscript. While this may serve as a memory aid, no such cancellation, or similar mathematical operation, is actually taking place.
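A sketch of the construction with two assumed bases of R^2: the columns of the transformation matrix are obtained by expressing each basis vector of ''B'' in ''C''-coordinates, and the resulting matrix converts ''B''-coordinates to ''C''-coordinates.

```python
import numpy as np

# Two assumed bases of R^2, stored as columns.
B = np.column_stack([[1, 0], [1, 1]])
C = np.column_stack([[2, 0], [0, 1]])

# Columns of M are the C-representations [b_i]_C of the basis vectors of B,
# obtained by solving C @ x = b_i for every column b_i at once.
M = np.linalg.solve(C, B)   # the basis transformation matrix [M]_C^B

v_B = np.array([3.0, 2.0])  # [v]_B
v = B @ v_B                 # the underlying vector v itself
v_C = M @ v_B               # [v]_C = [M]_C^B [v]_B

assert np.allclose(C @ v_C, v)  # both coordinate vectors describe the same v
```

The final assertion is the whole point: changing the coordinate vector with ''M'' leaves the underlying vector unchanged.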


Corollary

The matrix ''M'' is an invertible matrix and ''M''^{-1} is the basis transformation matrix from ''C'' to ''B''. In other words,
:\begin{align} \operatorname{Id} &= \lbrack M\rbrack_C^B \lbrack M\rbrack_B^C = \lbrack M\rbrack_C^C \\ &= \lbrack M\rbrack_B^C \lbrack M\rbrack_C^B = \lbrack M\rbrack_B^B \end{align}


Infinite-dimensional vector spaces

Suppose ''V'' is an infinite-dimensional vector space over a field ''F''. If the dimension is ''κ'', then there is some basis of ''κ'' elements for ''V''. After an order is chosen, the basis can be considered an ordered basis. The elements of ''V'' are finite linear combinations of elements in the basis, which give rise to unique coordinate representations exactly as described before. The only change is that the indexing set for the coordinates is not finite. Since a given vector ''v'' is a ''finite'' linear combination of basis elements, the only nonzero entries of the coordinate vector for ''v'' will be the nonzero coefficients of the linear combination representing ''v''. Thus the coordinate vector for ''v'' is zero except in finitely many entries. The linear transformations between (possibly) infinite-dimensional vector spaces can be modeled, analogously to the finite-dimensional case, with infinite matrices. The special case of the transformations from ''V'' into ''V'' is described in the full linear ring article.
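As an illustrative sketch (an assumption of this rewrite, not from the article), finitely supported coordinate vectors over an infinite basis can be modeled as a mapping from basis indices to the finitely many nonzero coefficients, here for polynomials of arbitrary degree with the basis (1, x, x^2, …):

```python
def coordinates(coeffs):
    """Map {power: coefficient} to a finitely supported coordinate vector,
    keeping only the nonzero entries (all omitted coordinates are zero)."""
    return {k: a for k, a in coeffs.items() if a != 0}

# p(x) = 7 + 3x^100: although the basis is infinite, only two
# coordinates of p are nonzero.
p = coordinates({0: 7, 1: 0, 100: 3})
print(p)  # {0: 7, 100: 3}
```

This mirrors the text: every vector is a finite linear combination, so its coordinate vector is zero except in finitely many entries, no matter how large the indexing set is.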


See also

* Change of basis
* Coordinate space

