In linear algebra, an orthogonal transformation is a linear transformation ''T'' : ''V'' → ''V'' on a real inner product space ''V'' that preserves the inner product. That is, for each pair ''u'', ''v'' of elements of ''V'', we have

: \langle u,v \rangle = \langle Tu,Tv \rangle \, .

Since the lengths of vectors and the angles between them are defined through the inner product, orthogonal transformations preserve lengths of vectors and angles between them. In particular, orthogonal transformations map orthonormal bases to orthonormal bases. Orthogonal transformations are injective: if Tv = 0 then 0 = \langle Tv,Tv \rangle = \langle v,v \rangle, hence v = 0, so the kernel of ''T'' is trivial.
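The preservation of lengths and angles is a short, standard verification from the definition: taking u = v gives length preservation, and the angle formula then gives angle preservation:

: \|Tv\| = \sqrt{\langle Tv,Tv \rangle} = \sqrt{\langle v,v \rangle} = \|v\| \, , \qquad \cos\angle(Tu,Tv) = \frac{\langle Tu,Tv \rangle}{\|Tu\|\,\|Tv\|} = \frac{\langle u,v \rangle}{\|u\|\,\|v\|} = \cos\angle(u,v) \, .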
Orthogonal transformations in two- or three-dimensional Euclidean space are rigid rotations, reflections, or combinations of a rotation and a reflection (also known as improper rotations). Reflections are transformations that reverse the direction front to back, orthogonal to the mirror plane, like (real-world) mirrors do. The matrices corresponding to proper rotations (without reflection) have a determinant of +1. Transformations with reflection are represented by matrices with a determinant of −1. This allows the concept of rotation and reflection to be generalized to higher dimensions.
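A standard two-dimensional illustration: reflection across the line through the origin making angle \theta/2 with the first coordinate axis has matrix

: \begin{pmatrix} \cos(\theta) & \sin(\theta) \\ \sin(\theta) & -\cos(\theta) \end{pmatrix} \, ,

whose determinant is -\cos^2(\theta) - \sin^2(\theta) = -1.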
In finite-dimensional spaces, the matrix representation (with respect to an orthonormal basis) of an orthogonal transformation is an orthogonal matrix. Its rows are mutually orthogonal vectors with unit norm, so that the rows constitute an orthonormal basis of ''V''. The columns of the matrix form another orthonormal basis of ''V''. If an orthogonal transformation is invertible (which is always the case when ''V'' is finite-dimensional) then its inverse T^{-1} is another orthogonal transformation, identical to the transpose of ''T'':

: T^{-1} = T^{\mathrm{T}} \, .
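These matrix properties are easy to check numerically. The following is a minimal sketch using NumPy (the helper name is_orthogonal is ours, not a standard API): it verifies Q^T Q = I, the determinant distinction between rotations and reflections, and that the inverse coincides with the transpose.

```python
import numpy as np

def is_orthogonal(Q, tol=1e-10):
    """Check Q^T Q = I, i.e. that the rows/columns are orthonormal."""
    n = Q.shape[0]
    return np.allclose(Q.T @ Q, np.eye(n), atol=tol)

theta = 0.7
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])   # proper rotation
reflection = np.array([[np.cos(theta),  np.sin(theta)],
                       [np.sin(theta), -np.cos(theta)]]) # reflection

for Q in (rotation, reflection):
    assert is_orthogonal(Q)
    # The inverse coincides with the transpose: Q^{-1} = Q^T.
    assert np.allclose(np.linalg.inv(Q), Q.T)
    print(round(np.linalg.det(Q)))  # prints 1 for the rotation, -1 for the reflection
```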


Examples

Consider the inner-product space (\mathbb{R}^2, \langle\cdot,\cdot\rangle) with the standard Euclidean inner product and standard basis. Then, the matrix transformation

: T = \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix} : \mathbb{R}^2 \to \mathbb{R}^2

is orthogonal. To see this, consider

: Te_1 = \begin{pmatrix} \cos(\theta) \\ \sin(\theta) \end{pmatrix} \, , \qquad Te_2 = \begin{pmatrix} -\sin(\theta) \\ \cos(\theta) \end{pmatrix} \, .

Then,

: \langle Te_1, Te_1 \rangle = \begin{pmatrix} \cos(\theta) & \sin(\theta) \end{pmatrix} \begin{pmatrix} \cos(\theta) \\ \sin(\theta) \end{pmatrix} = \cos^2(\theta) + \sin^2(\theta) = 1
: \langle Te_1, Te_2 \rangle = \begin{pmatrix} \cos(\theta) & \sin(\theta) \end{pmatrix} \begin{pmatrix} -\sin(\theta) \\ \cos(\theta) \end{pmatrix} = \sin(\theta)\cos(\theta) - \sin(\theta)\cos(\theta) = 0
: \langle Te_2, Te_2 \rangle = \begin{pmatrix} -\sin(\theta) & \cos(\theta) \end{pmatrix} \begin{pmatrix} -\sin(\theta) \\ \cos(\theta) \end{pmatrix} = \sin^2(\theta) + \cos^2(\theta) = 1

The previous example can be extended to construct all orthogonal transformations. For example, the following matrices define orthogonal transformations on (\mathbb{R}^3, \langle\cdot,\cdot\rangle):

: \begin{pmatrix} \cos(\theta) & -\sin(\theta) & 0 \\ \sin(\theta) & \cos(\theta) & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad \begin{pmatrix} \cos(\theta) & 0 & -\sin(\theta) \\ 0 & 1 & 0 \\ \sin(\theta) & 0 & \cos(\theta) \end{pmatrix}, \quad \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos(\theta) & -\sin(\theta) \\ 0 & \sin(\theta) & \cos(\theta) \end{pmatrix}
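A small numerical check of this construction (a sketch with NumPy; the angle values are arbitrary): a product of orthogonal matrices is again orthogonal, and since each factor below has determinant +1, the product is a proper rotation that preserves the Euclidean inner product.

```python
import numpy as np

def rot_z(t):
    """Rotation about the z-axis (the first matrix above)."""
    return np.array([[np.cos(t), -np.sin(t), 0],
                     [np.sin(t),  np.cos(t), 0],
                     [0,          0,         1]])

def rot_y(t):
    """Rotation about the y-axis (the second matrix above)."""
    return np.array([[np.cos(t), 0, -np.sin(t)],
                     [0,         1,  0],
                     [np.sin(t), 0,  np.cos(t)]])

def rot_x(t):
    """Rotation about the x-axis (the third matrix above)."""
    return np.array([[1, 0,          0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t),  np.cos(t)]])

# The product is orthogonal (Q^T Q = I) with determinant +1.
Q = rot_z(0.3) @ rot_y(1.1) @ rot_x(-0.4)
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.isclose(np.linalg.det(Q), 1.0)

# It preserves the Euclidean inner product of arbitrary vectors.
u, v = np.random.randn(3), np.random.randn(3)
assert np.isclose((Q @ u) @ (Q @ v), u @ v)
```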


See also

* Geometric transformation
* Improper rotation
* Linear transformation
* Orthogonal matrix
* Rigid transformation
* Unitary transformation

