Three-dimensional Rotation Operator
This article derives the main properties of rotations in three-dimensional space. The three Euler rotations are one way to bring a rigid body to any desired orientation by sequentially making rotations about axes fixed relative to the object. However, this can also be achieved with one single rotation (Euler's rotation theorem). Using the concepts of linear algebra it is shown how this single rotation can be performed.


Mathematical formulation

Let (\hat e_1, \hat e_2, \hat e_3) be a coordinate system fixed in the body that through a change in orientation is brought to the new directions

\mathbf{A}\hat e_1, \quad \mathbf{A}\hat e_2, \quad \mathbf{A}\hat e_3.

Any vector

\bar x = x_1\hat e_1 + x_2\hat e_2 + x_3\hat e_3

rotating with the body is then brought to the new direction

\mathbf{A}\bar x = x_1\mathbf{A}\hat e_1 + x_2\mathbf{A}\hat e_2 + x_3\mathbf{A}\hat e_3,

that is to say, this is a linear operator.

The matrix of this operator relative to the coordinate system (\hat e_1, \hat e_2, \hat e_3) is

\begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{bmatrix} = \begin{bmatrix} \langle\hat e_1, \mathbf{A}\hat e_1 \rangle & \langle\hat e_1, \mathbf{A}\hat e_2 \rangle & \langle\hat e_1, \mathbf{A}\hat e_3 \rangle \\ \langle\hat e_2, \mathbf{A}\hat e_1 \rangle & \langle\hat e_2, \mathbf{A}\hat e_2 \rangle & \langle\hat e_2, \mathbf{A}\hat e_3 \rangle \\ \langle\hat e_3, \mathbf{A}\hat e_1 \rangle & \langle\hat e_3, \mathbf{A}\hat e_2 \rangle & \langle\hat e_3, \mathbf{A}\hat e_3 \rangle \end{bmatrix}.

As

\sum_{k=1}^3 A_{ki}A_{kj} = \langle \mathbf{A}\hat e_i, \mathbf{A}\hat e_j \rangle = \begin{cases} 0, & i \neq j, \\ 1, & i = j, \end{cases}

or equivalently in matrix notation

\begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{bmatrix}^\mathsf{T} \begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix},

the matrix is orthogonal, and as a right-handed base vector system is reoriented into another right-handed system the determinant of this matrix has the value 1.
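These relations can be checked numerically. The following sketch assumes NumPy is available; the sample reorientation (30° about ê3 followed by 20° about ê1) is an arbitrary illustrative choice, not taken from the text. It builds the matrix from the inner products ⟨ê_i, Aê_j⟩ and verifies orthogonality and unit determinant.

    import numpy as np

    # An arbitrary sample reorientation (illustrative only): 30 degrees about e3
    # followed by 20 degrees about e1; any proper rotation would do.
    t1, t2 = np.radians(30.0), np.radians(20.0)
    Rz = np.array([[np.cos(t1), -np.sin(t1), 0.0],
                   [np.sin(t1),  np.cos(t1), 0.0],
                   [0.0,         0.0,        1.0]])
    Rx = np.array([[1.0, 0.0,         0.0],
                   [0.0, np.cos(t2), -np.sin(t2)],
                   [0.0, np.sin(t2),  np.cos(t2)]])
    A = Rx @ Rz                        # the reorientation operator

    e = np.eye(3)                      # original orthonormal basis e_1, e_2, e_3 (columns)
    # Matrix elements A_ij = <e_i, A e_j>, as defined above.
    M = np.array([[e[:, i] @ (A @ e[:, j]) for j in range(3)] for i in range(3)])

    print(np.allclose(M.T @ M, np.eye(3)))    # True: M is orthogonal
    print(np.isclose(np.linalg.det(M), 1.0))  # True: determinant is 1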


Rotation around an axis

Let (\hat e_1, \hat e_2, \hat e_3) be an orthogonal, positively oriented base vector system in \mathbb{R}^3. The linear operator "rotation by angle \theta around the axis defined by \hat e_3" has the matrix representation

\begin{bmatrix} Y_1 \\ Y_2 \\ Y_3 \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_1 \\ X_2 \\ X_3 \end{bmatrix}

relative to this base vector system. This then means that a vector

\bar x = \begin{bmatrix} \hat e_1 & \hat e_2 & \hat e_3 \end{bmatrix} \begin{bmatrix} X_1 \\ X_2 \\ X_3 \end{bmatrix}

is rotated to the vector

\bar y = \begin{bmatrix} \hat e_1 & \hat e_2 & \hat e_3 \end{bmatrix} \begin{bmatrix} Y_1 \\ Y_2 \\ Y_3 \end{bmatrix}

by the linear operator. The determinant of this matrix is

\det\begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} = 1,

and the characteristic polynomial is

\begin{align} \det\begin{bmatrix} \cos\theta - \lambda & -\sin\theta & 0 \\ \sin\theta & \cos\theta - \lambda & 0 \\ 0 & 0 & 1 - \lambda \end{bmatrix} &= \left(\left(\cos\theta - \lambda\right)^2 + \sin^2\theta\right)(1 - \lambda) \\ &= -\lambda^3 + (2\cos\theta + 1)\lambda^2 - (2\cos\theta + 1)\lambda + 1. \end{align}

The matrix is symmetric if and only if \sin\theta = 0, that is, for \theta = 0 and \theta = \pi. The case \theta = 0 is the trivial case of an identity operator. For the case \theta = \pi the characteristic polynomial is

-(\lambda - 1)(\lambda + 1)^2,

so the rotation operator has the eigenvalues

\lambda = 1, \quad \lambda = -1.

The eigenspace corresponding to \lambda = 1 is all vectors on the rotation axis, namely all vectors

\bar x = \alpha\hat e_3, \quad -\infty < \alpha < \infty.

The eigenspace corresponding to \lambda = -1 consists of all vectors orthogonal to the rotation axis, namely all vectors

\bar x = \alpha\hat e_1 + \beta\hat e_2, \quad -\infty < \alpha < \infty, \quad -\infty < \beta < \infty.

For all other values of \theta the matrix is not symmetric, and as \sin^2\theta > 0 there is only the eigenvalue \lambda = 1 with the one-dimensional eigenspace of the vectors on the rotation axis:

\bar x = \alpha\hat e_3, \quad -\infty < \alpha < \infty.
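As a numerical cross-check (a sketch assuming NumPy; the angle below is an arbitrary non-symmetric case), the eigenvalues of this matrix can be computed directly; only \lambda = 1 is real, and its eigenvector lies along \hat e_3.

    import numpy as np

    theta = np.radians(70.0)           # arbitrary angle with sin(theta) != 0
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])

    vals, vecs = np.linalg.eig(R)
    print(np.round(vals, 6))           # one eigenvalue 1, two complex conjugates exp(+-i*theta)
    axis = np.real(vecs[:, np.isclose(vals, 1.0)]).ravel()
    print(np.round(axis, 6))           # the corresponding eigenvector lies along e_3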
The rotation matrix by angle \theta around a general axis of rotation \mathbf{k} is given by Rodrigues' rotation formula,

\mathbf{R} = \mathbf{I}\cos\theta + [\mathbf{k}]_\times \sin\theta + (1 - \cos\theta)\mathbf{k}\mathbf{k}^\mathsf{T},

where \mathbf{I} is the identity matrix and [\mathbf{k}]_\times is the dual 2-form of \mathbf{k}, or cross product matrix,

[\mathbf{k}]_\times = \begin{bmatrix} 0 & -k_3 & k_2 \\ k_3 & 0 & -k_1 \\ -k_2 & k_1 & 0 \end{bmatrix}.

Note that [\mathbf{k}]_\times satisfies [\mathbf{k}]_\times \mathbf{v} = \mathbf{k} \times \mathbf{v} for all vectors \mathbf{v}.
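A short sketch of Rodrigues' formula (NumPy assumed; the function name, axis and angle values are illustrative assumptions), checking that for \mathbf{k} = \hat e_3 it reproduces the matrix above and that [\mathbf{k}]_\times acts as the cross product.

    import numpy as np

    def rodrigues(k, theta):
        """Rotation by angle theta about the unit axis k (Rodrigues' rotation formula)."""
        k = np.asarray(k, dtype=float) / np.linalg.norm(k)
        K = np.array([[0.0,  -k[2],  k[1]],
                      [k[2],  0.0,  -k[0]],
                      [-k[1], k[0],  0.0]])        # cross-product matrix [k]x
        return np.eye(3) * np.cos(theta) + K * np.sin(theta) + (1 - np.cos(theta)) * np.outer(k, k)

    theta = np.radians(40.0)
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
    print(np.allclose(rodrigues([0.0, 0.0, 1.0], theta), Rz))  # True: reduces to the e_3 case

    # [k]x v equals the cross product k x v for any v:
    k = np.array([1.0, 2.0, 2.0]) / 3.0
    v = np.array([0.3, -0.7, 0.5])
    K = np.array([[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]])
    print(np.allclose(K @ v, np.cross(k, v)))                  # True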


The general case

The operator "rotation by angle \theta around a specified axis" discussed above is an orthogonal mapping, and its matrix relative to any base vector system is therefore an orthogonal matrix. Furthermore its determinant has the value 1. A non-trivial fact is the opposite: for any orthogonal linear mapping in \mathbb{R}^3 with determinant 1 there exist base vectors \hat e_1, \hat e_2, \hat e_3 such that the matrix takes the "canonical form"

\begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}

for some value of \theta.

In fact, if a linear operator has the orthogonal matrix

\begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{bmatrix}

relative to some base vector system (\hat f_1, \hat f_2, \hat f_3) and this matrix is symmetric, the "symmetric operator theorem" valid in \mathbb{R}^n (any dimension) applies, saying that it has n orthogonal eigenvectors. This means for the three-dimensional case that there exists a coordinate system \hat e_1, \hat e_2, \hat e_3 such that the matrix takes the form

\begin{bmatrix} B_{11} & 0 & 0 \\ 0 & B_{22} & 0 \\ 0 & 0 & B_{33} \end{bmatrix}.

As it is an orthogonal matrix, these diagonal elements are either 1 or −1. As the determinant is 1, these elements are either all 1, or one of the elements is 1 and the other two are −1. In the first case it is the trivial identity operator corresponding to \theta = 0. In the second case it has the form

\begin{bmatrix} -1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 1 \end{bmatrix}

if the base vectors are numbered such that the one with eigenvalue 1 has index 3. This matrix is then of the desired form for \theta = \pi.

If the matrix is asymmetric, the vector

\bar E = \alpha_1 \hat f_1 + \alpha_2 \hat f_2 + \alpha_3 \hat f_3, \quad \text{where} \quad \alpha_1 = \frac{A_{32} - A_{23}}{2}, \quad \alpha_2 = \frac{A_{13} - A_{31}}{2}, \quad \alpha_3 = \frac{A_{21} - A_{12}}{2},

is nonzero. This vector is an eigenvector with eigenvalue \lambda = 1. Setting

\hat e_3 = \frac{\bar E}{|\bar E|}

and selecting any two orthogonal unit vectors \hat e_1 and \hat e_2 in the plane orthogonal to \hat e_3 such that \hat e_1, \hat e_2, \hat e_3 form a positively oriented triple, the operator takes the desired form with

\begin{align} \cos\theta &= \frac{A_{11} + A_{22} + A_{33} - 1}{2}, \\ \sin\theta &= |\bar E|. \end{align}

The expressions above are in fact valid also for the case of a symmetric rotation operator corresponding to a rotation with \theta = 0 or \theta = \pi. But the difference is that for \theta = \pi the vector

\bar E = \alpha_1 \hat f_1 + \alpha_2 \hat f_2 + \alpha_3 \hat f_3

is zero and of no use for finding the eigenspace of eigenvalue 1, and thence the rotation axis.

Defining E_1 = \alpha_1, E_2 = \alpha_2, E_3 = \alpha_3 and E_4 = \cos\theta, the matrix for the rotation operator is

\frac{1 - E_4}{E_1^2 + E_2^2 + E_3^2} \begin{bmatrix} E_1 E_1 & E_1 E_2 & E_1 E_3 \\ E_2 E_1 & E_2 E_2 & E_2 E_3 \\ E_3 E_1 & E_3 E_2 & E_3 E_3 \end{bmatrix} + \begin{bmatrix} E_4 & -E_3 & E_2 \\ E_3 & E_4 & -E_1 \\ -E_2 & E_1 & E_4 \end{bmatrix},

provided that

E_1^2 + E_2^2 + E_3^2 > 0,

that is, except for the cases \theta = 0 (the identity operator) and \theta = \pi.
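The construction above translates directly into a numerical procedure. The following sketch (NumPy assumed; the function names and the 50° test rotation are illustrative assumptions) extracts \bar E, \cos\theta and \sin\theta from an asymmetric rotation matrix, and rebuilds the matrix from E_1, \ldots, E_4.

    import numpy as np

    def axis_angle_from_matrix(A):
        """Axis e_3 and angle theta from a proper orthogonal matrix (asymmetric case)."""
        E = 0.5 * np.array([A[2, 1] - A[1, 2],      # alpha_1 = (A_32 - A_23)/2
                            A[0, 2] - A[2, 0],      # alpha_2 = (A_13 - A_31)/2
                            A[1, 0] - A[0, 1]])     # alpha_3 = (A_21 - A_12)/2
        sin_t = np.linalg.norm(E)                   # sin(theta), with 0 <= theta <= pi
        cos_t = 0.5 * (np.trace(A) - 1.0)           # (A_11 + A_22 + A_33 - 1)/2
        return E / sin_t, np.arctan2(sin_t, cos_t)

    def matrix_from_E(E1, E2, E3, E4):
        """Rebuild the rotation matrix from E_1..E_4 (requires E_1^2 + E_2^2 + E_3^2 > 0)."""
        E = np.array([E1, E2, E3])
        return (1.0 - E4) / (E @ E) * np.outer(E, E) + np.array([[E4, -E3,  E2],
                                                                 [E3,  E4, -E1],
                                                                 [-E2, E1,  E4]])

    # Illustrative check: a 50-degree rotation about an arbitrary unit axis.
    theta = np.radians(50.0)
    k = np.array([2.0, -1.0, 2.0]) / 3.0
    K = np.array([[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]])
    A = np.eye(3) * np.cos(theta) + K * np.sin(theta) + (1 - np.cos(theta)) * np.outer(k, k)

    axis, ang = axis_angle_from_matrix(A)
    print(np.allclose(axis, k), np.isclose(ang, theta))                        # True True
    print(np.allclose(matrix_from_E(*(np.sin(ang) * axis), np.cos(ang)), A))   # True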


Quaternions

Quaternions are defined similarly to \bar E, with the difference that the half angle \theta/2 is used instead of the full angle \theta. This means that the first three components q_1, q_2, q_3 are the components of a vector defined from

q_1 \hat f_1 + q_2 \hat f_2 + q_3 \hat f_3 = \sin\frac{\theta}{2}\,\hat e_3 = \frac{\sin\frac{\theta}{2}}{\sin\theta}\,\bar E,

and that the fourth component is the scalar

q_4 = \cos\frac{\theta}{2}.

As the angle \theta defined from the canonical form is in the interval

0 \le \theta \le \pi,

one would normally have that q_4 \ge 0. But a "dual" representation of a rotation with quaternions is used, that is to say (q_1, q_2, q_3, q_4) and (-q_1, -q_2, -q_3, -q_4) are two alternative representations of one and the same rotation.

The entities E_k are defined from the quaternions by

\begin{align} E_1 &= 2 q_4 q_1, \quad E_2 = 2 q_4 q_2, \quad E_3 = 2 q_4 q_3, \\ E_4 &= q_4^2 - \left(q_1^2 + q_2^2 + q_3^2\right). \end{align}

Using quaternions the matrix of the rotation operator is

\begin{bmatrix} 2\left(q_1^2 + q_4^2\right) - 1 & 2\left(q_1 q_2 - q_3 q_4\right) & 2\left(q_1 q_3 + q_2 q_4\right) \\ 2\left(q_1 q_2 + q_3 q_4\right) & 2\left(q_2^2 + q_4^2\right) - 1 & 2\left(q_2 q_3 - q_1 q_4\right) \\ 2\left(q_1 q_3 - q_2 q_4\right) & 2\left(q_2 q_3 + q_1 q_4\right) & 2\left(q_3^2 + q_4^2\right) - 1 \end{bmatrix}.
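A sketch of these relations (NumPy assumed; the function names are illustrative, and the convention follows the text with the scalar part as q_4): an axis-angle pair is converted to a quaternion and then to the rotation matrix above, and the "dual" representation is checked.

    import numpy as np

    def quaternion_from_axis_angle(axis, theta):
        """(q1, q2, q3, q4): vector part sin(theta/2)*axis, scalar part q4 = cos(theta/2)."""
        axis = np.asarray(axis, dtype=float) / np.linalg.norm(axis)
        return np.append(np.sin(theta / 2.0) * axis, np.cos(theta / 2.0))

    def matrix_from_quaternion(q):
        """Rotation matrix from a unit quaternion (q1, q2, q3, q4), scalar part last."""
        q1, q2, q3, q4 = q
        return np.array([
            [2*(q1*q1 + q4*q4) - 1, 2*(q1*q2 - q3*q4),     2*(q1*q3 + q2*q4)],
            [2*(q1*q2 + q3*q4),     2*(q2*q2 + q4*q4) - 1, 2*(q2*q3 - q1*q4)],
            [2*(q1*q3 - q2*q4),     2*(q2*q3 + q1*q4),     2*(q3*q3 + q4*q4) - 1],
        ])

    # q and -q represent one and the same rotation:
    q = quaternion_from_axis_angle([0.0, 0.0, 1.0], np.radians(30.0))
    print(np.allclose(matrix_from_quaternion(q), matrix_from_quaternion(-q)))  # True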


Numerical example

Consider the reorientation corresponding to the Euler angles \alpha = 10^\circ, \beta = 20^\circ, \gamma = 30^\circ relative to a given base vector system (\hat f_1, \hat f_2, \hat f_3). The corresponding matrix relative to this base vector system is (see Euler angles#Matrix orientation)

\begin{bmatrix} 0.771281 & -0.633718 & 0.059391 \\ 0.613092 & 0.714610 & -0.336824 \\ 0.171010 & 0.296198 & 0.939693 \end{bmatrix},

and the quaternion is

(0.171010, -0.030154, 0.336824, 0.925417).

The canonical form of this operator

\begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}

with \theta = 44.537^\circ is obtained with

\hat e_3 = (0.451272, -0.079571, 0.888832).

The quaternion relative to this new system is then

(0, 0, 0.378951, 0.925417) = \left(0, 0, \sin\frac{44.537^\circ}{2}, \cos\frac{44.537^\circ}{2}\right).

Instead of making the three Euler rotations 10°, 20°, 30°, the same orientation can be reached with one single rotation of size 44.537° around \hat e_3.
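The example can be reproduced with a short computation (a sketch assuming NumPy; the z-x-z Euler convention below is inferred from the quoted matrix and is an assumption): it recomputes the matrix, the axis and angle of the equivalent single rotation, and the quaternion.

    import numpy as np

    def Rz(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    def Rx(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

    a, b, g = np.radians([10.0, 20.0, 30.0])
    A = Rz(a) @ Rx(b) @ Rz(g)          # z-x-z Euler angles 10, 20, 30 degrees
    print(np.round(A, 6))              # reproduces the matrix quoted above

    # Axis and angle of the equivalent single rotation (see "The general case").
    E = 0.5 * np.array([A[2, 1] - A[1, 2], A[0, 2] - A[2, 0], A[1, 0] - A[0, 1]])
    sin_t, cos_t = np.linalg.norm(E), 0.5 * (np.trace(A) - 1.0)
    theta = np.arctan2(sin_t, cos_t)
    print(np.round(np.degrees(theta), 3), np.round(E / sin_t, 6))   # 44.537 and e_3

    # The quaternion: vector part sin(theta/2) * e_3, scalar part cos(theta/2).
    q = np.append(np.sin(theta / 2.0) * E / sin_t, np.cos(theta / 2.0))
    print(np.round(q, 6))              # (0.171010, -0.030154, 0.336824, 0.925417)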


References

* Shilov, Georgi (1961). An Introduction to the Theory of Linear Spaces. Prentice-Hall. Library of Congress 61-13845.