
Orthogonal transformation

In linear algebra, an orthogonal transformation is a linear transformation T: V \to V on a real inner product space V that preserves the inner product, meaning \langle Tv, Tw \rangle = \langle v, w \rangle for all vectors v, w \in V. Equivalently, it preserves the Euclidean norm of vectors (\|Tv\| = \|v\|) and the angles between them, ensuring that distances and angles are maintained under the mapping. When represented in the standard basis, an orthogonal transformation corresponds to an orthogonal matrix Q, a square matrix satisfying Q^T Q = I_n, where Q^T is the transpose of Q and I_n is the n \times n identity matrix; this implies Q^{-1} = Q^T. The columns (and rows) of Q form an orthonormal basis for \mathbb{R}^n, and the determinant of Q is either +1 (for proper rotations) or -1 (for improper rotations involving a reflection). Orthogonal transformations form the orthogonal group O(n), a group under matrix multiplication, consisting of all such matrices; the subgroup of proper rotations is the special orthogonal group SO(n). Common examples include rotations in the plane, given by matrices like \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}, and reflections across a hyperplane, both of which preserve vector lengths and orthogonality. These transformations are fundamental in various fields, including physics for describing rigid motions and coordinate changes, computer graphics for rotations and reflections, and numerical methods like QR decomposition, where the Q factor is orthogonal to stabilize computations. They also underpin data compression and dimensionality reduction techniques, such as the Karhunen-Loève transform, by enabling efficient, norm-preserving representations of data.

Definition and Properties

Definition

An orthogonal transformation is a linear transformation T: V \to V on a finite-dimensional real inner product space V that satisfies \langle T(\mathbf{u}), T(\mathbf{v}) \rangle = \langle \mathbf{u}, \mathbf{v} \rangle for all \mathbf{u}, \mathbf{v} \in V. This preservation of the inner product is equivalent to the transformation preserving the Euclidean norm, since \|T(\mathbf{v})\|^2 = \langle T(\mathbf{v}), T(\mathbf{v}) \rangle = \langle \mathbf{v}, \mathbf{v} \rangle = \|\mathbf{v}\|^2 for all \mathbf{v} \in V. While the primary focus is on real inner product spaces, the concept extends to complex spaces through unitary transformations, which preserve the Hermitian inner product in an analogous manner. The term originated in 19th-century work on quadratic forms, notably through the work of mathematicians such as Augustin-Louis Cauchy and Carl Gustav Jacob Jacobi, who employed such transformations in analyzing symmetric matrices. These transformations are precisely the linear isometries of Euclidean space, as preserving the inner product implies preserving distances between points.
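
The converse direction of this equivalence, that preserving norms already forces preservation of the inner product, follows from the polarization identity; a short derivation:

```latex
% Polarization identity on a real inner product space:
\[
  \langle \mathbf{u}, \mathbf{v} \rangle
  = \tfrac{1}{4}\bigl(\|\mathbf{u}+\mathbf{v}\|^2 - \|\mathbf{u}-\mathbf{v}\|^2\bigr).
\]
% If T is linear and norm-preserving, apply the identity to T(u) and T(v):
\[
  \langle T\mathbf{u}, T\mathbf{v} \rangle
  = \tfrac{1}{4}\bigl(\|T(\mathbf{u}+\mathbf{v})\|^2 - \|T(\mathbf{u}-\mathbf{v})\|^2\bigr)
  = \tfrac{1}{4}\bigl(\|\mathbf{u}+\mathbf{v}\|^2 - \|\mathbf{u}-\mathbf{v}\|^2\bigr)
  = \langle \mathbf{u}, \mathbf{v} \rangle.
\]
```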

Key Properties

Orthogonal transformations on a real inner product space preserve the inner product, which implies they preserve the Euclidean norm of vectors, ensuring that lengths remain unchanged under the transformation. This norm preservation forces the kernel to be trivial, so the transformation is injective and hence, on a finite-dimensional space, bijective and invertible. Specifically, the inverse of an orthogonal transformation T is its adjoint T^\dagger, satisfying T^{-1} = T^\dagger, where the adjoint is defined with respect to the inner product. A key algebraic consequence is closure under composition: if S and T are orthogonal transformations, then their composition S \circ T is also orthogonal, since applying the preservation property twice gives \langle S(T\mathbf{u}), S(T\mathbf{v}) \rangle = \langle T\mathbf{u}, T\mathbf{v} \rangle = \langle \mathbf{u}, \mathbf{v} \rangle. This composition property, combined with invertibility and the existence of the identity transformation (which is orthogonal), establishes that the set of all orthogonal transformations on an n-dimensional real inner product space forms a group under composition, known as the orthogonal group O(n). Furthermore, the preservation of the inner product implies that orthogonal transformations maintain angles between vectors. For any vectors \mathbf{u} and \mathbf{v}, the cosine of the angle \theta between T(\mathbf{u}) and T(\mathbf{v}) equals the cosine of the angle between \mathbf{u} and \mathbf{v}, given by \cos \theta = \frac{\langle T(\mathbf{u}), T(\mathbf{v}) \rangle}{\|T(\mathbf{u})\| \|T(\mathbf{v})\|} = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\|\mathbf{u}\| \|\mathbf{v}\|} due to the inner product preservation and norm invariance. This angle preservation underscores the isometric nature of orthogonal transformations in the algebraic framework.
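
These group properties are easy to verify numerically. The following NumPy sketch (the helper random_orthogonal is introduced here purely for illustration) checks closure under composition, inversion by the transpose, and angle preservation:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n: int) -> np.ndarray:
    """Draw a random n x n orthogonal matrix from the QR factorization
    of a Gaussian random matrix."""
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

n = 4
S, T = random_orthogonal(n), random_orthogonal(n)

# Closure: the composition S @ T is again orthogonal.
assert np.allclose((S @ T).T @ (S @ T), np.eye(n))

# Invertibility: the transpose is the inverse.
assert np.allclose(T.T @ T, np.eye(n))

# Angle preservation: the cosine of the angle between u and v is unchanged.
u, v = rng.standard_normal(n), rng.standard_normal(n)
cos_before = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
cos_after = ((T @ u) @ (T @ v)) / (np.linalg.norm(T @ u) * np.linalg.norm(T @ v))
assert np.isclose(cos_before, cos_after)
```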

Matrix Representation

Orthogonal Matrices

An orthogonal transformation on an n-dimensional real vector space can be represented by an n \times n real matrix Q in a chosen orthonormal basis, where Q satisfies the condition Q^\top Q = I_n, with I_n denoting the n \times n identity matrix. This defining property ensures that the matrix Q is invertible with inverse equal to its transpose, Q^{-1} = Q^\top. The columns of such a matrix Q form an orthonormal basis for \mathbb{R}^n, meaning each column vector has unit length and the columns are pairwise orthogonal; the rows satisfy the same orthonormality condition. This orthonormality directly corresponds to the transformation preserving lengths and angles in the basis representation. The determinant of an orthogonal matrix Q is either +1 or -1, reflecting whether the transformation is orientation-preserving (a proper rotation) or orientation-reversing (including a reflection). One explicit method to construct an orthogonal matrix is via the Gram-Schmidt process, which orthogonalizes and normalizes a given set of linearly independent vectors to produce an orthonormal basis; the resulting basis vectors can then serve as the columns (or rows) of Q.
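
As a concrete illustration, a minimal classical Gram-Schmidt sketch in NumPy (the input matrix A is an arbitrary example with linearly independent columns):

```python
import numpy as np

def gram_schmidt(A: np.ndarray) -> np.ndarray:
    """Orthonormalize the columns of A (assumed linearly independent) by the
    classical Gram-Schmidt process; the result satisfies Q.T @ Q = I."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v = v - (Q[:, i] @ A[:, j]) * Q[:, i]  # subtract projections onto earlier columns
        Q[:, j] = v / np.linalg.norm(v)            # normalize to unit length
    return Q

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: Q is orthogonal
print(np.linalg.det(Q))                 # +1 or -1
```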

Eigenvalues and Singular Value Decomposition

The eigenvalues of a real orthogonal matrix lie on the unit circle in the complex plane, meaning their absolute values are all equal to 1. This property arises because if \lambda is an eigenvalue with eigenvector v, then Qv = \lambda v implies \|Qv\| = |\lambda| \|v\|, and since Q preserves norms, |\lambda| = 1. For real orthogonal matrices, the eigenvalues are either real, in which case they must be \pm 1, or they occur in complex conjugate pairs e^{i\theta} and e^{-i\theta} for some real \theta. The real eigenvalues \pm 1 correspond to invariant directions that are fixed (+1) or reversed (-1) by the transformation, while the complex pairs reflect rotational components in invariant two-dimensional subspaces. The singular value decomposition (SVD) provides a factorization for any real matrix, and for a square orthogonal matrix Q it simplifies significantly. Specifically, Q = U \Sigma V^T, where U and V are orthogonal matrices, and \Sigma is a diagonal matrix with all diagonal entries equal to 1 (the singular values of Q). This form underscores that orthogonal matrices have full rank and unit singular values, aligning with their norm-preserving nature; in fact, one possible decomposition is Q = Q \, I \, I^T, taking U = Q and \Sigma = V = I. A special case arises with proper orthogonal matrices, those with determinant 1, which form the special orthogonal group SO(n) and represent pure rotations without reflection. Their eigenvalues consist of \pm 1 (with even multiplicity for -1) and complex conjugate pairs on the unit circle, with 1 having multiplicity at least 1 in odd dimensions.
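
A quick NumPy check of both facts, using a rotation about the z-axis as the example matrix:

```python
import numpy as np

theta = 0.7
# Proper orthogonal matrix (det = +1): rotation by theta about the z-axis.
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

eigvals = np.linalg.eigvals(Q)
print(np.abs(eigvals))   # [1. 1. 1.]: all eigenvalues lie on the unit circle
print(eigvals)           # the pair e^{+/- i theta} plus the real eigenvalue 1 (the axis)

U, s, Vt = np.linalg.svd(Q)
print(s)                 # [1. 1. 1.]: every singular value of an orthogonal matrix is 1
```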

Geometric Aspects

Preservation of Inner Products

Orthogonal transformations, by definition, preserve the inner product in Euclidean space: for any orthogonal transformation T: \mathbb{R}^n \to \mathbb{R}^n satisfying \langle T\mathbf{u}, T\mathbf{v} \rangle = \langle \mathbf{u}, \mathbf{v} \rangle for all vectors \mathbf{u}, \mathbf{v} \in \mathbb{R}^n, the geometric structure induced by the inner product remains unchanged. This preservation directly implies that norms are maintained, as \|T\mathbf{u}\| = \sqrt{\langle T\mathbf{u}, T\mathbf{u} \rangle} = \sqrt{\langle \mathbf{u}, \mathbf{u} \rangle} = \|\mathbf{u}\|, ensuring that lengths of vectors are invariant under T. Consequently, distances between points are also preserved, establishing orthogonal transformations as isometries of Euclidean space: d(T\mathbf{u}, T\mathbf{v}) = \|T\mathbf{u} - T\mathbf{v}\| = \|T(\mathbf{u} - \mathbf{v})\| = \|\mathbf{u} - \mathbf{v}\| = d(\mathbf{u}, \mathbf{v}). The angle \theta between two vectors \mathbf{u} and \mathbf{v} is defined via \cos \theta = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\|\mathbf{u}\| \|\mathbf{v}\|}, so the invariance of the inner product and norms ensures that angles are preserved under orthogonal transformations. This extends to orthogonality: two vectors \mathbf{u} and \mathbf{v} are perpendicular if \langle \mathbf{u}, \mathbf{v} \rangle = 0, and thus T\mathbf{u} \perp T\mathbf{v} if and only if \mathbf{u} \perp \mathbf{v}, maintaining perpendicular relationships in the space. In \mathbb{R}^n, this property implies that orthogonal transformations map the unit sphere \{\mathbf{x} \in \mathbb{R}^n : \|\mathbf{x}\| = 1\} to itself, as every point on the sphere has unit norm, which is preserved. In contrast to general linear transformations, which can distort shapes by scaling, shearing, or otherwise altering inner products, orthogonal transformations rigidly maintain the metric without such deformations. This geometric fidelity distinguishes them as the linear maps that respect the full structure of distances and angles in Euclidean space.
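
A short numerical illustration of the isometry and unit-sphere properties (a sketch; the random matrix stands in for any orthogonal transformation):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # a random orthogonal matrix

u, v = rng.standard_normal(3), rng.standard_normal(3)

# Distance preservation: d(Qu, Qv) = d(u, v).
assert np.isclose(np.linalg.norm(Q @ u - Q @ v), np.linalg.norm(u - v))

# The unit sphere maps to itself: unit vectors stay unit vectors.
x = u / np.linalg.norm(u)
assert np.isclose(np.linalg.norm(Q @ x), 1.0)
```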

Rotations and Reflections

Orthogonal transformations are broadly classified into two categories based on their determinant: proper orthogonal transformations with determinant 1, known as rotations, which preserve the orientation of the space; and improper orthogonal transformations with determinant -1, which reverse orientation and include reflections. Rotations represent orientation-preserving isometries that maintain angles between vectors, making them essential for describing rigid body motions without flipping. In two dimensions, a rotation by an angle \theta around the origin is represented by the orthogonal matrix \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}, which has determinant 1 and preserves the counterclockwise sense of angles. This matrix rotates any vector (x, y) to (\cos \theta \cdot x - \sin \theta \cdot y, \sin \theta \cdot x + \cos \theta \cdot y), moving each vector along a circle centered at the origin. Reflections, in contrast, are orientation-reversing orthogonal transformations, typically defined as reflections across a hyperplane, where the transformation fixes points in the hyperplane and negates the component along the normal direction. Such a reflection has eigenvalue +1 with multiplicity n-1 in n-dimensional space (for directions within the hyperplane) and a single eigenvalue of -1 corresponding to the unit normal vector to the hyperplane. A specific form used in numerical computations is the Householder reflection, given by the matrix I - 2 \mathbf{u} \mathbf{u}^T, where \mathbf{u} is a unit normal to the hyperplane and I is the identity matrix; this transformation is orthogonal with determinant -1 and is widely applied in algorithms like QR decomposition for its stability and efficiency. In three dimensions, rotations can be parameterized using Rodrigues' rotation formula, which constructs the rotation matrix from an axis-angle representation, or via unit quaternions, which offer a compact, singularity-free alternative for composing multiple rotations.
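
The Householder formula translates directly into code; a minimal sketch (the normal vector here is an arbitrary example):

```python
import numpy as np

def householder(u: np.ndarray) -> np.ndarray:
    """Reflection across the hyperplane with normal u: H = I - 2 u u^T,
    where u is first normalized to unit length."""
    u = u / np.linalg.norm(u)
    return np.eye(len(u)) - 2.0 * np.outer(u, u)

H = householder(np.array([1.0, 2.0, 2.0]))
print(np.allclose(H.T @ H, np.eye(3)))  # True: H is orthogonal
print(np.linalg.det(H))                 # -1.0: orientation-reversing
print(np.allclose(H @ H, np.eye(3)))    # True: reflecting twice is the identity
```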

Applications

In Physics and Engineering

In classical mechanics, the special orthogonal group SO(3) plays a central role in modeling the rotational dynamics of rigid bodies, where elements of SO(3) represent proper rotations that preserve orientation and distances in three-dimensional space. These transformations are essential for deriving the equations of motion, such as Euler's rigid body equations, which describe how the angular velocity of a rotating body evolves under torque-free conditions. The rotational invariance of physical laws under SO(3) symmetries leads to the conservation of angular momentum, as established by Noether's theorem, ensuring that the total angular momentum vector remains constant for isolated systems without external torques.

In quantum mechanics, orthogonal transformations extend to their complex counterparts, unitary operators, which maintain the inner product and underpin both time evolution and symmetry operations. The time evolution operator U(t) = e^{-iHt/\hbar}, where H is the Hamiltonian, is unitary, guaranteeing the preservation of probability norms as quantum states propagate. Unitary representations of SO(3) describe spatial symmetries, with angular momentum operators as generators, leading to conserved quantities like total angular momentum in systems invariant under rotations.

Orthogonal transformations are vital in signal processing through orthogonal wavelet transforms, which decompose signals into a multiresolution basis of orthonormal functions for analysis and compression. The Haar wavelet transform, an early and computationally simple example, applies successive averaging and differencing operations to extract low-frequency approximations and high-frequency details, enabling efficient compression by thresholding insignificant coefficients while retaining perceptual quality in applications like image and audio processing.

In robotics, orthogonal transformations facilitate coordinate alignments to represent end-effector orientations relative to the base frame, often using rotation matrices from SO(3). Euler angles parametrize these orientations via sequential rotations about body-fixed or fixed axes, providing an intuitive three-parameter description for manipulator control and path planning. However, Euler angles exhibit singularities, such as gimbal lock, where alignment of rotation axes reduces the effective degrees of freedom, necessitating alternative representations like quaternions for robust operation.
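
To make the averaging-and-differencing step concrete, here is a minimal single-level Haar analysis step in NumPy (the sample signal is illustrative; the 1/\sqrt{2} scaling is what makes the underlying transform matrix orthogonal):

```python
import numpy as np

def haar_step(x: np.ndarray) -> np.ndarray:
    """One level of the orthonormal Haar transform: pairwise averages
    (low-frequency approximation) followed by pairwise differences
    (high-frequency detail), each scaled by 1/sqrt(2)."""
    pairs = x.reshape(-1, 2)
    averages = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    details = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
    return np.concatenate([averages, details])

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
y = haar_step(x)
# Orthogonality implies energy preservation (Parseval): ||y|| equals ||x||.
print(np.isclose(np.linalg.norm(y), np.linalg.norm(x)))  # True
```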

In Statistics and Data Analysis

Orthogonal transformations play a central role in principal component analysis (PCA), a multivariate statistical technique used to reduce the dimensionality of data while preserving variance. In PCA, the covariance matrix of the centered data is orthogonally diagonalized to obtain the principal components, which are the eigenvectors of this matrix; these components form an orthonormal basis that transforms the original correlated variables into a new set of uncorrelated variables ordered by decreasing variance. The transformed variables are uncorrelated (and, for multivariate normal distributions, independent), facilitating the identification of the most informative directions in the data. The method was invented by Karl Pearson in 1901 and further developed by Harold Hotelling in 1933, who coined the name principal components, as a way to analyze complexes of statistical variables.

The Karhunen-Loève transform (KLT), closely related to PCA, provides an optimal orthogonal basis for representing stochastic processes or random vectors, particularly in data compression applications. By diagonalizing the covariance matrix of the data ensemble, the KLT projects the signal onto its eigenvectors, which capture the maximum energy with the fewest components, minimizing the mean squared reconstruction error for a given number of retained basis functions. This transform is signal-dependent and achieves the theoretical lower bound on distortion among orthogonal expansions, making it ideal for compressing correlated data such as images or electrocardiogram (ECG) signals. The KLT was developed independently by Kari Karhunen in 1946 and Michel Loève in 1948, building on earlier work in the theory of stochastic processes.

In hypothesis testing within analysis of variance (ANOVA), orthogonal contrasts are employed to partition the variation among group means into independent linear comparisons, ensuring that the tests for different hypotheses do not overlap in their use of the data. These contrasts, defined as linear combinations of group means with coefficients summing to zero and mutually orthogonal (i.e., the dot product of their coefficient vectors is zero), allow for the decomposition of the treatment sum of squares into non-overlapping components, maintaining the overall Type I error rate when performing multiple tests. For example, in a one-way ANOVA with multiple levels, orthogonal contrasts can test specific patterns like linear trends or comparisons between subsets of groups while keeping the comparisons statistically independent. This approach enhances the interpretability and power of planned comparisons in experimental designs.

Orthogonal transformations also improve numerical stability in solving least squares problems through QR decomposition, where a matrix is factored into an orthogonal matrix Q and an upper triangular matrix R. In linear regression, applying QR decomposition to the design matrix avoids the ill-conditioning of the normal equations, which square the condition number of the design matrix; because Q is orthogonal it preserves the Euclidean norm (\|Qx\| = \|x\|), leading to more accurate solutions even for ill-conditioned systems. This method is particularly valuable in high-dimensional data analysis, where small perturbations in the input can otherwise be greatly amplified. The stability benefits of QR-based least squares have been emphasized in numerical linear algebra literature since the 1960s.
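
A sketch of the two approaches on a nearly collinear design matrix (synthetic data; the perturbation size is chosen only to make the conditioning visible):

```python
import numpy as np

rng = np.random.default_rng(2)

# Design matrix with two nearly collinear columns: badly conditioned.
t = np.linspace(0.0, 1.0, 50)
X = np.column_stack([t, t + 1e-6 * rng.standard_normal(50)])
y = rng.standard_normal(50)

# QR-based least squares: factor X = QR, then solve R beta = Q^T y.
# Because Q is orthogonal, this step does not worsen the conditioning.
Q, R = np.linalg.qr(X)
beta_qr = np.linalg.solve(R, Q.T @ y)

# Normal equations: forming X^T X squares the condition number.
beta_ne = np.linalg.solve(X.T @ X, X.T @ y)

print(np.linalg.cond(X))        # large (roughly 1e6 here)
print(np.linalg.cond(X.T @ X))  # roughly the square of the above
```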
