
Exchange matrix

In linear algebra, the exchange matrix (also known as the reversal matrix, backward identity, or standard involutory permutation) is a square matrix of order n with ones along the main anti-diagonal (from the top-right to the bottom-left corner) and zeros elsewhere. For example, the 3×3 exchange matrix J is given by J = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}. This matrix reverses the order of the entries in any vector it multiplies (a column vector on the left, or a row vector on the right, thanks to its symmetry), effectively permuting the components from first to last.

The exchange matrix possesses several key properties that make it fundamental in matrix analysis. It is symmetric (J = J^T), orthogonal (J^T J = I), and involutory (J^2 = I), meaning it is its own transpose and its own inverse. These attributes ensure that multiplying by J is a reversible operation that preserves norms and inner products in vector spaces. Additionally, a matrix A is centrosymmetric if it commutes with J, i.e., J A J = A, which expresses symmetry with respect to the center of the matrix.

Exchange matrices play a crucial role in various applications, including the study of centrosymmetric and persymmetric matrices, where multiplication or conjugation with J transforms or symmetrizes structures such as Hankel and Toeplitz matrices. They also appear in numerical algorithms for solving structured linear systems, eigenvalue computations, and the analysis of structured matrices in signal processing and numerical linear algebra, facilitating efficient reversals and permutations without full recomputation.

Definition and Examples

Formal Definition

The n \times n exchange matrix J_n, also known as the reversal matrix, is a specific permutation matrix whose entries are given by J_{i,j} = \delta_{i, n+1-j}, where \delta denotes the Kronecker delta; this configuration places 1s exclusively along the anti-diagonal and 0s elsewhere. Equivalently, J_n is the identity matrix with its rows (or columns) reversed in order. As a permutation matrix, J_n represents the reversal permutation \sigma: \{1, \dots, n\} \to \{1, \dots, n\} defined by \sigma(k) = n+1-k, which acts on the standard basis by mapping the vector e_k to e_{n+1-k}. When applied to an arbitrary matrix A, the premultiplication J_n A reverses the order of the rows of A, while the postmultiplication A J_n reverses the order of the columns of A.
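
The definition can be checked directly in code. The following minimal sketch (assuming NumPy is available; the helper name exchange_matrix is illustrative) builds J_n as the identity matrix with its rows reversed and confirms the row- and column-reversal actions.

import numpy as np

def exchange_matrix(n):
    # J_n: the identity matrix with its rows reversed, i.e. 1s on the anti-diagonal.
    return np.eye(n, dtype=int)[::-1]

n = 4
J = exchange_matrix(n)
A = np.arange(n * n).reshape(n, n)

assert np.array_equal(J @ A, A[::-1, :])   # premultiplication reverses the rows of A
assert np.array_equal(A @ J, A[:, ::-1])   # postmultiplication reverses the columns of A
assert np.array_equal(J, np.fliplr(np.eye(n, dtype=int)))  # same matrix, built by column reversal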

Examples in Low Dimensions

In low dimensions, the exchange matrix provides concrete illustrations of its role as a reversal operator on basis vectors. For dimension n=2, the exchange matrix J_2 is given by J_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, which swaps the vectors e_1 = (1, 0)^T and e_2 = (0, 1)^T, since J_2 e_1 = e_2 and J_2 e_2 = e_1. This structure places 1s on the anti-diagonal, reflecting the general form in which an entry is 1 exactly when its row index plus column index equals n+1. For dimension n=3, the exchange matrix J_3 takes the form J_3 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}, reversing the order of the basis vectors so that J_3 e_1 = e_3, J_3 e_2 = e_2, and J_3 e_3 = e_1. To verify the reversal effect on a general vector, consider the action on the sample vector v = (1, 2, 3)^T: multiplying yields J_3 v = (3, 2, 1)^T, exchanging the first and third components while leaving the middle one unchanged.
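
A short check of these low-dimensional examples (a sketch, again assuming NumPy):

import numpy as np

J2 = np.array([[0, 1],
               [1, 0]])
J3 = np.array([[0, 0, 1],
               [0, 1, 0],
               [1, 0, 0]])

e1, e2 = np.eye(2)
assert np.array_equal(J2 @ e1, e2) and np.array_equal(J2 @ e2, e1)  # J_2 swaps e_1 and e_2

v = np.array([1, 2, 3])
print(J3 @ v)  # [3 2 1]: first and third components exchanged, middle unchanged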

Algebraic Properties

Symmetry and Involutory Nature

The exchange matrix J_n, also known as the reversal matrix, is symmetric, satisfying J_n^T = J_n. This arises from its structure, where the entries are 1 on the anti-diagonal and 0 elsewhere, so that (J_n)_{i,j} = \delta_{i, n+1-j}. Transposing the matrix reflects it across the main diagonal, but because the 1s lie on the anti-diagonal, the result is unchanged and the original form is preserved. Furthermore, J_n is involutory, meaning J_n^2 = I_n, where I_n is the n \times n identity matrix. This follows from the reversal it induces: multiplying a vector by J_n reverses its components, and applying the reversal again restores the original order. To see this algebraically, the (i,j)-th entry of J_n^2 is \sum_{k=1}^n (J_n)_{i,k} (J_n)_{k,j} = \sum_{k=1}^n \delta_{i, n+1-k} \delta_{k, n+1-j}; the only nonzero term occurs at k = n+1-j, where the first factor reduces to \delta_{i,j}, so the sum equals \delta_{i,j} and double reversal yields the identity. As a symmetric involutory matrix, J_n is orthogonal, satisfying J_n J_n^T = I_n: substituting the symmetry gives J_n J_n = I_n, which coincides with the involutory property and implies that multiplication by J_n preserves the Euclidean norm of vectors. This orthogonality underscores its role as the permutation matrix corresponding to the reversal operation.
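
These algebraic identities can be verified numerically; the sketch below (assuming NumPy) checks symmetry, the involutory property, orthogonality, and norm preservation for a representative dimension.

import numpy as np

n = 5
J = np.eye(n)[::-1]
I = np.eye(n)

assert np.array_equal(J, J.T)        # symmetric: J^T = J
assert np.array_equal(J @ J, I)      # involutory: J^2 = I
assert np.array_equal(J @ J.T, I)    # orthogonal: J J^T = I

v = np.random.default_rng(0).standard_normal(n)
assert np.isclose(np.linalg.norm(J @ v), np.linalg.norm(v))  # Euclidean norm is preserved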

Powers and Inverse

The exchange matrix J_n, being the permutation matrix associated with the reversal permutation \sigma(i) = n+1-i, satisfies J_n^2 = I_n, where I_n is the n \times n identity matrix. This relation holds because composing the reversal permutation with itself yields the identity permutation, as reversing the order twice restores the original sequence; explicitly, the right factor maps each basis vector e_j to e_{n+1-j} and the left factor maps it back to e_j, so J_n^2 fixes every basis vector. Consequently, the powers of J_n exhibit a simple periodicity: J_n^k = I_n for even positive integers k, and J_n^k = J_n for odd positive integers k. This order-2 cycling distinguishes the exchange matrix from permutation matrices corresponding to higher-order permutations, such as cycles of length greater than 2, whose powers require more steps to return to the identity. The self-inverse property follows directly from the squared identity: J_n^{-1} = J_n, confirming that J_n is an involution in the group of invertible matrices. For verification in general dimension n, one may compute the product J_n J_n entrywise, noting that the anti-diagonal structure sends each standard basis vector e_i to e_{n+1-i} and then back to e_i.
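
The period-2 behaviour of the powers and the self-inverse property can be checked directly (a sketch assuming NumPy):

import numpy as np

n = 6
J = np.eye(n, dtype=int)[::-1]
I = np.eye(n, dtype=int)

for k in range(1, 8):
    P = np.linalg.matrix_power(J, k)
    assert np.array_equal(P, I if k % 2 == 0 else J)   # J^k = I for even k, J for odd k

assert np.allclose(np.linalg.inv(J), J)                # J^{-1} = J: the matrix is its own inverse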

Spectral Properties

Eigenvalues and Eigenvectors

The exchange matrix J_n \in \mathbb{R}^{n \times n}, defined as the permutation matrix corresponding to the reversal permutation \sigma(i) = n+1-i, possesses only the eigenvalues +1 and -1. The algebraic multiplicity of +1 is \lceil n/2 \rceil, while that of -1 is \lfloor n/2 \rfloor. These values arise from the cycle decomposition of the reversal permutation, which consists of \lfloor n/2 \rfloor disjoint 2-cycles (each pairing indices i and n+1-i for i = 1, \dots, \lfloor n/2 \rfloor) and, when n is odd, one additional 1-cycle at the central index (n+1)/2. For a permutation matrix, the eigenvalues are the roots of unity determined by the lengths of its cycles: a 2-cycle yields eigenvalues +1 and -1, while a 1-cycle yields +1. Thus, the \lfloor n/2 \rfloor 2-cycles contribute \lfloor n/2 \rfloor instances each of +1 and -1, augmented by an extra +1 for odd n.

The corresponding eigenspaces reflect the symmetry properties induced by the reversal action. The eigenspace for eigenvalue +1 comprises all palindromic vectors \mathbf{v} satisfying v_i = v_{n+1-i} for i = 1, \dots, n, forming a subspace of dimension \lceil n/2 \rceil (spanned by vectors whose first \lceil n/2 \rceil components are chosen freely and whose remaining components are mirrored). The eigenspace for eigenvalue -1 comprises all skew-palindromic (or anti-palindromic) vectors satisfying v_i = -v_{n+1-i} for i = 1, \dots, n (with the central component zero if n is odd), forming a subspace of dimension \lfloor n/2 \rfloor. These characterizations follow from the fact that J_n is symmetric, so the space decomposes orthogonally into subspaces that are symmetric and skew-symmetric with respect to reversal.

For illustration with n=3, the exchange matrix is J_3 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}. To find the eigenvectors, solve (J_3 - \lambda I_3) \mathbf{v} = \mathbf{0} for each eigenvalue. For \lambda = -1, the equation J_3 \mathbf{v} = -\mathbf{v} yields v_3 = -v_1, v_2 = -v_2 (implying v_2 = 0), and v_1 = -v_3, so a basis vector is \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} (up to scaling); direct verification confirms J_3 \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} = \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix} = -\begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}. For \lambda = +1 (multiplicity 2), the equation J_3 \mathbf{v} = \mathbf{v} yields v_3 = v_1 and v_1 = v_3 with v_2 free, so the eigenspace consists of all vectors of the form \begin{pmatrix} a \\ b \\ a \end{pmatrix} for scalars a, b; basis vectors are \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix} and \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}. Any palindromic vector such as \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} also lies in this eigenspace, satisfying J_3 \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}.
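
The eigenvalue multiplicities and the palindromic/skew-palindromic eigenvectors can be checked numerically; this sketch (assuming NumPy) counts the +1 and -1 eigenvalues for several dimensions and verifies the n = 3 eigenvectors described above.

import numpy as np

for n in range(2, 8):
    J = np.eye(n)[::-1]
    w = np.linalg.eigvalsh(J)                            # J is symmetric, so eigenvalues are real
    assert np.sum(np.isclose(w, 1.0)) == (n + 1) // 2    # ceil(n/2) eigenvalues equal to +1
    assert np.sum(np.isclose(w, -1.0)) == n // 2         # floor(n/2) eigenvalues equal to -1

J3 = np.eye(3)[::-1]
assert np.array_equal(J3 @ [1, 0, -1], [-1, 0, 1])       # skew-palindromic vector: eigenvalue -1
assert np.array_equal(J3 @ [1, 2, 1], [1, 2, 1])         # palindromic vector: eigenvalue +1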

Trace and Determinant

The trace of the exchange matrix J_n, defined as the sum of its diagonal entries, equals 1 when n is odd, owing to the single 1 at the central position i = (n+1)/2 where the main diagonal and the anti-diagonal coincide, and equals 0 when n is even, since no diagonal entry is 1 in that case. The determinant of J_n is (-1)^{\lfloor n/2 \rfloor}, which follows from viewing J_n as the permutation matrix corresponding to the reversal permutation \sigma(i) = n+1-i; the sign of this permutation is determined by the parity of its number of inversions, equal to \binom{n}{2} = n(n-1)/2, which has the same parity as \lfloor n/2 \rfloor. Alternatively, the determinant can be computed via cofactor expansion along the first row, yielding a recurrence that confirms the closed form. These scalars align with the spectral properties of J_n: the trace equals the sum of the eigenvalues (with multiplicity), \lceil n/2 \rceil - \lfloor n/2 \rfloor, while the determinant equals their product, (-1)^{\lfloor n/2 \rfloor}, providing invariants that reflect the matrix's involutory and permutation structure.
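
A quick numerical confirmation of these formulas (a sketch assuming NumPy):

import numpy as np

for n in range(1, 10):
    J = np.eye(n)[::-1]
    expected_trace = 1 if n % 2 == 1 else 0                 # single central 1 only when n is odd
    assert np.trace(J) == expected_trace
    assert np.isclose(np.linalg.det(J), (-1) ** (n // 2))   # det J_n = (-1)^floor(n/2)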

Relationships to Other Concepts

As a Permutation Matrix

The exchange matrix J_n, also known as the reversal matrix, belongs to the class of permutation matrices and specifically represents the reversal permutation \sigma, defined by \sigma(i) = n + 1 - i for i = 1, 2, \dots, n. Like all permutation matrices, J_n is a square matrix with exactly one 1 in each row and each column and zeros elsewhere; the entry (J_n)_{i,j} equals 1 if and only if j = n + 1 - i, ensuring it permutes the standard basis vectors by reversing their order. The reversal permutation \sigma can be expressed as a product of \lfloor n/2 \rfloor disjoint transpositions, each swapping positions i and n + 1 - i for i = 1 to \lfloor n/2 \rfloor. Since each transposition is an odd permutation, the overall sign of \sigma (and thus the determinant of J_n) is (-1)^{\lfloor n/2 \rfloor}, making \sigma even when \lfloor n/2 \rfloor is even and odd otherwise. In contrast to general permutation matrices, which may correspond to arbitrary bijections and often involve longer cycles, the reversal permutation is an involution, satisfying \sigma^2 = \mathrm{id} (the identity permutation) and hence J_n^2 = I_n. This property arises directly from the reversal action, as applying it twice restores the original order. Regarding fixed points, \sigma has none when n is even, but for odd n there is exactly one fixed point at the middle index i = (n+1)/2, where \sigma(i) = i.
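
The cycle structure of the reversal permutation, its fixed points, and its sign can be made explicit with a few lines of plain Python (a sketch; the helper names are illustrative):

def reversal_permutation(n):
    # sigma(i) = n + 1 - i, using 1-based indices.
    return {i: n + 1 - i for i in range(1, n + 1)}

n = 5
sigma = reversal_permutation(n)

transpositions = [(i, n + 1 - i) for i in range(1, n // 2 + 1)]   # floor(n/2) disjoint 2-cycles
fixed_points = [i for i in sigma if sigma[i] == i]                # middle index only when n is odd
sign = (-1) ** len(transpositions)                                # sign of sigma = det(J_n)

print(transpositions)   # [(1, 5), (2, 4)]
print(fixed_points)     # [3]
print(sign)             # 1, since floor(5/2) = 2 is even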

Connections to Symmetric Matrix Classes

The exchange matrix J_n, also known as the reversal matrix or counteridentity, serves as a fundamental tool for characterizing various classes of square matrices that exhibit reversal-based symmetries. These classes arise from specific commutation or conjugation relations involving J_n, which enforce structural properties invariant under row or column reversals. Such matrices are prevalent in numerical linear algebra and in applications that exploit symmetry for efficient computation.

Centrosymmetric matrices are defined as square matrices A satisfying J_n A J_n = A, a condition that implies symmetry under a 180-degree rotation about the matrix center, or equivalently a_{i,j} = a_{n+1-i, n+1-j} for all entries. This conjugation with J_n preserves the matrix structure, and the class is closed under inversion and multiplication when these operations are defined. The exchange matrix thus generates this class by enforcing bilateral reversal invariance, which simplifies eigenvalue computations and decompositions in structured problems.

Persymmetric matrices, in contrast, satisfy A J_n = J_n A^T, reflecting symmetry of the entries across the anti-diagonal, i.e. a_{i,j} = a_{n+1-j, n+1-i}. This relation highlights the role of J_n in combining transposition with reversal, and the class is preserved under inversion in the nonsingular case. Persymmetric structures frequently arise for Toeplitz matrices, aiding banded and constant-diagonal analyses.

Bisymmetric matrices combine centrosymmetry and persymmetry, satisfying both J_n A J_n = A and A J_n = J_n A^T; equivalently, they are the symmetric centrosymmetric matrices, with A = A^T and J_n A = A J_n. This commutation with J_n implies symmetry about both the main diagonal and the anti-diagonal, forming a subclass in which the exchange matrix acts as a symmetry operator. The involutory nature of the exchange matrix (J_n^2 = I) keeps these classes well defined and algebraically tractable, as seen in bisymmetric Toeplitz matrices used in signal processing for symmetric filter design.
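
The defining relations of these classes translate directly into small numerical tests. A minimal sketch (assuming NumPy; the predicate names are illustrative) checks that a symmetric Toeplitz matrix is simultaneously centrosymmetric and persymmetric, hence bisymmetric.

import numpy as np

def exchange(n):
    return np.eye(n)[::-1]

def is_centrosymmetric(A):
    J = exchange(A.shape[0])
    return np.allclose(J @ A @ J, A)          # J A J = A

def is_persymmetric(A):
    J = exchange(A.shape[0])
    return np.allclose(A @ J, J @ A.T)        # A J = J A^T

def is_bisymmetric(A):
    return is_centrosymmetric(A) and is_persymmetric(A)

# Symmetric Toeplitz matrix with first column [4, 1, 2, 3]: T[i, j] = c[|i - j|].
c = [4, 1, 2, 3]
T = np.array([[c[abs(i - j)] for j in range(4)] for i in range(4)])

print(is_centrosymmetric(T), is_persymmetric(T), is_bisymmetric(T))   # True True True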

Applications

Row and Column Operations

The exchange matrix J_n, also known as the reversal matrix, facilitates row and column reversals in linear algebra computations: pre-multiplying a matrix A by J_n reverses the rows of A, while post-multiplying reverses its columns, effectively reordering the elements about the anti-diagonal. This operation is particularly useful in algorithms that require standardized matrix forms without altering the underlying linear structure. In Gaussian elimination, J_n functions as an elementary-like matrix that achieves a full row reversal, equivalent to a sequence of pairwise row interchanges, although standard implementations prefer individual swaps during pivoting so that zero or small pivots can be handled stably. The determinant of J_n, given by \det(J_n) = (-1)^{n(n-1)/2}, records the parity of this sequence of swaps and thus accounts for the sign change introduced in a determinant computation when a full reversal is applied. For broader matrix manipulations, the exchange matrix enables transformations between structured forms, such as converting a Toeplitz matrix T into a Hankel matrix via the one-sided product J_n T (or T J_n), which flips the constant diagonals into constant anti-diagonals; conversely, the same products map Hankel matrices to Toeplitz form, a conversion commonly employed in algorithms for structured linear systems. This reversal aids in exploiting symmetries for efficient solution of systems involving Toeplitz-plus-Hankel matrices. Computationally, applying J_n to a matrix costs O(n^2), since the permutation structure reduces the product to a reordering of rows or columns rather than a full matrix multiplication, allowing efficient implementation in numerical software for large-scale linear algebra tasks.
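
The Hankel/Toeplitz interchange is easy to see in code. The sketch below (assuming NumPy) builds a Hankel matrix H with entries h_{i+j} and shows that reversing its rows with J_n produces a matrix whose entries depend only on j - i, i.e. a Toeplitz matrix.

import numpy as np

n = 4
J = np.eye(n, dtype=int)[::-1]

h = np.arange(2 * n - 1)                                         # generating sequence h_0, ..., h_{2n-2}
H = np.array([[h[i + j] for j in range(n)] for i in range(n)])   # Hankel: H[i, j] = h[i + j]

T = J @ H                                                        # row reversal: T[i, j] = h[(n - 1 - i) + j]
for i in range(n):
    for j in range(n):
        assert T[i, j] == h[n - 1 + j - i]                       # depends only on j - i, so T is Toeplitz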

Signal Processing and Reversal

In digital signal processing, the exchange matrix J_n performs time reversal of a discrete-time signal represented as an n-dimensional vector \mathbf{x}: the product J_n \mathbf{x} reorders the components so that the k-th element becomes x(n-1-k) for k = 0, 1, \dots, n-1. This implements the time-reversal transformation y(k) = x(-k) (up to a delay) in discrete time, preserving the signal's energy while flipping its temporal orientation. It is particularly valuable in applications requiring coefficient symmetry, such as the design of linear-phase filters, where J_n enforces the symmetric or antisymmetric impulse-response properties needed to avoid phase distortion. Additionally, in computing autocorrelation functions for stationary processes, J_n relates the forward and backward prediction error filters, enabling efficient use of the autocorrelation matrix's Toeplitz structure via identities such as R_{xx} = J_n R_{xx} J_n for real stationary signals, where R_{xx} is the signal's autocorrelation matrix.

The exchange matrix extends to polynomial processing, where it reverses the coefficients of a polynomial p(z) = \sum_{k=0}^{n-1} a_k z^k represented by the vector \mathbf{a} = [a_0, a_1, \dots, a_{n-1}]^T, producing J_n \mathbf{a} and transforming p(z) into the reversed form z^{n-1} p(1/z) = \sum_{k=0}^{n-1} a_{n-1-k} z^k. This reversal is essential in polynomial matrix theory for tasks such as stabilizing control systems or analyzing reciprocal polynomials in filter design. In eigenvalue problems for matrix polynomials, the reversal operation helps treat finite and infinite eigenvalues symmetrically, as in the definition of the reversal matrix polynomial P^R(\lambda) = \lambda^{\deg P} P(1/\lambda), which aids in constructing canonical forms. For the two-dimensional case, the exchange matrix J_2 equals the Pauli X matrix \sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, a building block in quantum computing and quantum information processing.

A representative application is matched filtering in discrete-time radar or communications systems, where the optimal receiver filter for a known signal \mathbf{s} uses the time-reversed conjugate \mathbf{h} = J_n \mathbf{s}^* (for real signals, simply J_n \mathbf{s}), correlating the received signal with this reversed template to achieve the maximum output signal-to-noise ratio at the decision instant. The reversal ensures that the filter's impulse response aligns the signal's energy peak with the output sampling time, improving detection probability by concentrating the correlated components.
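
As an illustration of matched filtering with a time-reversed template, the following sketch (assuming NumPy; the signal and noise are synthetic) embeds a known real signal in noise and shows that filtering with h = J_n s concentrates the correlation energy near the alignment instant.

import numpy as np

rng = np.random.default_rng(0)
n = 64
s = rng.standard_normal(n)                      # known real signal template
J = np.eye(n)[::-1]
h = J @ s                                       # matched filter: time-reversed template (real case)

received = np.concatenate([0.2 * rng.standard_normal(100),
                           s,
                           0.2 * rng.standard_normal(100)])
y = np.convolve(received, h)                    # convolving with h correlates the input with s

print(int(np.argmax(y)))                        # peak near 100 + n - 1: the template alignment instant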
