
Commuting matrices

In linear algebra, two square matrices A and B over a field (typically the real or complex numbers) are said to commute if AB = BA, meaning the order of multiplication does not affect the result. This property is expressed using the commutator [A, B] = AB - BA = 0. While matrix multiplication is generally non-commutative—unlike scalar multiplication—the commuting case is central to many advanced results in the field. A key theorem states that if A and B are both diagonalizable and commute, then there exists a single invertible matrix S such that S^{-1}AS and S^{-1}BS are both diagonal, allowing them to share a common eigenbasis. This simultaneous diagonalization simplifies computations and reveals structural similarities between the matrices. For Hermitian (or real symmetric) matrices, which arise frequently in applications such as physics and optimization, commutativity implies simultaneous diagonalization via a unitary similarity transformation, preserving inner products and norms. In quantum mechanics, commuting Hermitian operators represent compatible observables that can be measured simultaneously with definite outcomes, as they share common eigenvectors. More broadly, over the complex numbers, a family of commuting matrices is simultaneously upper triangularizable, with further implications for representations of algebras, Jordan canonical forms, and stability analysis in dynamical systems.

Fundamentals

Definition

In linear algebra, matrix multiplication is generally non-commutative, meaning that for arbitrary square matrices A and B of the same order n \times n, the product AB does not necessarily equal BA. Two such matrices A, B \in M_n(\mathbb{F}), where \mathbb{F} is a field (such as the real or complex numbers), are said to commute if AB = BA. This condition is equivalently expressed through the vanishing of the commutator [A, B] = AB - BA = 0. The concept applies specifically to square matrices of the same size, since matrix multiplication requires compatible dimensions for both AB and BA to be defined and comparable; non-square matrices do not admit a standard commutativity relation in this sense.
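The definition can be checked numerically. A minimal NumPy sketch (the matrices here are arbitrary illustrative examples, not drawn from any particular source):

```python
import numpy as np

# Illustrative 2x2 examples: matrix multiplication is generally non-commutative.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

def commutator(X, Y):
    """Return [X, Y] = XY - YX."""
    return X @ Y - Y @ X

def commute(X, Y):
    """Check whether X and Y commute, i.e. [X, Y] = 0."""
    return np.allclose(commutator(X, Y), np.zeros_like(X))

print(commute(A, B))      # False: this pair does not commute
print(commute(A, A @ A))  # True: any matrix commutes with its own powers
```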

Basic Properties

The set of all matrices that commute with a fixed matrix A \in M_n(\mathbb{F}), denoted the centralizer C(A) = \{ B \in M_n(\mathbb{F}) \mid AB = BA \}, forms a subalgebra of the full matrix algebra M_n(\mathbb{F}) over the field \mathbb{F}. This subalgebra is closed under addition and scalar multiplication, as if B, C \in C(A) and \alpha, \beta \in \mathbb{F}, then A(\alpha B + \beta C) = \alpha AB + \beta AC = \alpha BA + \beta CA = (\alpha B + \beta C)A. It is also closed under matrix multiplication, since if B, C \in C(A), then A(BC) = (AB)C = (BA)C = B(AC) = B(CA) = (BC)A. The identity matrix I commutes with every matrix A, as IA = AI = A, so I \in C(A) for all A. Similarly, every scalar multiple of the identity, cI for c \in \mathbb{F}, commutes with A because (cI)A = cA = A(cI). Consequently, the centralizer of the identity is the entire matrix algebra: C(I) = M_n(\mathbb{F}). A trivial but illustrative case arises with diagonal matrices: any two diagonal matrices D_1, D_2 \in D_n(\mathbb{F}) (the set of n \times n diagonal matrices over \mathbb{F}) commute, as their product multiplies corresponding diagonal entries, and multiplication in \mathbb{F} is commutative. In fact, the centralizer of a diagonal matrix with distinct diagonal entries is precisely the set of diagonal matrices. While commuting matrices preserve each other's eigenspaces—for if Av = \lambda v, then A(Bv) = B(Av) = \lambda (Bv), so Bv is again an eigenvector of A with eigenvalue \lambda unless Bv = 0—they do not necessarily share a complete set of common eigenvectors. For instance, the identity matrix commutes with every matrix but has the full space as its sole eigenspace, whereas a non-diagonalizable matrix such as a Jordan block has a proper eigenspace of dimension less than n.
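The closure properties of the centralizer, and the diagonal example, can be verified numerically. A NumPy sketch with illustrative matrices (the diagonal matrix A is chosen with distinct entries, so its centralizer is exactly the diagonal matrices):

```python
import numpy as np

rng = np.random.default_rng(0)

# A diagonal matrix with distinct entries (illustrative example).
A = np.diag([1.0, 2.0, 3.0])

# Two members of the centralizer C(A): diagonal matrices commute with A.
B = np.diag(rng.standard_normal(3))
C = np.diag(rng.standard_normal(3))

def commutes(X, Y):
    return np.allclose(X @ Y, Y @ X)

# Closure of C(A) under linear combinations and products.
assert commutes(A, 2.0 * B - 3.0 * C)
assert commutes(A, B @ C)

# A matrix with a nonzero off-diagonal entry does not commute with A,
# illustrating that C(A) contains only diagonal matrices here:
# (AE)[0,1] = 1 while (EA)[0,1] = 2.
E = np.zeros((3, 3))
E[0, 1] = 1.0
print(commutes(A, E))  # False
```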

Advanced Properties

Commuting Families

A commuting family of matrices is a finite or infinite collection \{A_1, A_2, \dots, A_k\} of matrices in M_n(\mathbb{F}), where \mathbb{F} is a field, such that A_i A_j = A_j A_i for all i, j. These families extend the notion of pairwise commutativity to multiple matrices and play a key role in understanding the structure of commutative substructures within the full matrix algebra M_n(\mathbb{F}). The centralizer of a single matrix A \in M_n(\mathbb{F}), denoted Z(A), is the subalgebra consisting of all matrices B \in M_n(\mathbb{F}) that commute with A, i.e., Z(A) = \{B \in M_n(\mathbb{F}) \mid AB = BA\}. This set forms an associative algebra under matrix addition and multiplication, and any commuting family containing A is contained in Z(A). For n \times n matrices over an algebraically closed field, the dimension of Z(A) satisfies \dim Z(A) \geq n, with equality if and only if A is a cyclic (non-derogatory) matrix, i.e., the minimal polynomial of A has degree n. In this minimal case, Z(A) coincides with the algebra of all polynomials in A. A commuting family \{A_1, \dots, A_k\} generates a commutative subalgebra of M_n(\mathbb{F}), meaning the \mathbb{F}-span of all products of the A_i (including the identity) is abelian under multiplication. Such subalgebras are of particular interest because their structure reflects the joint properties of the family. For instance, if the matrices are diagonalizable over \mathbb{C}, the family admits simultaneous diagonalization. A canonical example of a commuting family is the set of all diagonal n \times n matrices over \mathbb{F}, which pairwise commute since their products act entrywise on the diagonals. This family generates a commutative subalgebra of dimension n.
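The bound \dim Z(A) \geq n can be checked numerically by vectorizing the map X \mapsto AX - XA with the standard Kronecker-product identity \mathrm{vec}(AX - XA) = (I \otimes A - A^{\mathsf{T}} \otimes I)\,\mathrm{vec}(X) and computing the nullity. A NumPy sketch with illustrative matrices:

```python
import numpy as np

def centralizer_dim(A):
    """Dimension of Z(A) = {X : AX = XA}, computed as the nullity of
    the vectorized commutation map X -> AX - XA."""
    n = A.shape[0]
    M = np.kron(np.eye(n), A) - np.kron(A.T, np.eye(n))
    return n * n - np.linalg.matrix_rank(M)

# Companion matrix of p(x) = x^3 - 1 (1's on the superdiagonal, last row 1,0,0):
# cyclic (non-derogatory), so dim Z(A) attains the minimum n.
C = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
print(centralizer_dim(C))          # 3

# The identity is the opposite extreme: everything commutes with it.
print(centralizer_dim(np.eye(3)))  # 9 = n^2
```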

Simultaneous Diagonalization and Triangularization

A commuting family of diagonalizable matrices over the complex numbers can be simultaneously diagonalized via a single invertible similarity transformation. Specifically, if \{A_\alpha\} \subset M_n(\mathbb{C}) is a family of pairwise commuting diagonalizable matrices, there exists an invertible matrix S \in M_n(\mathbb{C}) such that S^{-1} A_\alpha S is diagonal for every \alpha. This result follows from the fact that such families share a common eigenbasis, allowing a unified diagonal form. For the special case of normal matrices, which are unitarily diagonalizable individually, the theorem strengthens: a family of commuting normal matrices is simultaneously unitarily diagonalizable, meaning there exists a unitary U \in M_n(\mathbb{C}) such that U^* A_\alpha U is diagonal for all \alpha. This extends the spectral theorem for normal matrices to families and underscores the role of normality in allowing a unitary, rather than merely invertible, change of basis. More generally, any family of pairwise commuting complex matrices admits simultaneous upper triangularization, regardless of diagonalizability. That is, for commuting A, B \in M_n(\mathbb{C}), there exists an invertible P \in M_n(\mathbb{C}) such that both P^{-1} A P and P^{-1} B P are upper triangular. This holds for arbitrary finite or infinite commuting families \{A_\alpha\} \subset M_n(\mathbb{C}); moreover, by Schur's theorem the transformation can be taken unitary, so a single unitary U renders all U^* A_\alpha U upper triangular. The diagonal entries in this form correspond to the eigenvalues, ordered compatibly across the family, reflecting shared spectral properties such as common eigenvectors. A deeper algebraic condition governs simultaneous diagonalizability: a commuting family is simultaneously diagonalizable if and only if the associative algebra it generates is semisimple.
In this context, the generated algebra is commutative due to pairwise commutativity, and semisimplicity ensures decomposition into a direct sum of simple components (copies of \mathbb{C}), enabling a basis of common eigenvectors. However, not all commuting families satisfy this; for instance, a family containing a nonzero nilpotent Jordan block of size greater than 1 generates a non-semisimple algebra and cannot be simultaneously diagonalized, though it remains simultaneously triangularizable. Such limitations highlight the distinction between triangular and diagonal forms in non-diagonalizable cases.
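Simultaneous diagonalization can be sketched numerically under the simplifying assumption that one matrix has distinct eigenvalues, which forces its eigenvector matrix to diagonalize any commuting partner. The pair below is constructed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a commuting pair by conjugating two diagonal matrices with the
# same invertible S, so they share the eigenbasis given by the columns of S.
n = 4
S = rng.standard_normal((n, n))
S_inv = np.linalg.inv(S)
A = S @ np.diag([1.0, 2.0, 3.0, 4.0]) @ S_inv
B = S @ np.diag([5.0, 6.0, 7.0, 8.0]) @ S_inv
assert np.allclose(A @ B, B @ A)

# A has distinct eigenvalues, so its eigenvector matrix P must also
# diagonalize B: P^{-1} B P is diagonal (up to roundoff).
eigvals, P = np.linalg.eig(A)
D_B = np.linalg.inv(P) @ B @ P
off_diagonal = D_B - np.diag(np.diag(D_B))
print(np.allclose(off_diagonal, 0))  # True
```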

Characterizations

Algebraic Characterizations

Algebraic characterizations of commuting matrices focus on conditions expressed through polynomial identities, algebraic structures, and properties within the ring of matrices. A fundamental result states that if two n \times n matrices A and B over a field commute (i.e., AB = BA), and A is non-derogatory—meaning its minimal polynomial has degree n, equal to the degree of its characteristic polynomial—then B can be expressed as a polynomial in A of degree at most n-1. This characterization highlights how commutativity restricts the centralizer of A to the algebra generated by A itself when A achieves the maximal possible degree for its minimal polynomial. The converse also holds: if B = p(A) for some polynomial p, then AB = BA trivially, as polynomials in A commute with A. This equivalence is central to understanding the structure of commutative subalgebras in matrix rings. A related perspective arises from the algebra generated by A and B. If A and B commute, the unital algebra they generate—the span of all products of A and B—is commutative; conversely, commutativity of this generated algebra implies in particular that AB = BA. Under the non-derogatory assumption, the generated algebra collapses to the polynomials in A alone, so commutativity is equivalent to the generated algebra being commutative, with no relations beyond polynomial ones in the generators. From a Lie-theoretic viewpoint, the commutator bracket [\cdot, \cdot] defines the Lie algebra \mathfrak{gl}(n) of n \times n matrices. If A and B commute, the Lie subalgebra generated by A and B—spanned by linear combinations xA + yB for scalars x, y—is abelian, as [xA + yB, x'A + y'B] = (xy' - yx')[A, B] = 0 for all scalars. This abelianity is a direct algebraic consequence of [A, B] = 0 and characterizes pairs for which the generated Lie structure has vanishing brackets, distinguishing them from non-abelian pairs.
Seminal treatments emphasize this as a foundational property linking matrix commutativity to broader Lie theory. An additional algebraic condition involves traces. The identity \operatorname{tr}(A^k B^m) = \operatorname{tr}(B^m A^k) holds for all matrices by the cyclic property of the trace, so it carries no information about commutativity; a genuinely restrictive consequence is that if A and B commute, then \operatorname{tr}((AB)^k) = \operatorname{tr}(A^k B^k) for all non-negative integers k, since (AB)^k = A^k B^k when AB = BA. This equality is necessary but not sufficient: for example, any simultaneously triangularizable pair satisfies it, including pairs that do not commute. Trace conditions thus provide a verifiable algebraic test, albeit an incomplete one, for potential commutativity. Finally, invariant subspaces offer another algebraic lens, rooted in early work of Frobenius on matrix algebras: if A and B commute, then every eigenspace and generalized eigenspace of A is invariant under B, so the two matrices act compatibly on a common lattice of invariant subspaces; conversely, if the pair shares no common nontrivial invariant subspace, the algebra they generate acts irreducibly on the whole space. This ties commutativity to the coincidence of invariant subspace structures without invoking spectral data.
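Trace-based tests can be illustrated numerically: since (AB)^k = A^k B^k for commuting matrices, \operatorname{tr}((AB)^k) = \operatorname{tr}(A^k B^k) is a necessary condition for commutativity. A NumPy sketch with illustrative matrices, including a simultaneously triangularizable pair that passes the test without commuting:

```python
import numpy as np

def trace_condition(A, B, kmax=4):
    """Necessary test for AB = BA: tr((AB)^k) == tr(A^k B^k) for k = 1..kmax,
    since (AB)^k = A^k B^k when A and B commute."""
    mp = np.linalg.matrix_power
    return all(
        np.isclose(np.trace(mp(A @ B, k)), np.trace(mp(A, k) @ mp(B, k)))
        for k in range(1, kmax + 1)
    )

# A commuting pair (B is a polynomial in A): the condition holds.
A = np.array([[1.0, 2.0], [0.0, 3.0]])
B = A @ A + 2.0 * np.eye(2)
print(trace_condition(A, B))                             # True

# A non-commuting pair failing already at k = 2:
# tr((PQ)^2) = 1 while tr(P^2 Q^2) = 0.
P = np.array([[0.0, 1.0], [0.0, 0.0]])
Q = np.array([[0.0, 0.0], [1.0, 0.0]])
print(trace_condition(P, Q))                             # False

# Not sufficient: N and D are both upper triangular, so they pass the
# test for every k, yet ND != DN.
N = np.array([[0.0, 1.0], [0.0, 0.0]])
D = np.diag([1.0, 2.0])
print(trace_condition(N, D), np.allclose(N @ D, D @ N))  # True False
```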

Spectral and Geometric Characterizations

Commuting matrices exhibit profound connections to spectral theory, particularly through their interactions with eigenvalues and eigenspaces. A fundamental property arises when considering an eigenvector of one matrix under the action of a commuting partner. Suppose A and B are matrices satisfying AB = BA, and let v be an eigenvector of A with eigenvalue \lambda, so Av = \lambda v. Then A(Bv) = B(Av) = B(\lambda v) = \lambda (Bv), which implies that Bv is also an eigenvector of A corresponding to the same eigenvalue \lambda (unless Bv = 0). This demonstrates that B maps the eigenspace of A for \lambda into itself, preserving the eigenspace. More generally, if A and B commute, each preserves the eigenspaces of the other. That is, the eigenspace E_\lambda(A) = \{ v \mid Av = \lambda v \} is invariant under B, meaning B(E_\lambda(A)) \subseteq E_\lambda(A). If A and B are both diagonalizable, then they are simultaneously diagonalizable, admitting a common eigenbasis in which both are diagonal. From a geometric viewpoint, commutativity of linear operators on a vector space implies the existence of jointly invariant subspaces. The eigenspaces of one operator serve as invariant subspaces for the other, allowing the operators to act compatibly on these subspaces. This joint invariance facilitates the decomposition of the space into common invariant components, underscoring commutativity as a condition for aligned geometric structures in the actions of the operators. For non-diagonalizable matrices, the spectral characterization extends to generalized eigenspaces. If AB = BA, then B preserves each generalized eigenspace of A, defined as G_\lambda(A) = \{ v \mid (A - \lambda I)^k v = 0 \text{ for some } k \}. Within these spaces, the Jordan structures are compatible: the Jordan chains of A are mapped by B in a manner that respects the chain lengths and eigenvalue associations, enabling simultaneous upper triangularization with aligned blocks. This compatibility ensures that the Jordan decompositions interact consistently under commutation: if A = A_s + A_n is the Jordan decomposition of A into semisimple and nilpotent parts, then B commutes with both A_s and A_n.
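The invariance of eigenspaces under a commuting partner can be verified directly. A NumPy sketch with an illustrative pair, where B is taken to be a polynomial in A (which guarantees commutation):

```python
import numpy as np

# Illustrative matrix with eigenvalues 2 (defective) and 5.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
B = A @ A - 3.0 * A + np.eye(3)   # polynomials in A commute with A
assert np.allclose(A @ B, B @ A)

# v spans the eigenspace of A for eigenvalue 5.
v = np.array([0.0, 0.0, 1.0])
assert np.allclose(A @ v, 5.0 * v)

# B maps this eigenspace into itself: Bv is again an eigenvector of A
# for eigenvalue 5 (here Bv = (25 - 15 + 1) v = 11 v).
w = B @ v
print(np.allclose(A @ w, 5.0 * w))  # True
```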

Examples and Applications

Canonical Examples

One canonical class of commuting matrices consists of all diagonal matrices over the complex numbers. For any two n \times n diagonal matrices A = \diag(a_1, \dots, a_n) and B = \diag(b_1, \dots, b_n), the product is given by AB = \diag(a_1 b_1, \dots, a_n b_n) = BA, demonstrating commutativity directly from the absence of off-diagonal terms in the multiplication. This property holds because diagonal matrices share the standard basis vectors as eigenvectors, allowing simultaneous diagonalization in the same basis. In contrast, rotation matrices in three dimensions provide a fundamental example of non-commutativity. Consider the rotation matrices for 90-degree rotations around the x- and y-axes: R_x(90^\circ) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{pmatrix}, \quad R_y(90^\circ) = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ -1 & 0 & 0 \end{pmatrix}. Computing the products yields R_x(90^\circ) R_y(90^\circ) = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix} \neq \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & -1 \\ -1 & 0 & 0 \end{pmatrix} = R_y(90^\circ) R_x(90^\circ). This non-commutativity arises because the rotation group SO(3) is non-abelian, with the order of rotations affecting the final orientation. The Pauli matrices from quantum mechanics illustrate non-commutativity in a 2x2 context. Defined as \sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \quad \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, they satisfy the commutation relations [\sigma_i, \sigma_j] = 2i \epsilon_{ijk} \sigma_k, where \epsilon_{ijk} is the Levi-Civita symbol. Explicitly, \sigma_x \sigma_y = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix} = i \sigma_z = -\sigma_y \sigma_x, so [\sigma_x, \sigma_y] = 2i \sigma_z \neq 0. However, subsets such as \{\sigma_x, \sigma_x\} or \{I, \sigma_z\} (where I is the identity matrix) do commute. Companion matrices offer an example of a specific commuting family.
For a monic polynomial p(\lambda) = \lambda^n + c_{n-1} \lambda^{n-1} + \dots + c_0, the companion matrix C is the n \times n matrix with 1's on the superdiagonal, -c_0, \dots, -c_{n-1} in the last row, and zeros elsewhere. Any matrix that commutes with C is precisely a polynomial in C, i.e., q(C) for some polynomial q. Thus, the set \{I, C, C^2, \dots, C^{n-1}\} forms a commuting family, as powers of C satisfy C^k C^m = C^{k+m} = C^m C^k. For nilpotent matrices, commuting examples require specific structures. Consider two 2x2 Jordan blocks for eigenvalue 0, each J = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}. In a 4x4 block-diagonal form, N_1 = J \oplus J and N_2 = J \oplus J commute, as N_1 N_2 = (J^2 \oplus J^2) = N_2 N_1. However, if one matrix is a 3x3 block J_3 = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix} embedded in a larger matrix with a 1x1 zero block, and the other consists of two 2x2 blocks, they generally do not commute unless additional compatibility conditions relating the two Jordan partitions are satisfied. Such pairs are characterized by their Jordan types allowing simultaneous upper-triangularization.
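The commutation relations above can be verified numerically. A NumPy sketch reproducing the Pauli identity [\sigma_x, \sigma_y] = 2i\sigma_z and the non-commutativity of the 90-degree rotations:

```python
import numpy as np

# Pauli matrices.
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def comm(X, Y):
    return X @ Y - Y @ X

# Verify [sigma_x, sigma_y] = 2i sigma_z.
print(np.allclose(comm(sx, sy), 2j * sz))  # True

# I and sigma_z commute.
print(np.allclose(comm(I2, sz), 0))        # True

# 90-degree rotations about x and y do not commute (SO(3) is non-abelian).
Rx = np.array([[1, 0, 0], [0, 0, -1], [0, 1, 0]], dtype=float)
Ry = np.array([[0, 0, 1], [0, 1, 0], [-1, 0, 0]], dtype=float)
print(np.allclose(Rx @ Ry, Ry @ Rx))       # False
```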

Applications in Physics and Engineering

In quantum mechanics, commuting observables can be simultaneously measured because they share a common set of eigenstates. For instance, the position and momentum operators do not commute, leading to the Heisenberg uncertainty principle, which prohibits precise simultaneous measurement of both. In contrast, the total angular momentum operator L^2 and its z-component L_z commute, allowing states to be labeled by simultaneous eigenvalues of both, which simplifies the description of rotational symmetries in atomic and molecular systems. Representation theory provides a framework for understanding symmetries in physical systems, where irreducible representations of abelian groups yield one-dimensional matrices that inherently commute. This property is crucial in quantum mechanics for analyzing systems invariant under abelian symmetry groups, such as translations for free particles, enabling the construction of commuting operator algebras that preserve physical observables under group actions. In control theory, commuting matrices in linear time-invariant systems facilitate stability analysis by allowing simultaneous diagonalization, which transforms the system into decoupled modes for easier eigenvalue-based assessment of asymptotic behavior. For switched linear systems, commutation relations between subsystem matrices provide sufficient conditions for overall stability, reducing the burden of verifying Lyapunov functions across multiple modes. Signal processing leverages the commutativity of convolution operations, represented by circulant matrices, to enable efficient parallel computation of filtered signals in linear time-invariant systems. This property ensures that the order of applying multiple filters does not affect the output, optimizing algorithms for tasks like audio processing and image enhancement. Recent advancements in quantum computing exploit groups of commuting Pauli operators for gate design and Hamiltonian simulation, partitioning non-commuting terms into mutually commuting clusters that can be simultaneously diagonalized to minimize circuit depth and error rates.
For example, post-2020 methods group Pauli strings into commuting families to accelerate variational quantum algorithms and hardware-efficient compilation on noisy intermediate-scale quantum devices. As of 2025, further advances include leveraging commuting groups within Hamiltonians for efficient simulation circuits, reducing measurement overhead.
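The circulant-matrix commutativity underlying order-independent filtering can be demonstrated directly. A NumPy sketch with randomly generated illustrative filters:

```python
import numpy as np

def circulant(c):
    """Circulant matrix whose columns are the cyclic shifts of c."""
    c = np.asarray(c, dtype=float)
    n = len(c)
    return np.column_stack([np.roll(c, k) for k in range(n)])

rng = np.random.default_rng(2)
C1 = circulant(rng.standard_normal(5))  # one LTI (circular-convolution) filter
C2 = circulant(rng.standard_normal(5))  # another LTI filter

# Circulant matrices commute: applying the filters in either order gives
# the same result (both are diagonalized by the discrete Fourier transform).
print(np.allclose(C1 @ C2, C2 @ C1))    # True
```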

Historical Development

Early Foundations

The discovery of non-commutative multiplication in Hamilton's quaternions, introduced by William Rowan Hamilton in 1843, marked an early challenge to the commutative laws prevalent in classical algebra and motivated subsequent explorations into non-commutative structures like matrices. Quaternions extend the complex numbers to represent rotations in three-dimensional space, forming a division algebra where multiplication is associative but not commutative, such as ij = k while ji = -k. This non-commutativity highlighted the limitations of commutative fields and spurred interest in algebraic systems capable of modeling such behaviors, laying groundwork for matrix theory as a framework for linear transformations. In 1858, Arthur Cayley formalized the theory of matrices in his seminal paper "A Memoir on the Theory of Matrices," establishing matrices as independent algebraic objects with addition, multiplication, and powers, where multiplication is generally non-commutative. A key result stated by Cayley was the Cayley-Hamilton theorem, which asserts that every square matrix A satisfies its own characteristic equation \det(\lambda I - A) = 0, in the sense that p(A) = 0 for the characteristic polynomial p(\lambda); the first general proof was provided by Frobenius in 1878. This theorem links matrices to polynomial algebra, demonstrating that A commutes with any polynomial in itself, such as its powers A^k, and provided a precursor to studying commutativity within matrix algebras. James Joseph Sylvester, who coined the term "matrix" in 1850, advanced matrix theory in the 1880s through lectures and papers at Johns Hopkins University, introducing concepts like the adjugate and nullity that underpin the structure of matrix commutants. His work on the centralizer of a matrix—the set of all matrices commuting with a given one—emerged in this period, framing questions about the dimensionality and form of commutative subalgebras within the full matrix ring.
This built on Cayley's foundations, emphasizing how non-commutative matrix multiplication contrasts with commutative scalar algebras while identifying subspaces where commutativity holds. Ferdinand Frobenius contributed foundational results on commuting matrices in the late nineteenth century, culminating in his theorem that any finite set of commuting matrices can be simultaneously upper triangularized via a similarity transformation. This result, proved using induction on the dimension together with the existence of common eigenvectors, revealed that commuting families share compatible eigenspaces, distinguishing them from general non-commuting sets. Frobenius's work, alongside earlier efforts by Cayley and Sylvester, filled a critical pre-20th-century gap by shifting focus from isolated matrices to families, underscoring the interplay between commutativity and simultaneous canonical forms in non-commutative algebras.

Modern Extensions

In 1909, Issai Schur formalized aspects of the simultaneous triangularization of commuting families of matrices over the complex numbers, establishing that any such family can be simultaneously upper triangularized via a unitary similarity, extending the Schur decomposition for single matrices. This result relies on the existence of a common eigenvector and on Schur's lemma, which asserts that operators commuting with an irreducible representation are scalar multiples of the identity. For commuting normal matrices, simultaneous diagonalization via a unitary similarity is possible. Schur's framework provided a foundational tool for analyzing the structure of commuting operators in finite-dimensional spaces. During the 1920s, Hermann Weyl integrated these ideas into the representation theory of compact Lie groups, showing that irreducible unitary representations lead to families of matrices that can be simultaneously diagonalized, leveraging the complete reducibility of representations of compact groups. Weyl's approach, detailed in his papers on the representation theory of compact groups and in his later books on the classical groups, emphasized the role of commuting matrices in decomposing representations into simultaneous eigenspaces, bridging linear algebra with group theory. In the late twentieth century, numerical methods for computing centralizers—the sets of matrices commuting with a given matrix—advanced through extensions of LAPACK routines solving the associated Sylvester equation AX - XA = 0. These include the TRSYL routine for triangular cases and direct solvers such as the Bartels-Stewart algorithm, which reduce the problem via Schur decomposition and compute the null space efficiently, with LAPACK version 3.7 (2016) incorporating improved stability for large-scale computations. Such algorithms enable practical applications in control theory and optimization, achieving high accuracy for matrices up to order 1000 on modern hardware.
Commuting matrices play a key role in the structure of abelian subgroups of the general linear group GL(n, ℂ), where maximal abelian subalgebras correspond to simultaneously triangularizable families, aiding the classification of semisimple Lie algebras and their representations. Post-2000 developments highlight connections to quantum information theory, particularly in stabilizer codes for quantum error correction, where the stabilizer group is generated by a commuting set of Pauli matrices, allowing fault-tolerant quantum computation by encoding logical qubits in joint eigenspaces. For instance, the [[7,1,3]] Steane code uses six commuting Pauli operators to correct single-qubit errors, demonstrating the practical impact in fault-tolerant quantum systems.
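The stabilizer example can be checked by explicit computation. A NumPy sketch building the six standard Steane-code generators as 7-qubit Pauli strings (one conventional presentation, derived from the parity checks of the classical [7,4,3] Hamming code; the exact string ordering is a choice) and verifying that all pairs commute:

```python
import numpy as np
from functools import reduce

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli_string(ops):
    """Tensor product of single-qubit Paulis, e.g. 'IIIXXXX'."""
    table = {'I': I, 'X': X, 'Z': Z}
    return reduce(np.kron, [table[ch] for ch in ops])

# Six stabilizer generators of the [[7,1,3]] Steane code: three X-type
# and three Z-type strings built from the same Hamming parity checks.
generators = [pauli_string(s) for s in [
    'IIIXXXX', 'IXXIIXX', 'XIXIXIX',
    'IIIZZZZ', 'IZZIIZZ', 'ZIZIZIZ',
]]

# All 15 pairs commute, so the group they generate is abelian and the
# code space is a joint eigenspace of all six operators.
all_commute = all(
    np.allclose(g @ h, h @ g)
    for i, g in enumerate(generators)
    for h in generators[i + 1:]
)
print(all_commute)  # True
```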
