
Spectral graph theory

Spectral graph theory is the study of properties of graphs through the eigenvalues and eigenvectors of matrices associated with them, particularly the adjacency matrix and the Laplacian matrix. This field bridges linear algebra and graph theory, revealing insights into graph structure, such as connectivity and expansion, via spectral properties. Key concepts include the adjacency matrix A, where entries a_{i,j} = 1 if vertices i and j are adjacent and 0 otherwise, and the Laplacian matrix L = D - A, with D as the degree matrix containing vertex degrees on the diagonal. The eigenvalues of L, denoted \lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n with \lambda_1 = 0, provide measures of global graph properties; notably, the second smallest eigenvalue \lambda_2 (the Fiedler value) quantifies algebraic connectivity, being positive if and only if the graph is connected. The adjacency matrix's largest eigenvalue \mu_1 lies between the average and maximum degree, offering bounds on regularity and structure. Powers of the adjacency matrix count walks of the corresponding length between vertices, enabling applications such as counting edges and triangles via closed walks of length up to 3 with simple computations. The Laplacian's quadratic form x^T L x = \sum_{\{a,b\} \in E} w_{a,b} (x(a) - x(b))^2 (for weighted graphs) connects to cuts and expansion, with the edge expansion c_E(G) satisfying \frac{\lambda_2}{2} \leq c_E(G) \leq \sqrt{2 d \lambda_2}, where d is the maximum degree. These spectral tools extend to normalized variants of the Laplacian, which are useful in analyzing random walks and mixing times, where a larger spectral gap (larger smallest non-zero eigenvalue) indicates faster convergence.

Historically, spectral graph theory traces its roots to early 20th-century work on matrix eigenvalues, including Perron-Frobenius theory by Perron (1907) and Frobenius (1912), with foundational graph-specific developments by Fiedler in the 1970s on algebraic connectivity and nodal domains. Influential contributions include Hoffman's 1970 bounds on independent sets using spectral methods and Godsil and Gutman's 1981 analysis of matching polynomials. The field gained algorithmic prominence through surveys such as Alon's on spectral techniques in graph algorithms, emphasizing efficient eigenvalue computations for sparse matrices.

Applications span graph algorithms, including partitioning via the Fiedler vector (achieving near-optimal cuts in random graphs with high probability), coloring random graphs using the smallest adjacency eigenvalues, and detecting large cliques or independent sets. Spectral methods also underpin the analysis of random walks, clustering, and sparsification of graphs while preserving spectral properties. Further uses include expander construction, error-correcting codes, and solving linear systems over graph Laplacians with iterative solvers such as Chebyshev methods.

Graph Matrices and Spectra

Adjacency Matrix

The adjacency matrix A of an undirected simple graph G with vertex set V = \{v_1, \dots, v_n\} is the n \times n matrix with entries A_{ij} = 1 if v_i and v_j are adjacent and A_{ij} = 0 otherwise, including zeros on the diagonal since there are no loops. This matrix encodes the graph's edge structure and serves as the primary object in spectral graph theory, where its spectral properties reveal information about connectivity and walk counts. The eigenvalues of A, denoted \lambda_1 \geq \lambda_2 \geq \dots \geq \lambda_n, are the roots of the characteristic polynomial \det(\lambda I - A) = 0, and since A is real symmetric, all eigenvalues are real, with the trace of A (the sum of the eigenvalues) equal to zero due to the zero diagonal. These eigenvalues, often called the spectrum of the graph, provide bounds on graph parameters like the chromatic number and independence number.

For the complete graph K_n, the adjacency matrix is J - I where J is the all-ones matrix and I is the identity matrix, yielding eigenvalues n-1 with multiplicity 1 and -1 with multiplicity n-1. In contrast, the path graph P_n has eigenvalues 2 \cos\left( \frac{\pi j}{n+1} \right) for j = 1, \dots, n, each with multiplicity 1, reflecting its linear structure and a spectral radius bounded below 2. By the Perron-Frobenius theorem applied to the nonnegative irreducible adjacency matrix of a connected graph, the largest eigenvalue (the spectral radius) \lambda_1 is real, positive, and simple, with a corresponding strictly positive eigenvector, and it satisfies |\lambda_i| \leq \lambda_1 for all i. This eigenvalue equals the common degree in regular graphs and, in general, lies between the average degree and the maximum degree of a connected graph. The powers of the adjacency matrix count walks: the entry (A^k)_{ij} gives the number of walks of length k from v_i to v_j, and in particular, the trace \operatorname{tr}(A^k) equals the total number of closed walks of length k in the graph. For instance, \operatorname{tr}(A^2) = 2|E|, twice the number of edges.
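
As a quick illustration (not part of the original text), the following NumPy sketch checks the K_n and P_n spectra stated above and the walk-counting identity \operatorname{tr}(A^2) = 2|E| on small examples; the helper path_adjacency is an ad hoc construction for this snippet.

```python
# Illustrative sketch (NumPy assumed): adjacency spectra and walk counting.
import numpy as np

def path_adjacency(n):
    """Adjacency matrix of the path graph P_n (ad hoc helper)."""
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1
    return A

n = 5
A_Kn = np.ones((n, n)) - np.eye(n)                 # K_5 as J - I
A_Pn = path_adjacency(n)                           # P_5

print(np.round(np.linalg.eigvalsh(A_Kn), 6))       # -1 (x4) and 4
print(np.round(np.linalg.eigvalsh(A_Pn), 6))       # matches 2*cos(pi*j/6)
print(np.round(np.sort(2 * np.cos(np.pi * np.arange(1, n + 1) / (n + 1))), 6))

# (A^k)_{ij} counts walks of length k; tr(A^2) = 2|E|.
print(int(np.linalg.matrix_power(A_Pn, 3)[0, 1]), "walks of length 3 from v_1 to v_2")
print(int(np.trace(A_Pn @ A_Pn)), "= 2|E| =", 2 * (n - 1))
```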

Laplacian Matrix

The Laplacian matrix L of a graph G with vertex set V and edge set E is defined as L = D - A, where A is the adjacency matrix of G and D is the diagonal degree matrix with D_{ii} equal to the degree of vertex i. This construction ensures that L is symmetric and positive semi-definite for undirected graphs without self-loops. The eigenvalues of L, denoted \lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n where n = |V|, are all real and non-negative. For a connected graph, \lambda_1 = 0 is a simple eigenvalue with corresponding eigenvector the all-ones vector \mathbf{1}, reflecting the graph's overall connectivity; in general, the multiplicity of the zero eigenvalue equals the number of connected components. The second smallest eigenvalue \lambda_2, known as the algebraic connectivity (or Fiedler value), quantifies how well-connected the graph is, with larger values indicating stronger connectivity.

By the Courant-Fischer theorem applied to L, the eigenvalues satisfy \lambda_i = \min_{\dim S = i} \max_{\substack{x \in S \\ x \neq 0}} \frac{x^T L x}{x^T x} = \max_{\dim T = n-i+1} \min_{\substack{x \in T \\ x \neq 0}} \frac{x^T L x}{x^T x}, and this min-max characterization provides variational bounds useful for analyzing graph structure. The quadratic form associated with L is x^T L x = \sum_{\{u,v\} \in E} (x_u - x_v)^2 for x \in \mathbb{R}^n, which measures the total variation of x across edges and connects to graph cuts (it equals the cut size when evaluated at indicator vectors) and to effective resistance in electrical-network interpretations of the graph. For example, in the complete graph K_n, the Laplacian eigenvalues are 0 with multiplicity 1 and n with multiplicity n-1. In the cycle graph C_n, they are 2 - 2 \cos(2\pi k / n) for k = 0, 1, \dots, n-1.
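
The following NumPy sketch (illustrative, not from the source) checks the K_n and C_n Laplacian spectra quoted above and the edge-difference form of x^T L x.

```python
# Illustrative sketch (NumPy assumed): Laplacian spectra and the quadratic form.
import numpy as np

def laplacian(A):
    return np.diag(A.sum(axis=1)) - A

n = 6
A_Kn = np.ones((n, n)) - np.eye(n)
A_Cn = np.zeros((n, n))
for i in range(n):
    A_Cn[i, (i + 1) % n] = A_Cn[(i + 1) % n, i] = 1

print(np.round(np.linalg.eigvalsh(laplacian(A_Kn)), 6))   # 0 once, n = 6 five times
print(np.round(np.linalg.eigvalsh(laplacian(A_Cn)), 6))
print(np.round(np.sort(2 - 2 * np.cos(2 * np.pi * np.arange(n) / n)), 6))

# x^T L x equals the sum of squared differences across edges.
x = np.random.randn(n)
edges = [(i, (i + 1) % n) for i in range(n)]
assert np.isclose(x @ laplacian(A_Cn) @ x,
                  sum((x[u] - x[v]) ** 2 for u, v in edges))
```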

Other Graph Matrices

The normalized Laplacian matrix addresses limitations of the standard Laplacian for irregular graphs by incorporating degree normalization. Defined as \mathcal{L} = D^{-1/2} L D^{-1/2}, where L is the combinatorial Laplacian and D is the degree matrix, it is a symmetric matrix whose eigenvalues lie in the interval [0, 2]. The smallest eigenvalue is always 0, with multiplicity equal to the number of connected components of the graph. This normalization facilitates comparisons across graphs of varying densities and is particularly useful in applications like spectral partitioning and random walks, where it aligns with the transition dynamics of diffusion processes. For the star graph S_n on n vertices, the normalized Laplacian eigenvalues are 0 (simple), 1 (with multiplicity n-2), and 2 (simple), reflecting the graph's high centrality at the hub vertex.

The signless Laplacian matrix Q = D + A, where A is the adjacency matrix, offers an alternative to the standard Laplacian by avoiding the subtraction of A. Its eigenvalues are all non-negative. For a connected graph, the smallest eigenvalue is 0 with multiplicity 1 if the graph is bipartite, and positive otherwise. In d-regular graphs, the largest eigenvalue equals 2d, providing a direct measure of regularity. This matrix has proven valuable in studying connectivity and expansion properties, often yielding tighter bounds than the Laplacian in certain extremal problems.

The incidence matrix B of a graph, with rows corresponding to vertices and columns to oriented edges (entries \pm 1 for the two endpoints of each edge and 0 otherwise), relates directly to the Laplacian via L = B B^T. The singular values of B thus encode the Laplacian's eigenvalues: the eigenvalues of L are the squares of the singular values of B, with the zero eigenvalue corresponding to vectors in the kernel of B^T. This connection is foundational for analyzing cut spaces and cycle spaces in algebraic graph theory.

The Seidel matrix S = J - I - 2A, where J is the all-ones matrix and I the identity, serves as a signed variant of the adjacency matrix, particularly for studying switching operations on graphs. In the context of strongly regular graphs, the eigenvalues of S are integers and determine parameters such as the eigenvalues of the underlying graph, enabling classifications via spectral data. Its spectrum, often consisting of three distinct values, aids in distinguishing non-isomorphic strongly regular graphs with the same parameters.
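
A small NumPy sketch (illustrative; the variable names are ad hoc) constructs these matrices for the star K_{1,4} and checks the stated normalized-Laplacian spectrum and the identity L = B B^T.

```python
# Illustrative sketch (NumPy assumed): normalized Laplacian, signless Laplacian,
# oriented incidence matrix, and Seidel matrix of the star K_{1,4}.
import numpy as np

n = 5
A = np.zeros((n, n))
for v in range(1, n):                      # hub 0 joined to leaves 1..4
    A[0, v] = A[v, 0] = 1

deg = A.sum(axis=1)
D = np.diag(deg)
L = D - A
L_norm = np.diag(deg ** -0.5) @ L @ np.diag(deg ** -0.5)   # normalized Laplacian
Q = D + A                                                   # signless Laplacian
S = np.ones((n, n)) - np.eye(n) - 2 * A                     # Seidel matrix

edges = [(i, j) for i in range(n) for j in range(i + 1, n) if A[i, j]]
B = np.zeros((n, len(edges)))                               # oriented incidence matrix
for k, (i, j) in enumerate(edges):
    B[i, k], B[j, k] = 1.0, -1.0

print(np.round(np.linalg.eigvalsh(L_norm), 6))   # 0, 1 (x3), 2 for the star on 5 vertices
assert np.allclose(L, B @ B.T)                    # L = B B^T
print(np.round(np.linalg.eigvalsh(Q), 6), np.round(np.linalg.eigvalsh(S), 6))
```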

Basic Spectral Properties

Eigenvalues and Multiplicities

In spectral graph theory, the spectrum of a graph is defined as the multiset of its eigenvalues, typically those of the adjacency matrix or the Laplacian matrix. For the adjacency matrix A of an undirected simple graph with n vertices, the eigenvalues \lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n satisfy \sum_{i=1}^n \lambda_i = \operatorname{trace}(A) = 0, since A has zeros on its diagonal. The eigenvalues are balanced around zero in certain cases, such as bipartite graphs, where the spectrum is symmetric about the origin: if \lambda is an eigenvalue with multiplicity k, then so is -\lambda. The multiplicity of an eigenvalue provides graph-theoretic insight, particularly for the value 0: for the adjacency matrix its multiplicity is the nullity of A, while for the Laplacian L = D - A, where D is the degree matrix, the multiplicity of the eigenvalue 0 is exactly the number of connected components of the graph. These multiplicities highlight how spectral properties encode structural features like bipartiteness and connectivity.

Since the adjacency matrix A and Laplacian L are real symmetric, all eigenvalues are real, and the corresponding eigenvectors can be chosen to form an orthonormal basis for \mathbb{R}^n. Bounds on the eigenvalues can be obtained using the Gershgorin circle theorem: each eigenvalue of A lies in at least one disk centered at 0 (the diagonal entries) with radius equal to the corresponding row sum, i.e., the degree of that vertex, so all eigenvalues satisfy |\lambda| \leq \Delta, where \Delta is the maximum degree.

In regular graphs, multiplicities of specific eigenvalues often relate to combinatorial parameters. For a d-regular graph, the smallest adjacency eigenvalue \theta_{\min} \leq -1 constrains the independence number \alpha via the Hoffman ratio bound \alpha \leq n \frac{-\theta_{\min}}{d - \theta_{\min}}, and when an eigenvalue such as -1 has positive multiplicity this further constrains \alpha relative to n and d. For example, in strongly regular graphs and in graphs arising from association schemes, such as the derangement graph, these spectral bounds help determine the independence number, yielding \alpha = (n-1)! in the derangement case.
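
A small NumPy check (illustrative only) of the trace identity, bipartite spectral symmetry, the Gershgorin degree bound, and the component count, using a disjoint union of a 4-cycle and a triangle:

```python
# Illustrative sketch (NumPy assumed): trace zero, bipartite symmetry,
# Gershgorin degree bound, and Laplacian zero-multiplicity = component count.
import numpy as np

# Disjoint union of a 4-cycle (vertices 0-3, bipartite) and a triangle (4-6).
A = np.zeros((7, 7))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0), (4, 5), (5, 6), (6, 4)]:
    A[u, v] = A[v, u] = 1

eig_A = np.linalg.eigvalsh(A)
print(round(eig_A.sum(), 9))                             # trace(A) = 0

eig_C4 = np.linalg.eigvalsh(A[:4, :4])                   # bipartite component
assert np.allclose(np.sort(eig_C4), np.sort(-eig_C4))    # symmetric about 0

assert np.max(np.abs(eig_A)) <= A.sum(axis=1).max() + 1e-9   # |lambda| <= max degree

L = np.diag(A.sum(axis=1)) - A
print(int(np.sum(np.abs(np.linalg.eigvalsh(L)) < 1e-9)), "connected components")
```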

Spectral Radius

The spectral radius of a graph G, denoted \rho(G), is the largest eigenvalue of its adjacency matrix A. For a connected graph G on n vertices with m edges, the Collatz-Sinogowitz bound states that \rho(A) \geq 2m/n, with equality if and only if G is regular. This lower bound highlights the combinatorial significance of the spectral radius in measuring deviation from regularity, as the difference \rho(A) - 2m/n quantifies irregularity. An important upper bound is \rho(A) \leq \Delta, where \Delta is the maximum degree of G, with equality for a connected graph if and only if G is \Delta-regular. For connected k-regular graphs, \rho(A) = k exactly, and this eigenvalue has multiplicity one, corresponding to the all-ones eigenvector. For the Laplacian matrix L = D - A, where D is the degree matrix, the largest eigenvalue \mu_1(L) satisfies \mu_1(L) \leq 2\Delta, with equality for a connected graph if and only if G is \Delta-regular and bipartite. This bound relates the spectral radius of the Laplacian to the maximum degree, providing insight into the graph's degree structure. A representative example is the Petersen graph, which is connected and 3-regular and thus has \rho(A) = 3.
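
A quick numerical check (illustrative, on an arbitrary small tree) of the sandwich 2m/n \leq \rho(A) \leq \Delta:

```python
# Illustrative sketch (NumPy assumed): average degree <= spectral radius <= max degree.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (1, 4)]    # a small irregular tree on 5 vertices
n = 5
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1

rho = np.linalg.eigvalsh(A).max()
avg_deg, max_deg = 2 * len(edges) / n, A.sum(axis=1).max()
print(round(avg_deg, 4), "<=", round(rho, 4), "<=", int(max_deg))
assert avg_deg <= rho <= max_deg
```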

Eigenvalue Interlacing

Eigenvalue interlacing is a fundamental tool in spectral graph theory, arising from the Cauchy interlacing theorem applied to principal submatrices of the symmetric matrices associated with graphs. For the adjacency matrix A of a graph G with n vertices and eigenvalues \lambda_1(G) \geq \lambda_2(G) \geq \cdots \geq \lambda_n(G), consider an induced subgraph H on m < n vertices. The adjacency matrix of H is a principal submatrix of A, so its eigenvalues \lambda_1(H) \geq \cdots \geq \lambda_m(H) satisfy \lambda_i(G) \geq \lambda_i(H) \geq \lambda_{i + n - m}(G) for i = 1, \dots, m. This interlacing property implies that the spectrum of H is "sandwiched" between portions of the spectrum of G, providing insight into how subgraph structure affects spectral features.

A key application of the theorem is in analyzing the effect of vertex deletion. When a single vertex is removed to form an induced subgraph H = G - v, the multiplicity of any eigenvalue changes by at most 1, and more precisely, each eigenvalue of H lies between consecutive eigenvalues of G. This property is instrumental in proving results on spectral determination of graphs, where matching spectra between graphs imply structural similarities through consistent interlacing in their subgraphs. Similar interlacing phenomena hold for the Laplacian matrix L = D - A, where D is the degree matrix. For induced subgraphs, the Laplacian eigenvalues of H interlace those of G in an analogous manner, though the exact form accounts for degree changes. Notably, adding an edge to a graph does not decrease any Laplacian eigenvalue, with the second-smallest eigenvalue \lambda_2 (algebraic connectivity) strictly increasing if the addition improves connectivity.

Illustrative examples appear in path graphs P_n, whose adjacency eigenvalues are 2 \cos \left( \frac{\pi k}{n+1} \right) for k = 1, \dots, n. An induced subpath P_m with m < n exhibits strict interlacing, as its eigenvalues fit precisely between those of P_n, demonstrating the theorem's tightness. Hoffman's bound leverages interlacing to constrain graph parameters via eigenvalues: for a k-regular graph G, the independence number \alpha(G) satisfies \alpha(G) \leq n \frac{ -\lambda_{\min} }{k - \lambda_{\min} }, where \lambda_{\min} is the smallest adjacency eigenvalue; this follows from interlacing arguments applied to a maximum independent set, which induces a subgraph with all-zero adjacency matrix.
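
The following NumPy sketch (illustrative) verifies Cauchy interlacing numerically for the induced subpath P_4 inside P_6:

```python
# Illustrative sketch (NumPy assumed): Cauchy interlacing for an induced subgraph,
# lambda_i(G) >= lambda_i(H) >= lambda_{i+n-m}(G), with G = P_6 and H = P_4.
import numpy as np

def path_adjacency(n):
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1
    return A

n, m = 6, 4
G = path_adjacency(n)
H = G[:m, :m]                                   # induced subgraph on vertices 0..3

lam_G = np.sort(np.linalg.eigvalsh(G))[::-1]    # descending order
lam_H = np.sort(np.linalg.eigvalsh(H))[::-1]

for i in range(m):
    assert lam_G[i] + 1e-9 >= lam_H[i] >= lam_G[i + n - m] - 1e-9
    print(round(lam_G[i], 4), ">=", round(lam_H[i], 4), ">=", round(lam_G[i + n - m], 4))
```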

Cospectral Graphs

Definitions and Examples

In spectral graph theory, two graphs are said to be cospectral with respect to a fixed matrix (typically the adjacency matrix) if they are non-isomorphic but share the same multiset of eigenvalues, known as the spectrum. This property highlights the non-uniqueness of spectral information for graph isomorphism, as the spectrum alone does not always determine the graph's structure up to isomorphism. The term "cospectral" is conventionally used in the context of graphs, whereas "isospectral" is a broader concept applied to general operators or geometric structures, such as Riemannian manifolds, that possess identical spectra for associated operators.

A classic example of cospectral graphs is the pair consisting of the 4×4 rook's graph and the Shrikhande graph, both of which are strongly regular with parameters (16, 6, 2, 2). These non-isomorphic graphs share the adjacency spectrum {6¹, 2⁶, (−2)⁹}, where superscripts denote multiplicities. Another notable example is the first known pair of cospectral trees, presented by Collatz and Sinogowitz in 1957, which demonstrated that even trees—acyclic connected graphs—are not always uniquely determined by their spectra. In contrast, certain graph families are determined by their spectra, meaning no non-isomorphic graph shares their eigenvalues. For instance, the complete graph K_n has the adjacency spectrum \{n-1\}^1 \cup \{-1\}^{n-1} and is uniquely identifiable by it.
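
The rook's-graph/Shrikhande pair can be checked numerically; the sketch below (illustrative, assuming NumPy and the standard Cayley-graph constructions of both graphs on \mathbb{Z}_4 \times \mathbb{Z}_4) confirms the shared spectrum and distinguishes the graphs by local structure: a vertex's neighborhood contains two triangles in the rook's graph but induces a triangle-free 6-cycle in the Shrikhande graph.

```python
# Illustrative sketch (NumPy assumed): the 4x4 rook's graph vs. the Shrikhande
# graph, both built as Cayley graphs on Z_4 x Z_4 (a standard construction).
import numpy as np
from itertools import product

def cayley_z4xz4(connection_set):
    verts = list(product(range(4), repeat=2))
    idx = {v: i for i, v in enumerate(verts)}
    A = np.zeros((16, 16))
    for a, b in verts:
        for da, db in connection_set:
            A[idx[(a, b)], idx[((a + da) % 4, (b + db) % 4)]] = 1
    return A

rook = cayley_z4xz4([(i, 0) for i in range(1, 4)] + [(0, j) for j in range(1, 4)])
shrikhande = cayley_z4xz4([(1, 0), (3, 0), (0, 1), (0, 3), (1, 1), (3, 3)])

spec = lambda A: tuple(np.round(np.sort(np.linalg.eigvalsh(A)), 6))
assert spec(rook) == spec(shrikhande)              # cospectral: 6, 2 (x6), -2 (x9)

def neighborhood_triangles(A, v=0):
    nbrs = np.flatnonzero(A[v])
    sub = A[np.ix_(nbrs, nbrs)]
    return int(round(np.trace(np.linalg.matrix_power(sub, 3)) / 6))

print(neighborhood_triangles(rook), neighborhood_triangles(shrikhande))  # 2 vs 0
```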

Graphs Determined by Their Spectrum

In spectral graph theory, a graph is said to be determined by its spectrum, or DS, if no non-isomorphic graph shares the same multiset of eigenvalues for its adjacency matrix (or Laplacian matrix, depending on the context). This property highlights cases where the spectrum uniquely identifies the graph's isomorphism class, in contrast to cospectral graphs that share spectra but differ structurally. The spectrum of a graph provides partial structural information: the number of vertices equals the dimension of the associated matrix, the number of edges is derivable from the sum of the squares of the adjacency eigenvalues (the trace of A^2), and, for the Laplacian spectrum, the number of connected components equals the multiplicity of the eigenvalue 0. Connectedness is not in general deducible from the adjacency spectrum alone, since a connected graph can be cospectral with a disconnected one.

Several well-known graph classes are DS. Complete graphs K_n are DS for the adjacency spectrum, as their spectrum consists of n-1 (multiplicity 1) and -1 (multiplicity n-1). Paths P_n are DS, with distinct spectra reflecting their linear structure. Stars K_{1,n} have the adjacency spectrum \sqrt{n}, -\sqrt{n}, and 0 with multiplicity n-1, but they, and complete bipartite graphs K_{m,n} more generally, need not be DS for the adjacency spectrum, since K_{m,n} is cospectral with K_{m',n'} together with isolated vertices whenever m'n' = mn; the star K_{1,4}, for instance, is cospectral with C_4 plus an isolated vertex. Cycles C_n are DS with respect to the Laplacian spectrum, whose eigenvalues are 2 - 2\cos(2\pi k/n) for k = 0, \dots, n-1. The Petersen graph serves as a prominent example of a DS graph for the adjacency spectrum, with eigenvalues 3 (multiplicity 1), 1 (multiplicity 5), and -2 (multiplicity 4), uniquely distinguishing it from non-isomorphic graphs.

Certain theorems establish broader classes of DS graphs. Many distance-regular graphs are DS, particularly those with parameters ensuring no cospectral mates exist, such as odd graphs and certain strongly regular graphs. Schwenk's theorem reveals that trees are DS only in exceptional cases such as paths: almost all trees possess non-isomorphic cospectral tree mates constructed via limb switching, rendering them non-DS.
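
As a small check (illustrative), the Petersen spectrum quoted above can be reproduced by building the graph as the Kneser graph K(5,2):

```python
# Illustrative sketch (NumPy assumed): Petersen graph as the Kneser graph K(5,2)
# (vertices are 2-subsets of {0,...,4}, adjacent when disjoint).
import numpy as np
from itertools import combinations

verts = list(combinations(range(5), 2))
A = np.array([[1.0 if set(u).isdisjoint(v) else 0.0 for v in verts] for u in verts])

vals, counts = np.unique(np.round(np.linalg.eigvalsh(A), 6), return_counts=True)
print(dict(zip(vals.tolist(), counts.tolist())))   # {-2.0: 4, 1.0: 5, 3.0: 1}
```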

Constructions of Cospectral Graphs

One prominent method for constructing cospectral graphs involves switching techniques, particularly the Godsil-McKay switching procedure, which preserves the spectrum of the adjacency matrix while altering the graph's structure. This approach relies on identifying a suitable partition of the vertex set into cells C_1, \dots, C_t, D such that \{C_1, \dots, C_t\} is equitable for the subgraph it spans and each vertex of D is adjacent to none, half, or all of the vertices in each C_i; switching then complements the edges between each C_i and those vertices of D adjacent to exactly half of C_i. The resulting graph remains cospectral to the original because the operation corresponds to a similarity transformation (conjugation by an orthogonal matrix) of the adjacency matrix, which leaves eigenvalues unchanged.

Another construction leverages the cone operation, which adds a universal vertex adjacent to all vertices of a base graph. If two non-isomorphic graphs G and H are cospectral and also have cospectral complements, their cones—formed by joining each with K_1—are again cospectral, as the characteristic polynomial of the cone is determined by spectral data that the two base graphs share. This method is useful for generating families of cospectral graphs from known bases, preserving non-isomorphism.

Constructions based on integral graphs, where all adjacency eigenvalues are integers, often utilize equitable partitions to generate cospectral mates. An equitable partition divides the vertices into cells where each vertex in one cell has the same number of neighbors in every other cell, allowing the spectrum to be derived from the eigenvalues of the quotient matrix together with contributions from the cells themselves. For integral graphs, such partitions facilitate switching or quotient-equivalent modifications that yield non-isomorphic mates with identical integer spectra, as the integer eigenvalues ensure matching multiplicities without fractional discrepancies. Examples include cubic integral graphs on 8 vertices with spectrum \pm 3, (\pm 1)^3, where equitable refinements produce cospectral variants.

Computational tools like the nauty software package enable systematic discovery of cospectral mates for small graphs by generating all non-isomorphic graphs up to a given order, computing their spectra, and identifying matches via canonical labeling and eigenvalue comparison. nauty, developed for automorphism group computation and graph isomorphism testing, has been used to enumerate cospectral pairs up to 10 vertices, revealing patterns in non-uniqueness at small orders, where manual verification confirms non-isomorphism. This approach complements theoretical constructions by providing exhaustive checks for low orders. Recent research continues to develop new methods for constructing cospectral graphs, including further switching variants and extensions to other associated matrices.

A concrete illustration arises with graphs on 8 vertices, where switching yields numerous pairs of non-isomorphic cospectral graphs, each pair sharing the same adjacency spectrum while differing in connectivity or cycle structure. Such pairs demonstrate the method's efficacy in small cases, with spectral preservation verified by direct eigenvalue computation.
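
In the same exhaustive spirit (though without nauty), the following sketch brute-forces all graphs on 5 vertices, groups them by adjacency spectrum, and detects spectra shared by non-isomorphic graphs; among its findings is the classic pair K_{1,4} and C_4 plus an isolated vertex. This is an illustrative toy search, not the enumeration procedure described above.

```python
# Illustrative sketch (NumPy + itertools): brute-force search for cospectral,
# non-isomorphic graphs on 5 vertices.
import numpy as np
from itertools import combinations, permutations

n = 5
vertex_pairs = list(combinations(range(n), 2))

def adjacency(mask):
    A = np.zeros((n, n), dtype=int)
    for bit, (i, j) in enumerate(vertex_pairs):
        if mask >> bit & 1:
            A[i, j] = A[j, i] = 1
    return A

def canonical(A):
    """Brute-force canonical form: lexicographically smallest permuted adjacency."""
    return min(tuple(A[np.ix_(p, p)].ravel()) for p in permutations(range(n)))

by_spectrum = {}
for mask in range(1 << len(vertex_pairs)):          # all 2^10 labelled graphs
    A = adjacency(mask)
    key = tuple(np.round(np.linalg.eigvalsh(A), 5))
    by_spectrum.setdefault(key, set()).add(canonical(A))

shared = {k: v for k, v in by_spectrum.items() if len(v) > 1}
for spectrum, classes in shared.items():
    print(len(classes), "non-isomorphic graphs share spectrum", spectrum)
```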

Key Inequalities

Cheeger Inequality

The Cheeger constant, also known as the edge expansion or isoperimetric number, of an undirected graph G = (V, E) is defined as h(G) = \min_{S \subseteq V, \, 0 < |S| \leq |V|/2} \frac{|E(S, S^c)|}{|S|}, where E(S, S^c) denotes the set of edges with one endpoint in S and the other in its complement S^c = V \setminus S. For regular graphs of degree d, this measures the minimum edge boundary relative to subset size, capturing the graph's expansion properties. For irregular graphs, a volume-adjusted variant uses h(G) = \min_S |E(S, S^c)| / (\min(|S|, |S^c|) \, d_{\min}), where d_{\min} is the minimum degree, providing a lower bound on the conductance by approximating volumes with d_{\min} |S| \leq \mathrm{vol}(S).

In spectral graph theory, the Cheeger inequality relates h(G) to the second smallest eigenvalue \lambda_2 of the graph Laplacian L = D - A, where A is the adjacency matrix and D is the degree matrix. For d-regular graphs, the inequality states \frac{\lambda_2}{2} \leq h(G) \leq \sqrt{2 d \lambda_2}, with \lambda_2 being the algebraic connectivity or Fiedler value. This bidirectional bound links combinatorial expansion to spectral properties: a small \lambda_2 implies poor expansion (existence of a sparse cut), while a large \lambda_2 certifies good expansion. For the normalized Laplacian \mathcal{L} = I - D^{-1/2} A D^{-1/2}, the eigenvalues are scaled between 0 and 2, and the inequality adapts accordingly, with the same form holding approximately for irregular graphs using the volume-based h(G).

The lower bound h(G) \geq \lambda_2 / 2 follows from the Rayleigh quotient characterization \lambda_2 = \min_{f \perp \mathbf{1}} R(f), where R(f) = \sum_{uv \in E} (f(u) - f(v))^2 / \sum_v f(v)^2. Let S be the minimizing set for h(G), \alpha = |S|/|V| \leq 1/2, and g = \chi_S - \alpha \mathbf{1}. Then g \perp \mathbf{1}, g^T L g = |E(S, S^c)|, and g^T g = |V| \alpha (1 - \alpha) \geq |S|/2. Thus R(g) = |E(S, S^c)| / g^T g \leq 2 |E(S, S^c)| / |S| = 2 h(G), so \lambda_2 \leq 2 h(G). The upper bound h(G) \leq \sqrt{2 d \lambda_2} is more involved: the standard proof sweeps over level sets of the Fiedler vector (the eigenvector corresponding to \lambda_2) and uses coarea-like arguments or randomized rounding to exhibit a set S whose boundary is small relative to \sqrt{\lambda_2}, which is the source of the square-root dependence. A related spectral tool is the expander mixing lemma, which bounds edge discrepancies across subsets: for disjoint S, T \subseteq V of a d-regular graph, |E(S, T) - \frac{d |S| |T|}{|V|}| \leq \lambda \sqrt{|S| |T| \left(1 - \frac{|S|}{|V|}\right) \left(1 - \frac{|T|}{|V|}\right)}, where \lambda is the largest absolute value of a nontrivial adjacency eigenvalue (and \lambda_2 = d - \mu_2 for adjacency eigenvalues \mu_i). The square-root bottleneck in the upper bound highlights a limitation in using spectra for exact cut-finding: the sweep cut approximates the optimal expansion only up to a factor of order \sqrt{d / h(G)}.

A representative example is the n-dimensional hypercube Q_n, which is n-regular with 2^n vertices. Here, h(Q_n) = 1 (achieved by taking S as an (n-1)-dimensional subcube, with |E(S, S^c)| = 2^{n-1} and |S| = 2^{n-1}), and \lambda_2 = 2, making the lower bound tight: 2/2 = 1 = h(Q_n). This illustrates how the inequality certifies expander-like behavior in hypercubes despite their simple structure.
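
The hypercube example can be verified directly; the sketch below (illustrative, NumPy assumed) brute-forces h(Q_3) and compares it with both sides of the Cheeger inequality.

```python
# Illustrative sketch (NumPy assumed): edge expansion of the 3-cube Q_3 versus
# the Cheeger bounds lambda_2 / 2 <= h(G) <= sqrt(2 * d * lambda_2).
import numpy as np
from itertools import combinations

n_bits = 3
N = 2 ** n_bits                         # 8 vertices, 3-regular
A = np.zeros((N, N))
for u in range(N):
    for b in range(n_bits):
        A[u, u ^ (1 << b)] = 1          # hypercube edges flip one bit

L = np.diag(A.sum(axis=1)) - A
lam2 = np.sort(np.linalg.eigvalsh(L))[1]
d = int(A.sum(axis=1).max())

def expansion(S):
    S = set(S)
    boundary = sum(1 for u in S for v in range(N) if A[u, v] and v not in S)
    return boundary / len(S)

h = min(expansion(S) for k in range(1, N // 2 + 1)
        for S in combinations(range(N), k))

print(lam2 / 2, "<=", h, "<=", round(np.sqrt(2 * d * lam2), 4))   # 1.0 <= 1.0 <= 3.4641
```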

Hoffman-Delsarte Inequality

The Hoffman-Delsarte inequality provides a spectral bound on the independence number of regular graphs. For a d-regular graph G on n vertices with adjacency eigenvalues d = \mu_1 \geq \mu_2 \geq \cdots \geq \mu_n, the size of the largest independent set \alpha(G) satisfies \alpha(G) \leq n \frac{-\mu_n}{d - \mu_n}. This bound leverages the smallest eigenvalue \mu_n to limit the maximum independent set, with equality in certain strongly regular graphs.

This result is particularly powerful for graphs arising from association schemes or strongly regular graphs, where the spectrum can be computed explicitly from combinatorial parameters. In strongly regular graphs, defined by parameters (n, k, \lambda, \mu) with n vertices, degree k, and intersection numbers \lambda and \mu, the eigenvalues consist of k with multiplicity 1 and two additional distinct eigenvalues \theta > \tau given by the roots of x^2 - (\lambda - \mu)x + (\mu - k) = 0, yielding \theta, \tau = \frac{(\lambda - \mu) \pm \sqrt{(\lambda - \mu)^2 + 4(k - \mu)}}{2}, with multiplicities f = \frac{1}{2} \left( (n-1) - \frac{2k + (n-1)(\lambda - \mu)}{\sqrt{(\lambda - \mu)^2 + 4(k - \mu)}} \right) for \theta and g = n - 1 - f for \tau, which must be nonnegative integers; unless f = g (the conference-graph case), this integrality condition forces \theta and \tau to be integers. The framework generalizes to association schemes, where the Bose-Mesner algebra—spanned by the adjacency matrices A_i of the relations—admits a basis of primitive orthogonal idempotents E_j with eigenvalues given by the entries P_j(i) of the eigenvalue matrix P. The eigenvalues of each A_i are thus \theta_j^{(i)} = P_j(i), and the Hoffman-Delsarte bound can be applied using the extremal eigenvalues. The proof of the bound proceeds by evaluating the quadratic form of the positive semidefinite matrix A - \mu_n I (equivalently, a Rayleigh quotient) at a shifted characteristic vector of an independent set, which yields the stated constraint.

A representative example is the Petersen graph, a strongly regular graph with parameters (10, 3, 0, 1). Its eigenvalues are 3 (multiplicity 1), 1 (multiplicity 5), and -2 (multiplicity 4). Applying the Hoffman-Delsarte bound with \mu_n = -2, d = 3, n = 10 gives \alpha(G) \leq 10 \cdot 2 / (3 + 2) = 4, and indeed the Petersen graph has independence number 4.
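
A quick check of the bound on the Petersen graph (illustrative; the graph is built as the Kneser graph K(5,2) and the independence number is confirmed by brute force):

```python
# Illustrative sketch (NumPy assumed): Hoffman-Delsarte ratio bound for the
# Petersen graph, compared with its exact independence number.
import numpy as np
from itertools import combinations

verts = list(combinations(range(5), 2))            # Petersen as Kneser graph K(5,2)
A = np.array([[1.0 if set(u).isdisjoint(v) else 0.0 for v in verts] for u in verts])

eig = np.sort(np.linalg.eigvalsh(A))
n, d, mu_min = len(verts), int(round(eig[-1])), eig[0]
print("Hoffman bound:", n * (-mu_min) / (d - mu_min))          # 10 * 2 / 5 = 4.0

def independent(S):
    return all(A[u, v] == 0 for u, v in combinations(S, 2))

alpha = max(len(S) for k in range(1, n + 1)
            for S in combinations(range(n), k) if independent(S))
print("independence number:", alpha)                            # 4: the bound is tight
```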

Applications

Graph Partitioning

Spectral partitioning leverages the eigenvectors of the graph Laplacian to identify balanced cuts that minimize inter-cluster connections while respecting cluster sizes. A foundational approach uses the Fiedler vector, the eigenvector corresponding to the second-smallest eigenvalue \lambda_2 of the unnormalized Laplacian L = D - A, where D is the degree matrix and A is the adjacency matrix. This vector provides a natural ordering of vertices that reveals the graph's connectivity structure, enabling effective bipartitioning. To bipartition the graph, one common method sorts the vertices by their values in the Fiedler vector and applies a sweep cut, selecting the partition that minimizes the cut size relative to the cluster volumes. Alternatively, vertices can be divided based on the sign of the Fiedler vector components, grouping positive and negative entries into separate sets, though sweep cuts often yield better balance. These techniques approximate solutions to NP-hard partitioning problems by relaxing discrete objectives into continuous eigenvector computations.

The ratio cut objective formalizes balanced partitioning by minimizing the cut size normalized by the product of the sizes of the partitions: \text{RatioCut}(S, S^c) = \frac{|E(S, S^c)|}{|S| \cdot |S^c|}, where E(S, S^c) denotes the edges crossing the cut. Spectral methods approximate this via a continuous relaxation of the Laplacian quadratic form, using the Fiedler vector to obtain a near-optimal cut. Similarly, the normalized cut criterion, \text{Ncut}(S, S^c) = \frac{|E(S, S^c)|}{\text{vol}(S)} + \frac{|E(S, S^c)|}{\text{vol}(S^c)}, where \text{vol}(S) is the sum of degrees in S, addresses volume balance and is relaxed using the normalized Laplacian's eigenvectors, improving segmentation in graphs with heterogeneous degrees.

Spectral embeddings can enhance local search heuristics like the Kernighan-Lin algorithm, which iteratively swaps boundary vertices to reduce cut size. By initializing with a spectral partition—projecting vertices into a low-dimensional embedding via Laplacian eigenvectors—the heuristic converges faster to high-quality solutions, combining global spectral insights with local refinements. Multilevel partitioning extends spectral methods by recursively coarsening the graph through edge contraction, computing a spectral partition on the coarse graph, and then refining upward via projections and local improvements like Kernighan-Lin. This hierarchical approach scales to large graphs while preserving partition quality. For visualization and clustering, spectral embeddings project graphs into 2D using the first two non-trivial Laplacian eigenvectors, in the spirit of force-directed layouts such as Kamada-Kawai, which minimize a stress function; spectral coordinates can better preserve global structure.
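
The sketch below (illustrative; the two-clique test graph is arbitrary) shows sign-based Fiedler bipartitioning recovering the obvious cut of a graph made of two 5-cliques joined by one bridge edge.

```python
# Illustrative sketch (NumPy assumed): Fiedler-vector bipartition of two
# 5-cliques joined by a single bridge edge.
import numpy as np

n = 10
A = np.zeros((n, n))
for block in (range(5), range(5, 10)):              # two copies of K_5
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1
A[4, 5] = A[5, 4] = 1                               # bridge edge

L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]                             # eigenvector for lambda_2

side = fiedler < 0                                  # sign-based split
cut = int(sum(A[u, v] for u in np.flatnonzero(side) for v in np.flatnonzero(~side)))
print("lambda_2 =", round(eigvals[1], 4))
print("clusters:", np.flatnonzero(side), np.flatnonzero(~side), "cut size:", cut)  # cut = 1
```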

Expander Graphs

Expander graphs are families of d-regular graphs G_n on n vertices, where n grows without bound, such that the Cheeger constant satisfies h(G_n) \geq c > 0 for some fixed c independent of n. The Cheeger constant h(G) quantifies the graph's expansion by measuring the minimum relative size of the edge boundary of vertex subsets, and it relates to the spectral gap via the Cheeger inequality. In spectral terms, expander graphs are characterized by a uniform spectral gap: for the adjacency matrix of a d-regular graph, the second-largest eigenvalue \lambda_2 satisfies \lambda_2 \leq d - \gamma for some fixed \gamma > 0 independent of n (equivalently, the smallest nonzero Laplacian eigenvalue is bounded away from 0). A fundamental limitation on this gap is given by the Alon-Boppana bound, which states that in any family of d-regular graphs on n vertices with n \to \infty, \liminf_{n \to \infty} \lambda_2(G_n) \geq 2\sqrt{d-1}. Graphs achieving the optimal bound \lambda_2 \leq 2\sqrt{d-1} (with all other nontrivial eigenvalues bounded similarly in absolute value) are known as Ramanujan graphs and represent the best possible spectral expanders.

Explicit constructions of Ramanujan graphs were first provided by Lubotzky, Phillips, and Sarnak, who built infinite families of such graphs as Cayley graphs of the groups \mathrm{PSL}_2(\mathbb{F}_q) over finite fields, for degree d = p + 1 with p a prime congruent to 1 modulo 4. Another influential construction technique is the zig-zag product, introduced by Reingold, Vadhan, and Wigderson, which combines a base expander with a smaller expander to iteratively produce explicit constant-degree expander families with strong spectral expansion guarantees. Expander graphs find key applications in error-correcting codes, where their expansion properties enable the construction of codes with constant rate and relative distance, such as expander codes that admit efficient (linear-time) decoding. In pseudorandomness, they serve as building blocks for pseudorandom generators, which fool efficient distinguishers, and for randomness extractors, which produce nearly uniform outputs from weak random sources.
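
As a small numerical illustration (not from the source), the Petersen graph is 3-regular and satisfies the Ramanujan condition, since its nontrivial eigenvalues 1 and -2 are bounded in absolute value by 2\sqrt{d-1} \approx 2.83:

```python
# Illustrative sketch (NumPy assumed): Ramanujan condition for the Petersen graph.
import numpy as np
from itertools import combinations

verts = list(combinations(range(5), 2))             # Petersen as Kneser graph K(5,2)
A = np.array([[1.0 if set(u).isdisjoint(v) else 0.0 for v in verts] for u in verts])

eig = np.sort(np.linalg.eigvalsh(A))[::-1]          # 3, 1 (x5), -2 (x4)
d = int(round(eig[0]))
nontrivial = np.max(np.abs(eig[1:]))
print(nontrivial, "<=", round(2 * np.sqrt(d - 1), 4), "->", bool(nontrivial <= 2 * np.sqrt(d - 1)))
```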

Historical Development

Early Foundations

The foundations of spectral graph theory were laid in the mid-20th century, building on earlier matrix-based analyses of graphs. While the ideas appeared implicitly in earlier work, the properties of graph eigenvalues began to be systematically explored in the 1950s and 1960s by pioneers including Lothar Collatz and Ulrich Sinogowitz, who investigated eigenvalues of the adjacency matrix in relation to structural invariants such as degree regularity. A landmark contribution came in 1957 with the paper "Spektren endlicher Grafen" by Collatz and Sinogowitz, which introduced the spectrum of a graph as the multiset of eigenvalues of its adjacency matrix and provided the first examples of cospectral graphs—non-isomorphic graphs with identical spectra. Their work also examined how the largest eigenvalue relates to average degree and graph irregularity, establishing early limits on spectral determination of graph isomorphism.

In the late 1960s, Dragoš Cvetković's doctoral research further developed these ideas, focusing on adjacency eigenvalues and their role in counting walks of various lengths in a graph. His thesis and subsequent publications introduced the notion of main eigenvalues—those possessing an eigenvector not orthogonal to the all-ones vector—and linked the spectrum to the graph's connectivity and the growth rate of the number of walks, providing tools for analyzing non-regular graphs. In the early 1970s, Miroslav Fiedler developed the theory of Laplacian eigenvalues, introducing the algebraic connectivity as the second smallest eigenvalue, which quantifies a graph's connectivity and aids in partitioning. Early applications of spectral techniques also appeared in the 1970s, notably in coding theory via algebraic structures related to distance-regular graphs and association schemes. The era culminated in the 1980 monograph Spectra of Graphs by Cvetković, Michael Doob, and Horst Sachs, which compiled and expanded upon the foundational results, including theorems on cospectrality and eigenvalue bounds, solidifying spectral graph theory as a distinct branch of algebraic graph theory.

Modern Developments

In the 1980s, the discrete analogue of Cheeger's inequality marked a pivotal advancement, linking the second smallest eigenvalue of the Laplacian (the spectral gap) to a graph's expansion. Proved in the graph setting by Alon and Milman in 1985, this inequality states that the expansion h(G) satisfies \frac{\lambda_2}{2} \leq h(G) \leq \sqrt{2 d \lambda_2}, where d is the maximum degree, providing a bridge between spectral properties and cut-based metrics for both regular and irregular graphs. This result spurred applications in algorithm design and theoretical bounds on mixing times. Concurrently, Lubotzky, Phillips, and Sarnak constructed explicit families of Ramanujan graphs in 1988, whose nontrivial eigenvalues are bounded by 2\sqrt{d-1} for d-regular graphs, matching the Alon-Boppana bound and yielding the strongest known explicit expanders, with applications in areas such as parallel computation and communication networks.

The 1990s and 2000s saw spectral graph theory extend into interdisciplinary applications, notably computer vision and quantum computing. In computer vision, Shi and Malik's 2000 normalized cuts algorithm partitions graphs by minimizing the normalized cut criterion using the eigenvectors of the normalized Laplacian L_{norm} = I - D^{-1/2} A D^{-1/2}, where A is the affinity (adjacency) matrix and D the degree matrix; this handles non-regular graphs effectively, as demonstrated in image segmentation tasks where it outperforms traditional k-means by capturing global structure via low-frequency eigenvectors. Normalized spectra thus became central to modern applications, addressing limitations of unnormalized approaches in irregular networks. Parallel to this, quantum walks on graphs gained traction in the early 2000s as quantum counterparts to classical random walks, with their evolution governed by the graph's adjacency or Laplacian spectrum; the framework of Aharonov et al. (2001) generalized random walks on finite graphs to the quantum setting, and quantum walks were subsequently shown to yield quadratic speedups for certain search problems, with spectral gaps central to the analysis.

In the 2010s, spectral techniques advanced property testing and numerical methods through higher eigenvalues and sparsification. Agreement testing, a generalization of consistency checks, utilizes the spectral expansion of high-dimensional complexes to verify local agreements in distributed proofs; Dinur, Evra, Kaufman, and Kazhdan proved in 2017 that expansion in high dimensions implies robust agreement testing, enabling efficient testing of global properties with local queries. Complementing this, Spielman and Teng's spectral sparsifiers from 2008 (refined in later work) construct sparse graphs H such that (1-\epsilon) L_G \preceq L_H \preceq (1+\epsilon) L_G in the Loewner order for the Laplacians L_G and L_H, preserving all Laplacian quadratic forms with O(n \log n / \epsilon^2) edges; this facilitates faster solvers for Laplacian systems and optimization on massive graphs.

Despite these advances, key open problems persist. The complete classification of distance-regular graphs and of broader families determined by their spectrum (DS graphs)—those uniquely identified by their eigenvalues—remains unresolved, with known results limited to specific classes such as certain distance-regular and strongly regular graphs despite decades of study. Similarly, constructing optimal bounded-degree high-dimensional expanders with the best possible local spectral gaps across all dimensions is an active challenge, as current explicit constructions fall short of conjectured bounds for applications in property testing and sampling.
