The Hamburger moment problem is a classical question in mathematical analysis that asks whether a given infinite sequence of real numbers (m_n)_{n=0}^\infty can be represented as the moments of a positive Borel measure \mu supported on the real line \mathbb{R}, meaning m_n = \int_{-\infty}^{\infty} x^n \, d\mu(x) for each nonnegative integer n. Named after the German mathematician Hans Ludwig Hamburger, who formulated and solved it in a series of papers published between 1920 and 1921, the problem extends the earlier Stieltjes moment problem by allowing the support of the measure to be unbounded in both directions rather than restricted to the nonnegative reals.[1][2][3]

A necessary and sufficient condition for the existence of such a representing measure is that the sequence (m_n) is positive definite, which means that every finite Hankel matrix H_k = (m_{i+j})_{0 \leq i,j \leq k} is positive semidefinite for all k \geq 0. This condition ensures the existence of at least one measure \mu with the prescribed moments, and it can be checked through the positivity of quadratic forms associated with polynomials. When a solution exists, the problem is termed determinate if the measure \mu is unique and indeterminate otherwise, in which case there are infinitely many distinct measures sharing the same moments. Uniqueness is guaranteed, for example, by Carleman's criterion: if \sum_{n=1}^\infty m_{2n}^{-1/(2n)} = \infty, then the solution is determinate.[1][4]

The Hamburger moment problem has profound connections to the theory of orthogonal polynomials, continued fractions, and linear functionals on polynomial rings, with applications in approximation theory, probability distributions, and spectral analysis of operators.
Hamburger's original work relied on the theory of orthogonal polynomials and continued fraction expansions to characterize solutions, paving the way for later developments by mathematicians like Akhiezer and Krein in studying indeterminate cases and their associated families of measures. Indeterminate problems are particularly rich, often yielding families of measures that can be described through the Nevanlinna parametrization involving Pick functions, built from entire functions associated with the orthogonal polynomials of the first and second kind.[3][2][1]
Introduction
Definition
The Hamburger moment problem is the question of determining whether a given sequence of real numbers \{m_n\}_{n=0}^\infty corresponds to the moments of some positive Borel measure \mu on the real line \mathbb{R}, meaning that m_n = \int_{-\infty}^\infty x^n \, d\mu(x) for all n \geq 0, where the total mass \mu(\mathbb{R}) = m_0 > 0 ensures the measure is non-degenerate.[5] This formulation extends the earlier Stieltjes moment problem to the entire real line, allowing support for \mu anywhere in \mathbb{R}.

The moments m_n can be interpreted as the expectations \mathbb{E}[X^n] under the measure \mu, if \mu is viewed as the probability distribution of a random variable X (after suitable normalization).[6] The core challenge lies in recovering the underlying measure \mu—or ascertaining its existence—from these power moments alone, without additional information about the support or form of \mu.[7]

This problem is foundational in probability theory, where it addresses when a distribution is uniquely identifiable from its moments, and in analysis, serving as a tool for studying quadrature formulas, orthogonal polynomials, and approximation theory on unbounded domains. Frequently, the sequence is normalized so that m_0 = 1, restricting attention to probability measures and simplifying the interpretation of moments as standardized expectations.[5]
Historical Background
The moment problem traces its origins to the late 19th century, emerging as part of efforts to connect sequences of moments with integral representations against positive measures. In 1894, Thomas Jan Stieltjes formulated and solved the problem for measures supported on the non-negative real line [0, ∞), establishing its equivalence to the positive semidefiniteness of associated matrices through his pioneering work on continued fractions. This laid the groundwork for broader investigations into moment sequences and their distributions.

In 1920, Hans Ludwig Hamburger extended Stieltjes' framework to the entire real line ℝ, addressing both the determinate and indeterminate cases by deriving necessary and sufficient conditions for the existence of representing measures based on the positive semidefiniteness of Hankel matrices.[8] His seminal paper provided criteria for solvability and explored the structure of solutions, marking a key advancement in the theory. For historical reasons, the real-line version of the problem bears Hamburger's name, distinguishing it from related variants.[9]

Shortly thereafter, in 1921, Felix Hausdorff resolved the moment problem for bounded intervals like [0, 1], introducing concepts such as completely monotonic sequences to characterize valid moment sequences on compact supports.[10] Subsequent milestones in the 1940s included contributions from Mark Krein and others, who developed canonical representations for solutions in the indeterminate case and criteria like Krein's condition for assessing uniqueness and the multiplicity of measures.[11] These advancements, building on earlier operator-theoretic approaches by Naimark from 1940–1943, deepened the understanding of non-unique solutions.[12]

The Hamburger moment problem has profoundly influenced functional analysis and the theory of orthogonal polynomials, providing foundational tools for studying self-adjoint operators, spectral theory, and indeterminate extensions in Hilbert spaces.[9]
Formulation
Statement of the Problem
The Hamburger moment problem addresses the question of recovering a positive Borel measure on the real line \mathbb{R} from its sequence of power moments. Given a sequence \{m_n\}_{n=0}^\infty of real numbers with m_0 > 0, the problem seeks to determine the existence of a positive Borel measure \mu on \mathbb{R} such that

m_n = \int_{\mathbb{R}} x^n \, d\mu(x)

for all n \geq 0. The measures \mu are taken to be positive Radon measures, which are locally finite and inner regular, and may have support extending over the whole real line \mathbb{R}.

This setup assumes familiarity with integration with respect to Borel measures on \mathbb{R}, where the moments m_n arise as expectations under \mu. No normalization to a probability measure (i.e., m_0 = 1) is required in the general formulation; instead, m_0 > 0 represents the total mass of \mu, allowing for unnormalized measures.

The primary goal extends beyond mere existence to the full characterization of the set of all measures \mu compatible with the given sequence \{m_n\}, identifying conditions under which the moments uniquely determine \mu or permit a family of solutions. For instance, the moments of the standard Gaussian measure, defined by m_n = \int_{\mathbb{R}} x^n \frac{e^{-x^2/2}}{\sqrt{2\pi}} \, dx, form a determinate sequence corresponding to a unique measure.
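As a quick numerical illustration (a sketch, not part of the source), the Gaussian moments can be checked against the closed form m_n = (n-1)!! for even n and m_n = 0 for odd n; the grid, helper names, and tolerance below are illustrative choices:

```python
import numpy as np

def gaussian_moment(n, grid=np.linspace(-12.0, 12.0, 200001)):
    """Approximate m_n = ∫ x^n e^{-x^2/2}/√(2π) dx by a fine Riemann sum."""
    density = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)
    return float(np.sum(grid**n * density) * (grid[1] - grid[0]))

def double_factorial(k):
    """k!! with the convention (-1)!! = 1."""
    return 1 if k <= 0 else k * double_factorial(k - 2)

# Even moments match (n-1)!!; odd moments vanish by symmetry.
for n in range(8):
    exact = 0.0 if n % 2 else float(double_factorial(n - 1))
    assert abs(gaussian_moment(n) - exact) < 1e-6
```

The truncation at |x| = 12 is harmless here because the Gaussian tails beyond that point are negligible for the moment orders tested.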
Moment Sequences and Hankel Matrices
A Hamburger moment sequence is a sequence \{m_n\}_{n=0}^\infty of real numbers that arises as the moments of some positive Borel measure \mu on \mathbb{R}, meaning m_n = \int_{\mathbb{R}} x^n \, d\mu(x) for each nonnegative integer n. Such sequences encode the distributional properties of \mu through its power moments, providing an algebraic representation of the measure without specifying \mu explicitly.[8]

The primary algebraic tool for analyzing these sequences is the family of Hankel matrices, where the n-th Hankel matrix H_n is the (n+1) \times (n+1) symmetric matrix with entries (H_n)_{i,j} = m_{i+j} for i,j = 0, 1, \dots, n. These matrices capture the bilinear structure of the moments, as the quadratic form associated with H_n corresponds to \sum_{i,j=0}^n m_{i+j} \xi_i \xi_j = \int_{\mathbb{R}} \left( \sum_{k=0}^n \xi_k x^k \right)^2 \, d\mu(x) \geq 0 for any real coefficients \{\xi_k\}, reflecting the positivity of the measure. For \{m_n\} to qualify as a moment sequence, a necessary condition is that all principal minors of each H_n are nonnegative, ensuring the matrices are positive semidefinite (with full details on sufficiency provided in subsequent sections).

In the infinite-dimensional setting, the full Hankel matrix (m_{i+j})_{i,j=0}^\infty defines a symmetric bilinear form on the space of finitely supported sequences, preserving the algebraic positivity condition across all finite truncations. This operator-theoretic perspective, while rooted in Hilbert space theory, remains fundamentally algebraic, as it relies on the moment entries to generate consistent quadratic forms. An illustrative example is the Dirac measure \mu = \delta_0 at the origin, yielding m_0 = 1 and m_n = 0 for n > 0; here, H_n is a diagonal matrix with 1 in the (0,0) entry and zeros elsewhere, which is positive semidefinite but singular for n \geq 1, highlighting boundary cases in the positivity requirements.
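The quadratic-form identity \xi^T H_n \xi = \int (\sum_k \xi_k x^k)^2 \, d\mu can be verified directly for a discrete measure, where the integral reduces to a finite sum; the two-atom measure and coefficient vector below are arbitrary illustrative choices:

```python
import numpy as np

def hankel(moments, n):
    """(n+1) x (n+1) Hankel matrix H_n with entries m_{i+j}."""
    return np.array([[moments[i + j] for j in range(n + 1)] for i in range(n + 1)])

# Discrete measure with atoms at ±1 of mass 1/2 (moments: 1, 0, 1, 0, ...).
atoms, masses = np.array([-1.0, 1.0]), np.array([0.5, 0.5])
moments = [float(np.sum(masses * atoms**k)) for k in range(7)]

H3 = hankel(moments, 3)
xi = np.array([1.0, -2.0, 0.5, 3.0])          # coefficients of a test polynomial
quad_form = float(xi @ H3 @ xi)
# np.polyval expects highest-degree-first coefficients, hence xi[::-1].
integral = float(np.sum(masses * np.polyval(xi[::-1], atoms) ** 2))
assert abs(quad_form - integral) < 1e-12      # ξᵀ H₃ ξ = ∫ p(x)² dμ(x)
```

For this symmetric measure the quadratic form simplifies to (\xi_0+\xi_2)^2 + (\xi_1+\xi_3)^2, which evaluates to 3.25 for the chosen coefficients.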
Existence Conditions
Positive Semidefiniteness of Hankel Matrices
A sequence \{m_n\}_{n=0}^\infty admits a representing measure in the Hamburger moment problem if and only if the associated Hankel matrices H_n = (m_{i+j})_{i,j=0}^n are positive semidefinite for all n \in \mathbb{N}_0, meaning all eigenvalues of H_n are nonnegative or, equivalently, all principal minors of H_n are nonnegative.[2]

The necessity of this condition follows directly from the existence of a positive measure \mu: for any vector \mathbf{v} \in \mathbb{R}^{n+1}, the quadratic form \mathbf{v}^T H_n \mathbf{v} = \int_{\mathbb{R}} \left( \sum_{k=0}^n v_k x^k \right)^2 \, d\mu(x) \geq 0, with equality if the polynomial \sum v_k x^k vanishes \mu-almost everywhere.[2] The sufficiency relies on constructing a linear functional \Lambda on the space of polynomials via \Lambda\left( \sum a_k x^k \right) = \sum a_k m_k; the positive semidefiniteness of the Hankel matrices guarantees \Lambda(p^2) \geq 0 for all polynomials p, and this positive functional can then be represented by a positive Borel measure on \mathbb{R} (unique only in the determinate case).[5]

In the infinite-dimensional setting, if all H_n are positive definite (full rank), then any representing measure must have infinite support. This follows from Carathéodory's theorem applied to the truncated problems, which bounds the minimal number of support points of a representing measure for the moments up to degree 2n by the rank of H_n; since \operatorname{rank}(H_n) = n+1 for all n under positive definiteness, no finite-support measure can reproduce the entire infinite sequence. If some H_n is singular but still positive semidefinite, a finite-support measure may exist.[5]

Practically, verifying the condition involves checking that the Hankel matrices H_n are positive semidefinite, for example by computing their eigenvalues (all \geq 0) or all principal minors (\geq 0) for successively larger n; a necessary condition is that the leading principal minors (Hankel determinants) \Delta_n = \det H_n \geq 0.
Numerical stability can be an issue for large n due to ill-conditioning of Hankel matrices, but this serves as a feasible test for candidate sequences.[13]

A counterexample illustrating failure occurs with the sequence m_0 = 1, m_1 = 0, m_2 = -1: the Hankel matrix H_1 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} has \det H_1 = -1 < 0, violating positive semidefiniteness and precluding any representing measure, as no positive \mu can yield a negative second moment.[5]
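A small routine (an illustrative sketch, with an ad hoc round-off tolerance) implements the eigenvalue test for every Hankel matrix a finite moment list can fill, and reproduces the counterexample above:

```python
import numpy as np

def is_psd_hankel(moments):
    """Check positive semidefiniteness of every Hankel matrix the
    moment list can fill (H_n needs moments up to index 2n)."""
    for n in range((len(moments) + 1) // 2):
        H = np.array([[moments[i + j] for j in range(n + 1)]
                      for i in range(n + 1)])
        if np.linalg.eigvalsh(H).min() < -1e-10:   # tolerance for round-off
            return False
    return True

assert is_psd_hankel([1.0, 0.0, 1.0, 0.0, 3.0])    # Gaussian moments m_0..m_4
assert not is_psd_hankel([1.0, 0.0, -1.0])         # negative second moment
```

`eigvalsh` exploits the symmetry of H_n; for large n a Cholesky attempt or exact rational minors would sidestep the ill-conditioning mentioned above.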
Hamburger's Theorem on Existence
Hamburger's theorem, published in 1920, establishes the necessary and sufficient condition for the existence of a representing measure in the Hamburger moment problem. Specifically, a real sequence (s_n)_{n=0}^\infty admits a positive Borel measure \mu on \mathbb{R} such that s_n = \int_{\mathbb{R}} x^n \, d\mu(x) for all n \geq 0 if and only if the Hankel matrices H_k = (s_{i+j})_{i,j=0}^k are positive semidefinite for every finite k \in \mathbb{N}_0.[8]

The proof of existence relies on the positive semidefiniteness condition, which induces a positive linear functional L_s on the space of polynomials \mathbb{R}[x], defined by L_s(p) = \sum_{n=0}^d a_n s_n for p(x) = \sum_{n=0}^d a_n x^n. This functional satisfies L_s(p^2) \geq 0 for all polynomials p. To construct the measure, the functional is extended to a positive linear functional on the continuous functions over the one-point compactification \hat{\mathbb{R}} = \mathbb{R} \cup \{\infty\}, working with normalized quotients such as p(x)/(1+x^2)^k, which extend continuously to \hat{\mathbb{R}}. By the Riesz representation theorem applied to this compact space, there exists a unique positive regular Borel measure \tilde{\mu} on \hat{\mathbb{R}} representing the extended functional, and restricting \tilde{\mu} to \mathbb{R} yields the desired measure \mu with the given moments.
An alternative construction defines consistent finite-dimensional distributions on \mathbb{R}^m for each m, using the moments to specify probabilities on cylinder sets via multivariate Gauss quadratures or direct integration against polynomials, then applies Kolmogorov's extension theorem to obtain a probability measure on the infinite product space \mathbb{R}^\mathbb{N}, whose marginal on the first coordinate provides \mu.

While the theorem ensures at least one such measure exists, it does not guarantee uniqueness, as multiple measures may share the same moments in the indeterminate case.

An extension to signed measures, known as the signed Hamburger moment problem, relaxes the positivity to allow finite signed Borel measures on \mathbb{R}, where existence follows if the sequence is such that the associated quadratic forms are bounded, though the primary focus remains the positive case.

Hamburger's key innovation was adapting techniques from the Stieltjes moment problem, originally confined to non-negative support, to the unbounded real line by leveraging the theory of positive semidefinite quadratic forms and their connection to measures via integral representations.[8]
Uniqueness and Determinacy
Criteria for Uniqueness
The Hamburger moment problem is determinate if exactly one representing measure μ exists on the real line ℝ, and indeterminate if infinitely many such measures exist.[14]

A sufficient condition for determinacy is that the moments grow no faster than exponentially, for example, if ∫ e^{t|x|} dμ(x) < ∞ for some t > 0.[14] This condition implies that the moment generating function has a positive radius of convergence, ensuring the uniqueness of the measure through analytic properties.[14]

Carleman's criterion, adapted to the Hamburger setting from the Stieltjes case, states that if \sum_{n=1}^\infty m_{2n}^{-1/(2n)} = \infty, where m_{2n} are the even moments, then the moment problem is determinate.[5] This divergence condition on the reciprocal roots of the moments guarantees a unique representing measure by relating to the quasi-analyticity of associated function classes.[15]

The orthogonal polynomials associated with the moment sequence also characterize uniqueness: the problem is determinate if and only if the series \sum_{n=0}^\infty |p_n(z)|^2 / \|p_n\|^2 diverges for some (equivalently, every) non-real z.[14]

A classic example of an indeterminate Hamburger moment problem is the log-normal distribution with density \frac{1}{x \sqrt{2\pi}} \exp\left( -\frac{(\ln x)^2}{2} \right) for x > 0, whose moments m_n = \exp(n^2 / 2) admit multiple representing measures.[15]
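Carleman's criterion can be probed numerically by comparing partial sums for the Gaussian moments (where the series diverges) with the log-normal moments m_n = e^{n^2/2} (where the terms are e^{-n} and the series converges to 1/(e-1)). The sketch below works with logarithms of the even moments to avoid overflow; the helper names are illustrative:

```python
import math

def carleman_partial_sum(log_even_moment, N):
    """Partial sum Σ_{n=1}^{N} m_{2n}^{-1/(2n)}, taking log m_{2n} as
    input so that rapidly growing moments do not overflow floats."""
    return sum(math.exp(-log_even_moment(n) / (2 * n)) for n in range(1, N + 1))

# Gaussian: log m_{2n} = log (2n-1)!!; terms ≈ (e/2n)^{1/2}, series diverges.
gauss = lambda n: sum(math.log(k) for k in range(1, 2 * n, 2))
# Log-normal (m_k = e^{k^2/2}): log m_{2n} = 2n^2; terms are e^{-n}, converges.
lognorm = lambda n: 2.0 * n * n

assert carleman_partial_sum(gauss, 200) > carleman_partial_sum(gauss, 50) + 1
assert carleman_partial_sum(lognorm, 200) < 1 / (math.e - 1) + 1e-9
```

Convergence of the series, as for the log-normal case, does not by itself prove indeterminacy; Carleman's criterion is sufficient for determinacy but not necessary.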
Indeterminate Case and Multiple Measures
In the indeterminate case of the Hamburger moment problem, a given moment sequence admits infinitely many representing measures, all sharing the same moments s_n = \int_{-\infty}^{\infty} x^n \, d\mu(x) for n = 0, 1, 2, \dots, rather than a unique solution.[5] These measures form a compact convex set in the weak* topology, and their supports can differ significantly, potentially being discrete, absolutely continuous, or singular with respect to Lebesgue measure.[16] In the indeterminate case, the polynomials are dense in L^2(\mu) precisely for the N-extremal representing measures.

The family of all such measures can be parameterized using the Krein-Naimark (or Nevanlinna) approach, which employs Pick functions, that is, analytic functions \phi in the upper half-plane with positive imaginary part, augmented by the constant \infty.[16] For a fixed indeterminate sequence, the Stieltjes transform of the parameterized measure \nu_\phi is given by

\int \frac{d\nu_\phi(t)}{t - z} = -\frac{A(z) \phi(z) + C(z)}{B(z) \phi(z) + D(z)},

where (A(z), B(z), C(z), D(z)) forms the Nevanlinna matrix with determinant 1, derived from the moment sequence via continued fractions or orthogonal polynomials.[16] This one-parameter family exhaustively describes the solutions, often visualized through continued fraction expansions that converge differently for distinct \phi.[3]

Classic examples include the log-normal distribution, with density f(x) = \frac{1}{\sqrt{2\pi} x} \exp\left( -\frac{(\ln x)^2}{2} \right) for x > 0, whose moments s_n = \exp\left( \frac{n^2}{2} \right) yield an indeterminate problem.[5] Perturbations of this, such as d\mu_c(x) = [1 + c \sin(2\pi \ln x)] f(x) \, dx for |c| \leq 1, produce multiple distributions with identical moments, including discrete and singular components.[5] Another instance involves densities like e^{-|x|^\alpha} for 0 < \alpha < 1, which also lead to non-uniqueness.[5]

The lack of unique reconstruction has significant implications,
particularly in quantum mechanics, where indeterminate Hamburger problems describe the spectral measures of position operators in certain deformed oscillator algebras, allowing multiple possible spectra consistent with the same moments.[17] To resolve this ambiguity in applications, canonical measures are often selected, such as those arising from the Friedrichs and Krein self-adjoint extensions of the underlying Jacobi operator; these distinguished solutions are N-extremal and have purely discrete support.[5] These choices provide standardized representatives for further analysis, like in spectral theory or approximation problems.[16]
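The moment identity behind the perturbed log-normal family can be checked numerically: substituting t = ln x, the sine term integrates against x^n f(x) to \sin(2\pi n)\, e^{n^2/2} e^{-2\pi^2}, which vanishes for integer n. The following sketch (grid size and tolerances are illustrative choices) confirms the moments are independent of c:

```python
import numpy as np

def perturbed_lognormal_moment(n, c, t=np.linspace(-15.0, 20.0, 400001)):
    """m_n of dμ_c(x) = [1 + c·sin(2π ln x)] f(x) dx, computed after the
    substitution t = ln x against the log-normal density f."""
    integrand = (np.exp(n * t - t * t / 2) / np.sqrt(2 * np.pi)
                 * (1 + c * np.sin(2 * np.pi * t)))
    return float(np.sum(integrand) * (t[1] - t[0]))

for n in range(5):
    base = perturbed_lognormal_moment(n, 0.0)          # = exp(n^2 / 2)
    assert abs(base - np.exp(n * n / 2)) < 1e-5 * max(base, 1.0)
    for c in (-1.0, 0.5, 1.0):                         # same moments for every c
        assert abs(perturbed_lognormal_moment(n, c) - base) < 1e-5 * max(base, 1.0)
```

The densities 1 + c\sin(2\pi t) are genuinely different for different c, yet every power moment agrees, which is exactly the non-uniqueness phenomenon described above.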
Orthogonal Polynomials Connection
Generation of Orthogonal Polynomials
In the context of the Hamburger moment problem, a sequence of orthogonal polynomials \{p_n(x)\}_{n=0}^\infty is defined with respect to a positive measure \mu on the real line \mathbb{R} via the inner product \langle f, g \rangle = \int_{-\infty}^\infty f(x) g(x) \, d\mu(x), where the polynomials satisfy \langle p_m, p_n \rangle = 0 for m \neq n and \langle p_n, p_n \rangle > 0 for each n. When the polynomials are dense in L^2(\mu), as in the determinate case, they form an orthogonal basis for that space; in all cases they obey a three-term recurrence relation of the form x p_n(x) = a_n p_{n+1}(x) + b_n p_n(x) + a_{n-1} p_{n-1}(x), with a_n > 0 and b_n \in \mathbb{R}.[18]

The polynomials are generated by applying the Gram-Schmidt orthogonalization process to the sequence of monomials \{1, x, x^2, \dots \}. Specifically, p_0(x) = 1, and for n \geq 0, p_{n+1}(x) is obtained by orthogonalizing x p_n(x) against the subspace spanned by \{p_0(x), \dots, p_n(x)\}, ensuring each p_n(x) has degree n and a positive leading coefficient.[18][19]

Often, monic orthogonal polynomials \{\pi_n(x)\}_{n=0}^\infty are considered, where each \pi_n(x) has leading coefficient 1. The squared norm of the monic polynomial is given by \|\pi_n\|^2 = \frac{\det H_n}{\det H_{n-1}}, where H_n = (\mu_{i+j})_{i,j=0}^n is the (n+1) \times (n+1) Hankel matrix associated with the moment sequence \{\mu_k\}_{k=0}^\infty, with the convention \det H_{-1} = 1. This relation arises from the minimum property of the integral \int (p(x))^2 \, d\mu(x) over monic polynomials of degree n.[19]

In the Hamburger setting, these polynomials are orthogonal with respect to a measure supported on the entire real line \mathbb{R}, distinguishing them from cases like the Hermite polynomials, which are specifically orthogonal with respect to the Gaussian measure but serve as a prototypical example.
For the standard normal distribution d\mu(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \, dx, the monic orthogonal polynomials are precisely the probabilists' Hermite polynomials \mathrm{He}_n(x), satisfying the recurrence \mathrm{He}_{n+1}(x) = x \mathrm{He}_n(x) - n \mathrm{He}_{n-1}(x) with \mathrm{He}_0(x) = 1 and \mathrm{He}_1(x) = x.[18][19]
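The Gram-Schmidt construction can be carried out in coefficient space using only the moment sequence, through the pairing \langle x^i, x^j \rangle = m_{i+j}; the sketch below (function names are illustrative) recovers the probabilists' Hermite polynomials from the Gaussian moments:

```python
import numpy as np

def monic_ops(moments, N):
    """Monic orthogonal polynomials π_0..π_N from a moment list via
    modified Gram-Schmidt in coefficient space, ⟨x^i, x^j⟩ = m_{i+j}.
    Returns coefficient arrays, lowest degree first."""
    def inner(p, q):
        return sum(p[i] * q[j] * moments[i + j]
                   for i in range(len(p)) for j in range(len(q)))
    polys = []
    for n in range(N + 1):
        p = np.zeros(n + 1)
        p[n] = 1.0                                  # start from the monomial x^n
        for q in polys:                             # subtract projections
            p[:len(q)] -= inner(p, q) / inner(q, q) * q
        polys.append(p)
    return polys

# Gaussian moments m_k = (k-1)!! for even k, 0 for odd k, up to m_8.
m = [1, 0, 1, 0, 3, 0, 15, 0, 105]
P = monic_ops(m, 4)
# He_2 = x^2 - 1, He_3 = x^3 - 3x, He_4 = x^4 - 6x^2 + 3.
assert np.allclose(P[2], [-1, 0, 1])
assert np.allclose(P[3], [0, -3, 0, 1])
assert np.allclose(P[4], [3, 0, -6, 0, 1])
```

For a moment list up to m_{2N}, all inner products needed for π_0..π_N are available; with ill-conditioned Hankel matrices an exact-arithmetic variant would be preferable.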
Role in Solving the Problem
Orthogonal polynomials play a pivotal role in resolving the Hamburger moment problem by providing tools to recover the underlying measure \mu from the moment sequence \{s_n\}. A key mechanism is the Christoffel-Darboux formula for the reproducing kernel K_n(x, y) = \sum_{k=0}^n \frac{p_k(x) p_k(y)}{h_k}, where p_k are the monic orthogonal polynomials and h_k = \int p_k^2 \, d\mu are the squared norms. This kernel facilitates moment recovery, as the measure can be reconstructed in the limit as n \to \infty through the spectral decomposition, with the formula given by

K_n(x, y) = \frac{p_{n+1}(x) p_n(y) - p_n(x) p_{n+1}(y)}{h_n (x - y)},

enabling the approximation of integrals \int f(x) \, d\mu(x) for polynomials f of degree at most 2n+1.[7]

The asymptotic behavior of the orthogonal polynomials provides a diagnostic for uniqueness. For the orthonormal polynomials \tilde{p}_n(x) = p_n(x) / \sqrt{h_n}, Szegő-type asymptotics relate their growth to the density w of the absolutely continuous part of \mu on its support. Indeterminacy can be tested via this growth: the problem is determinate if \sum_{k=0}^\infty |\tilde{p}_k(z)|^2 = \infty for some z \in \mathbb{C} \setminus \mathbb{R}, while finite sums indicate multiple representing measures.[19][20]

The connection to continued fractions further links the moments to the Stieltjes transform S(z) = \int \frac{d\mu(x)}{x - z} via the three-term recurrence of the orthogonal polynomials. The Stieltjes transform admits a continued fraction expansion whose partial denominators are the orthogonal polynomials p_n(z), with convergents \frac{q_n(z)}{p_n(z)} approximating S(z), where the polynomials of the second kind q_n satisfy a parallel recurrence.
This representation allows extraction of the recurrence coefficients from the moments and vice versa, aiding in the characterization of \mu.[21]

In the indeterminate case, resolution involves different measures \mu corresponding to distinct analytic continuations of the moment-generating function (or Stieltjes transform) beyond its power series expansion. The Nevanlinna parametrization expresses these via entire functions A(z), B(z), C(z), D(z) of minimal exponential type, constructed from series involving the orthogonal polynomials, such as B(z) = \sum_{n=0}^\infty \frac{p_n(z)^2}{h_n}, which converge in the upper half-plane for indeterminate problems. Each parameter in the Nevanlinna family yields a unique continuation matching the moments.[22]

Finally, orthogonal polynomials enable practical applications through Gaussian quadrature rules on \mathbb{R}, approximating \int f(x) \, d\mu(x) \approx \sum_{j=1}^n w_j f(\xi_j), where \xi_j are the zeros of p_n and weights w_j = 1 / \sum_{k=0}^{n-1} \frac{p_k(\xi_j)^2}{h_k}. This provides exact integration for polynomials up to degree 2n-1, with convergence to the true integral as n \to \infty in determinate cases.[7]
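In the Gaussian case these quadrature rules are available directly: NumPy's `hermegauss` returns nodes and weights for the probabilists' weight e^{-x^2/2}, which after normalization by \sqrt{2\pi} reproduce the normal moments exactly (up to round-off) through degree 2n-1:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# 5-point Gauss quadrature for the standard normal measure: nodes are the
# zeros of He_5; hermegauss uses the weight e^{-x^2/2}, so normalize by √(2π).
nodes, weights = hermegauss(5)
weights = weights / np.sqrt(2 * np.pi)

# Exact for polynomials up to degree 2·5 - 1 = 9: check E[x^k] = (k-1)!!.
exact = {0: 1, 2: 1, 4: 3, 6: 15, 8: 105}
for k, v in exact.items():
    assert abs(np.sum(weights * nodes**k) - v) < 1e-9 * max(v, 1)
for k in (1, 3, 5, 7, 9):
    assert abs(np.sum(weights * nodes**k)) < 1e-9
```

For a general measure given only by moments, the nodes and weights would instead come from the recurrence coefficients via the Golub-Welsch eigenvalue method.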
Comparisons and Extensions
Relation to Stieltjes and Hausdorff Problems
The Stieltjes moment problem is a restriction of the Hamburger moment problem to measures supported on the non-negative real line [0, \infty). Given a sequence of moments s_n = \int_0^\infty x^n \, d\mu(x) for n = 0, 1, 2, \dots, the problem seeks a positive measure \mu on [0, \infty) realizing these moments. A necessary and sufficient condition for the existence of such a measure is the positive semidefiniteness of both the Hankel matrices H_n = (s_{i+j})_{i,j=0}^n and the shifted Hankel matrices H_n^{(1)} = (s_{i+j+1})_{i,j=0}^n for all n.[23][3]

In contrast, the Hausdorff moment problem concerns measures supported on the compact interval [0, 1], with moments s_n = \int_0^1 x^n \, d\mu(x). The existence of a representing measure is equivalent to the sequence \{s_n\} being completely monotonic, meaning that the finite differences satisfy \Delta^k s_n \geq 0 for all n, k \geq 0, where \Delta s_n = s_n - s_{n+1} and \Delta^{k+1} s_n = \Delta(\Delta^k s_n); indeed, \Delta^k s_n = \int_0^1 x^n (1-x)^k \, d\mu(x) \geq 0 for any measure on [0, 1]. Unlike the Hamburger and Stieltjes cases, the Hausdorff problem is always determinate, admitting a unique solution due to the compactness of the support, which prevents the proliferation of multiple measures.[24][3]

The Hamburger problem generalizes both by allowing support on the full real line \mathbb{R}, where existence requires only the positive semidefiniteness of the Hankel matrices H_n, without the additional shifted condition needed for Stieltjes. This broader support enables indeterminacy in the Hamburger case, where multiple measures may share the same moments, a phenomenon possible but less frequent in Stieltjes (as it inherits Hamburger indeterminacy only when both the original and shifted problems are indeterminate) and absent in Hausdorff.
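The complete-monotonicity test for the Hausdorff case (with the convention \Delta s_n = s_n - s_{n+1}, so \Delta^k s_n \geq 0) is easy to sketch on a finite prefix of a sequence; exact rational arithmetic avoids spurious sign flips from round-off:

```python
from fractions import Fraction

def completely_monotonic(seq):
    """Check Δ^k s_n ≥ 0 for every difference computable from a finite
    prefix, with Δ s_n = s_n − s_{n+1} (so Δ^k s_n = ∫ x^n (1−x)^k dμ)."""
    row = list(seq)
    while row:
        if any(x < 0 for x in row):
            return False
        row = [a - b for a, b in zip(row, row[1:])]   # apply Δ once
    return True

# Lebesgue measure on [0, 1]: s_n = 1/(n+1), a genuine Hausdorff sequence.
assert completely_monotonic([Fraction(1, n + 1) for n in range(12)])
# An increasing sequence cannot be a Hausdorff moment sequence.
assert not completely_monotonic([Fraction(1), Fraction(2)])
```

A finite prefix passing this test is only consistent with, not proof of, complete monotonicity of the full sequence, since higher differences draw on moments beyond the prefix.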
Every Stieltjes moment sequence is a valid Hamburger sequence, since a measure on [0, \infty) is also supported on \mathbb{R}, but the converse fails; for instance, the moments of the uniform distribution on [-1, 1], given by s_n = \frac{1}{n+1} for even n and 0 for odd n, solve a determinate Hamburger problem but cannot arise from a Stieltjes measure, because the support is not contained in [0, \infty).[24][23][3]
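The failure of the shifted Stieltjes condition for the uniform moments on [-1, 1] can be exhibited directly (a sketch; the helper is illustrative):

```python
import numpy as np

def hankel(moments, shift, n):
    """(n+1) x (n+1) Hankel matrix with entries m_{i+j+shift}."""
    return np.array([[moments[i + j + shift] for j in range(n + 1)]
                     for i in range(n + 1)])

# Uniform distribution on [-1, 1]: s_n = 1/(n+1) for even n, 0 for odd n.
s = [1.0 / (k + 1) if k % 2 == 0 else 0.0 for k in range(6)]

# Hamburger condition holds: the plain Hankel matrices are positive definite...
assert np.linalg.eigvalsh(hankel(s, 0, 2)).min() > 0
# ...but the shifted condition fails: H_1^{(1)} = [[0, 1/3], [1/3, 0]]
# has eigenvalues ±1/3, so no representing measure on [0, ∞) exists.
assert np.linalg.eigvalsh(hankel(s, 1, 1)).min() < 0
```

The negative eigenvalue of H_1^{(1)} reflects the odd moment \int x \cdot x^2 \, d\mu being zero while the mixed entries are positive, which is impossible for a measure confined to [0, \infty).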
Extensions to Other Settings
The strong Hamburger moment problem extends the classical formulation by considering bi-infinite sequences of moments s_n = \int_{\mathbb{R}} \lambda^n \, d\rho(\lambda) for n \in \mathbb{Z}, where \rho is a positive Borel measure on \mathbb{R}. This setup corresponds to moments indexed by m_{j+k} with j, k \in \mathbb{Z}, enabling the representation of Laurent polynomials \sum_{n \in \mathbb{Z}} c_n \lambda^n in the associated inner product space. A sequence \{s_n\} solves the problem if it is positive definite, meaning \sum_{j,k \in \mathbb{Z}} s_{j+k} f_j \overline{f_k} \geq 0 for all finite-support sequences \{f_n\} \subset \mathbb{C}, with uniqueness under suitable Carleman-type growth conditions. Orthogonal Laurent polynomials, generated via a Gram-Schmidt process on this space, play a central role in solving the problem and characterizing representing measures.

The multivariate Hamburger moment problem generalizes to measures on \mathbb{R}^d for d \geq 2, where moments are indexed by multi-indices \alpha = (\alpha_1, \dots, \alpha_d) \in \mathbb{N}_0^d with s_\alpha = \int_{\mathbb{R}^d} x^\alpha \, d\mu(x) and \mu a positive Borel measure. A necessary condition for existence is the positive semidefiniteness of the block moment matrices H_k = (s_{\alpha + \beta})_{|\alpha|, |\beta| \leq k} \succeq 0, structured as block matrices reflecting the multi-dimensional indexing; unlike in one dimension, this condition is in general no longer sufficient for d \geq 2. Uniqueness criteria, such as density of polynomials in weighted L^1(\mathbb{R}^d, w \, d\lambda) spaces, extend classical Carleman-type conditions but are more intricate due to the geometry of \mathbb{R}^d. This framework applies to multidimensional orthogonal polynomials and quadrature formulas.

Extensions to non-positive measures relax the positivity requirement, allowing signed or complex Borel measures on \mathbb{R} or subsets thereof.
For signed measures, a sequence solves the signed Hamburger problem if the associated Hankel matrices admit a decomposition into positive and negative parts, linking to indefinite inner products and Krein spaces. Complex measures extend this further, with moments s_n = \int_{\mathbb{C}} z^n \, d\mu(z) requiring analytic continuation properties, often addressed via truncated versions where only finitely many moments are given up to degree 2n. The truncated complex Hamburger problem, for instance, seeks representing measures on the complex plane via flat extensions of positive semidefinite moment matrices. These variants connect to optimization and control theory through semidefinite programming relaxations.

In the indeterminate case of the classical Hamburger problem, the N-extremal representing measures have discrete support on a countable subset of \mathbb{R}, with atoms accumulating only at infinity. This observation extends to moment problems restricted to countable supports, such as lattice points or specific discrete sets, where uniqueness fails if the support allows multiple mass distributions matching the moments. Applications include orthogonal Laurent polynomials on discrete supports excluding zero, facilitating expansions in bases like \{\lambda^n : n \in \mathbb{Z}\} for measures on \mathbb{Z} \setminus \{0\}. Such extensions arise in spectral theory of difference operators and quadrature on discrete grids.

Post-2000 developments link the Hamburger moment problem to free probability and random matrix theory, where spectral measures of non-commuting operators or empirical eigenvalue distributions solve indeterminate moment sequences via free convolutions. In free probability, moments of freely independent variables correspond to Hankel-positive sequences, with indeterminacy reflecting multiple free cumulant expansions; this models asymptotic freeness in large random matrices, as in the Marchenko-Pastur law.
Indeterminate cases also appear in quantum theory, particularly in the spectral theory of self-adjoint operators on Hilbert spaces, where the multiple representing measures correspond to distinct self-adjoint extensions of symmetric operators in quantum mechanics, such as position-type operators with non-trivial deficiency indices.