Chi distribution

The chi distribution, denoted \chi(k), is a continuous probability distribution in probability theory and statistics, representing the distribution of the Euclidean norm (or positive square root) of a k-dimensional random vector whose components are independent standard normal random variables, where k > 0 is the degrees-of-freedom parameter. Its probability density function is given by f(x; k) = \frac{2^{1 - k/2} x^{k-1} e^{-x^2/2}}{\Gamma(k/2)}, \quad x > 0, where \Gamma denotes the gamma function, and the distribution is supported on the positive real line. The mean of the chi distribution is \mu = \sqrt{2} \cdot \frac{\Gamma((k+1)/2)}{\Gamma(k/2)}, while the variance is \sigma^2 = k - \mu^2. The mode occurs at \sqrt{k-1} for k > 1, and higher moments can be expressed using the gamma function as E[X^r] = 2^{r/2} \cdot \frac{\Gamma((k+r)/2)}{\Gamma(k/2)}. The cumulative distribution function lacks a simple closed form but can be expressed using the regularized lower incomplete gamma function: F(x; k) = P\left(\frac{k}{2}, \frac{x^2}{2}\right), where P(a, z) = \frac{\gamma(a, z)}{\Gamma(a)} and \gamma is the lower incomplete gamma function. Special cases of the chi distribution include the half-normal distribution when k=1, the Rayleigh distribution (with scale parameter 1) when k=2, and the Maxwell–Boltzmann distribution when k=3. It is closely related to the chi-squared distribution, as the square of a chi-distributed random variable follows a chi-squared distribution with k degrees of freedom. In applications, the chi distribution appears in statistics for modeling the magnitude of random vectors drawn from normal populations, in signal processing via the Rayleigh case for envelope detection of narrowband signals, and in physics through the Maxwell–Boltzmann form for the speeds of gas particles; it also supports reliability analysis and numerical methods for finding critical values in hypothesis testing.

Introduction

Definition and Interpretation

The chi distribution with k degrees of freedom is the distribution of the positive square root of a chi-squared random variable with k degrees of freedom. This arises naturally in statistical contexts where the magnitude of deviations or errors is of interest, transforming the non-negative chi-squared values into a distribution supported on the positive reals. A key interpretation of the chi distribution is as the Euclidean norm (length) of a k-dimensional vector whose components are independent standard normal random variables. Geometrically, it represents the distance from the origin to a point in k-dimensional Euclidean space, where the point is randomly sampled from a multivariate normal distribution centered at the origin with identity covariance matrix. This connection provides intuition for applications in fields like signal processing and physics, where such norms model radial distances in high-dimensional noise.

Parameters and Support

The chi distribution is parameterized by a single positive real number k > 0, which represents the degrees of freedom and can take non-integer values. A random variable X following the chi distribution with parameter k, denoted X \sim \chi(k), has support on the interval [0, \infty), with probability zero for all x < 0. As a continuous probability distribution, P(X = 0) = 0 holds for any k > 0. The probability density approaches zero as x \to \infty. The standard form of the chi distribution assumes unit scale, arising as the norm of a k-dimensional vector of independent standard normal random variables; more general variants include the non-central and scaled chi distributions, though their additional parameters do not appear in the standard case.

Mathematical Formulation

Probability Density Function

The probability density function (PDF) of the chi distribution with k degrees of freedom is given by f(x; k) = \frac{2^{1 - k/2} x^{k-1} e^{-x^2 / 2}}{\Gamma(k/2)}, \quad x \geq 0, and f(x; k) = 0 otherwise, where \Gamma denotes the gamma function and k > 0. This PDF arises as the distribution of the Euclidean norm of a k-dimensional vector of independent standard normal random variables, or equivalently, as the square root of a chi-squared random variable with k degrees of freedom. To derive it, consider a random variable Y \sim \chi^2(k) with PDF g(y; k) = \frac{y^{k/2 - 1} e^{-y/2}}{2^{k/2} \Gamma(k/2)}, \quad y \geq 0. Apply the transformation X = \sqrt{Y}, so Y = X^2 and the Jacobian is |dy/dx| = 2x. The PDF of X is then f(x; k) = g(x^2; k) \cdot 2x = \frac{(x^2)^{k/2 - 1} e^{-x^2/2}}{2^{k/2} \Gamma(k/2)} \cdot 2x = \frac{2^{1 - k/2} x^{k-1} e^{-x^2 / 2}}{\Gamma(k/2)}, for x \geq 0. The formula is normalized such that \int_0^\infty f(x; k) \, dx = 1. To verify, substitute z = x^2, so dz = 2x \, dx, yielding \int_0^\infty f(x; k) \, dx = \int_0^\infty \frac{z^{k/2 - 1} e^{-z/2}}{2^{k/2} \Gamma(k/2)} \, dz. The remaining integral is a scaled gamma function, \int_0^\infty z^{k/2 - 1} e^{-z/2} \, dz = 2^{k/2} \Gamma(k/2), confirming that the total probability is 1. The shape of the PDF varies with k: for small k (e.g., 1 < k < 2), it is right-skewed, increasing to a mode before decreasing; as k increases, the distribution becomes less skewed and more symmetric, approaching a normal distribution for large k (e.g., k \geq 90).
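As a sanity check on the formula and its normalization, the closed-form PDF above can be compared against SciPy's reference implementation scipy.stats.chi; a minimal Python sketch:

```python
import numpy as np
from scipy.stats import chi
from scipy.special import gamma
from scipy.integrate import quad

def chi_pdf(x, k):
    """Closed-form chi PDF: 2^(1-k/2) x^(k-1) exp(-x^2/2) / Gamma(k/2)."""
    return 2.0 ** (1 - k / 2) * x ** (k - 1) * np.exp(-x ** 2 / 2) / gamma(k / 2)

k = 3.0
xs = np.linspace(0.1, 5.0, 50)
# Closed form agrees with scipy.stats.chi (df is the shape parameter)
assert np.allclose(chi_pdf(xs, k), chi.pdf(xs, k))

# Total probability integrates to 1 over [0, inf)
total, _ = quad(chi_pdf, 0, np.inf, args=(k,))
assert abs(total - 1.0) < 1e-8
```

The same check passes for non-integer k (e.g. k = 2.5), illustrating that the parameter need not be an integer.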

Cumulative Distribution Function

The cumulative distribution function (CDF) of the chi distribution with k degrees of freedom is given by F(x; k) = \frac{\gamma\left(\frac{k}{2}, \frac{x^2}{2}\right)}{\Gamma\left(\frac{k}{2}\right)}, for x \geq 0, where \gamma(s, z) denotes the lower incomplete gamma function and \Gamma(s) is the gamma function. This expression arises because the chi distribution is the distribution of the positive square root of a chi-squared random variable with k degrees of freedom, so F(x; k) equals the CDF of the chi-squared distribution evaluated at x^2. An equivalent form uses the upper incomplete gamma function \Gamma(s, z): F(x; k) = 1 - \frac{\Gamma\left(\frac{k}{2}, \frac{x^2}{2}\right)}{\Gamma\left(\frac{k}{2}\right)}, since \gamma(s, z) + \Gamma(s, z) = \Gamma(s). The lower-gamma form is commonly used due to the direct connection to the chi-squared distribution. To derive the CDF, integrate the probability density function of the chi distribution from 0 to x: F(x; k) = \int_0^x f(t; k) \, dt, where f(t; k) is the PDF. Substitute u = t^2 / 2, so du = t \, dt and t^{k-1} \, dt = 2^{(k/2)-1} u^{(k/2)-1} \, du. This transforms the integral into the CDF of a gamma distribution with shape parameter k/2 and scale 2 (equivalent to the chi-squared CDF at x^2), yielding the incomplete gamma expression above. The quantile function, or inverse CDF F^{-1}(p; k), has no closed-form expression and must be computed numerically, often by inverting the chi-squared quantile and taking the square root, or by root-finding on the incomplete gamma function. This numerical inversion is commonly employed in statistical software for generating random variates from the chi distribution via inverse transform sampling.
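The incomplete-gamma form of the CDF and the square-root route to the quantile can both be illustrated with SciPy, whose scipy.special.gammainc is exactly the regularized lower incomplete gamma function P(a, z); a short sketch:

```python
import numpy as np
from scipy.stats import chi, chi2
from scipy.special import gammainc  # regularized lower incomplete gamma P(a, z)

def chi_cdf(x, k):
    """F(x; k) = P(k/2, x^2/2), the regularized lower incomplete gamma."""
    return gammainc(k / 2, np.asarray(x) ** 2 / 2)

k = 4.0
xs = np.linspace(0.0, 6.0, 40)
# Matches SciPy's chi CDF everywhere on the support
assert np.allclose(chi_cdf(xs, k), chi.cdf(xs, k))

# Quantile: invert the chi-squared quantile, then take the square root
p = 0.9
q = np.sqrt(chi2.ppf(p, k))
assert np.isclose(chi.ppf(p, k), q)
```

Feeding uniform variates through this quantile route is one way to generate chi-distributed samples by inverse transform sampling.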

Generating Functions

The moment-generating function (MGF) of the chi distribution with k degrees of freedom is M(t; k) = \, _1F_1\left(\frac{k}{2}; \frac{1}{2}; \frac{t^2}{2}\right) + t\sqrt{2} \frac{\Gamma\left(\frac{k+1}{2}\right)}{\Gamma\left(\frac{k}{2}\right)} \, _1F_1\left(\frac{k+1}{2}; \frac{3}{2}; \frac{t^2}{2}\right), where _1F_1(\cdot; \cdot; \cdot) denotes Kummer's confluent hypergeometric function of the first kind. This expression is obtained by evaluating the defining integral M(t; k) = \mathbb{E}[e^{tX}] = \int_0^\infty e^{tx} f(x; k) \, dx, where f(x; k) is the probability density function of the chi distribution; substituting the PDF and integrating yields the hypergeometric form through known integral representations of the confluent hypergeometric function. Because the Gaussian factor e^{-x^2/2} in the density dominates e^{tx}, the integral converges for all real t, so the MGF is defined on the entire real line. The characteristic function (CF) of the chi distribution is similarly given by \phi(t; k) = \, _1F_1\left(\frac{k}{2}; \frac{1}{2}; -\frac{t^2}{2}\right) + it\sqrt{2} \frac{\Gamma\left(\frac{k+1}{2}\right)}{\Gamma\left(\frac{k}{2}\right)} \, _1F_1\left(\frac{k+1}{2}; \frac{3}{2}; -\frac{t^2}{2}\right), where i = \sqrt{-1} is the imaginary unit; this follows analogously from the integral definition \phi(t; k) = \mathbb{E}[e^{itX}] = \int_0^\infty e^{itx} f(x; k) \, dx, leading to the same hypergeometric structure via series expansion.

Properties

Moments and Cumulants

The raw moments of the chi distribution with k > 0 degrees of freedom are given by \mu_n = E[X^n] = 2^{n/2} \frac{\Gamma\left(\frac{k + n}{2}\right)}{\Gamma\left(\frac{k}{2}\right)}, where \Gamma denotes the gamma function. This expression follows from the integral representation of the expectation against the PDF and the defining integral of the gamma function. The mean, or first raw moment, is \mu = E[X] = \sqrt{2} \frac{\Gamma\left(\frac{k + 1}{2}\right)}{\Gamma\left(\frac{k}{2}\right)}. The second raw moment is E[X^2] = k, since X^2 follows a chi-squared distribution with k degrees of freedom. Consequently, the variance is \sigma^2 = \operatorname{Var}(X) = k - \mu^2. These moments characterize the location and scale of the distribution, with the mean increasing with k and the variance reflecting the spread relative to the chi-squared parent. The third raw moment is E[X^3] = 2\sqrt{2} \frac{\Gamma\left(\frac{k + 3}{2}\right)}{\Gamma\left(\frac{k}{2}\right)}. The skewness \gamma_1 = \frac{E[(X - \mu)^3]}{\sigma^3} is positive for all k > 0, indicating right-skewness, and decreases toward zero as k increases, reflecting the distribution's approach to normality. The third central moment E[(X - \mu)^3] is computed from the raw moments via E[X^3] - 3\mu k + 2\mu^3. The fourth raw moment is E[X^4] = k(k + 2). The excess kurtosis \gamma_2 = \frac{E[(X - \mu)^4]}{\sigma^4} - 3 is positive for finite k and approaches 0 as k \to \infty, consistent with the normal approximation for large k. The fourth central moment E[(X - \mu)^4] is computed from the raw moments via E[X^4] - 4\mu E[X^3] + 6\mu^2 k - 3\mu^4. The cumulants \kappa_r of the chi distribution are derived from the cumulant-generating function \log M(t), where M(t) is the moment-generating function. Only the first two cumulants have simple closed forms: \kappa_1 = \mu (the mean) and \kappa_2 = \sigma^2 (the variance); higher-order cumulants lack elementary expressions and require evaluation via the full moment-generating function, which involves confluent hypergeometric functions.
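The moment formulas above are easy to verify against SciPy; the following sketch checks the raw-moment formula, the identity E[X^2] = k, the variance, and the skewness computed from the third central moment:

```python
import numpy as np
from scipy.special import gamma
from scipy.stats import chi

def raw_moment(n, k):
    """E[X^n] = 2^(n/2) Gamma((k+n)/2) / Gamma(k/2)."""
    return 2.0 ** (n / 2) * gamma((k + n) / 2) / gamma(k / 2)

k = 5.0
mu = raw_moment(1, k)
assert np.isclose(mu, chi.mean(k))
assert np.isclose(raw_moment(2, k), k)           # E[X^2] = k
assert np.isclose(chi.var(k), k - mu ** 2)       # sigma^2 = k - mu^2

# Skewness from the third central moment E[X^3] - 3*mu*k + 2*mu^3
m3 = raw_moment(3, k) - 3 * mu * k + 2 * mu ** 3
skew = m3 / (k - mu ** 2) ** 1.5
assert np.isclose(skew, chi.stats(k, moments='s'))
assert skew > 0                                   # right-skewed for all k > 0
```

Repeating the skewness check for increasing k shows it shrinking toward zero, matching the approach to normality described above.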

Entropy

The differential entropy H(X) of a continuous random variable X with probability density function f(x) is given by the integral H(X) = -\int_0^\infty f(x) \ln f(x) \, dx. For the chi distribution with k > 0 degrees of freedom, the PDF is f(x) = \frac{2^{1 - k/2} x^{k-1} e^{-x^2/2}}{\Gamma(k/2)}, \quad x > 0. Taking logarithms yields \ln f(x) = (1 - k/2) \ln 2 + (k-1) \ln x - x^2/2 - \ln \Gamma(k/2). The entropy integral then becomes H(X) = -(1 - k/2) \ln 2 - (k-1) \mathbb{E}[\ln X] + \frac{1}{2} \mathbb{E}[X^2] + \ln \Gamma(k/2). Using the known moments of the distribution, \mathbb{E}[X^2] = k and \mathbb{E}[\ln X] = \frac{1}{2} \left( \psi\left( \frac{k}{2} \right) + \ln 2 \right), where \psi is the digamma function, this simplifies to H(X) = \frac{k}{2} - \frac{1}{2} \ln 2 + \ln \Gamma\left( \frac{k}{2} \right) - \frac{k-1}{2} \psi\left( \frac{k}{2} \right). The expectation \mathbb{E}[\ln X] is obtained by differentiating the gamma-function integral under the change of variables z = x^2/2 relating the chi density to the gamma density, which introduces the digamma function. The entropy H(X) increases with k, as larger degrees of freedom broaden the bulk of the distribution and increase its uncertainty. For the special case k=2 (corresponding to the unit-scale Rayleigh distribution), H(X) \approx 0.942.
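The closed-form entropy can be cross-checked against SciPy's built-in entropy computation for the chi distribution, using scipy.special.gammaln and digamma for numerical stability:

```python
import numpy as np
from scipy.special import gammaln, digamma
from scipy.stats import chi

def chi_entropy(k):
    """H(X) = ln Gamma(k/2) + k/2 - (ln 2)/2 - ((k-1)/2) psi(k/2)."""
    return gammaln(k / 2) + k / 2 - 0.5 * np.log(2) \
        - (k - 1) / 2 * digamma(k / 2)

# Closed form matches scipy.stats.chi.entropy for several k
for k in (1.0, 2.0, 3.0, 10.0):
    assert np.isclose(chi_entropy(k), chi(k).entropy())

# k = 2 is the unit-scale Rayleigh case, H(X) ~ 0.942
assert abs(chi_entropy(2.0) - 0.942) < 1e-3
```

Evaluating chi_entropy on an increasing grid of k also confirms the monotone growth of the entropy with the degrees of freedom.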

Mode, Median, and Approximations

The mode of the chi distribution with parameter k > 1 is located at \sqrt{k - 1}, obtained by setting the derivative of the density to zero, which yields the equation x^2 = k - 1. For 0 < k \leq 1, the mode is at 0, as the density is monotonically decreasing. The mode closely approximates the mean for large k. The median of the chi distribution has no closed-form expression and must generally be computed numerically from the cumulative distribution function. For practical purposes, an approximation for the median m is m \approx \sqrt{k - 2/3}, which provides reasonable accuracy even for moderate k. This follows from taking the square root of the corresponding approximation k - 2/3 for the median of the related chi-squared distribution. For large values of k, the chi distribution is well-approximated by a normal distribution via the central limit theorem applied to the sum of squares of the underlying standard normal variables, or equivalently through the delta method applied to the chi-squared approximation. Specifically, a chi-distributed random variable X satisfies X \approx \mathcal{N}\left(\sqrt{k - 1/2}, 1/2\right), where the mean \sqrt{k - 1/2} refines the leading-order term \sqrt{k} and the variance approaches 1/2 independently of k. This normal approximation is particularly useful for tail probabilities and confidence intervals when k \gtrsim 30. The median and mode also align closely with this mean under the approximation. The upper tail behavior for fixed k as x \to \infty follows from asymptotic analysis of the incomplete gamma function in the cumulative distribution function, yielding P(X > x) \sim \frac{x^{k-2} e^{-x^2/2}}{2^{k/2-1} \Gamma(k/2)}. This Mills-ratio-like approximation arises because the hazard rate of the chi distribution is asymptotically x, making the survival function comparable to the density divided by x. It is effective for computing large-deviation probabilities without numerical integration.
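The mode, the median approximation, and the refined normal approximation can all be probed numerically; the sketch below checks them for k = 10 (the 0.01 tolerances are illustrative choices, not theoretical bounds):

```python
import numpy as np
from scipy.stats import chi, norm

k = 10.0

# Mode sqrt(k-1): the density is locally maximal there
mode = np.sqrt(k - 1)
assert chi.pdf(mode, k) > chi.pdf(mode - 0.01, k)
assert chi.pdf(mode, k) > chi.pdf(mode + 0.01, k)

# Median approximation sqrt(k - 2/3) against the exact numerical median
median = chi.ppf(0.5, k)
assert abs(median - np.sqrt(k - 2 / 3)) < 0.01

# Normal approximation N(sqrt(k - 1/2), 1/2) for an upper-tail probability
x = 4.0
exact = chi.sf(x, k)
approx = norm.sf(x, loc=np.sqrt(k - 0.5), scale=np.sqrt(0.5))
assert abs(exact - approx) < 0.01
```

For larger k both approximations tighten further, consistent with the k \gtrsim 30 guidance in the text.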

Connection to Chi-Squared

The chi distribution with k degrees of freedom arises as the distribution of the positive square root of a random variable following a central chi-squared distribution with the same k degrees of freedom. Specifically, if Y \sim \chi^2(k), then X = \sqrt{Y} \sim \chi(k). This relationship can be established through a change-of-variable transformation on the chi-squared density. Given the density of Y, the density of X is obtained by substituting y = x^2 and applying the Jacobian factor of 2x, yielding the standard form of the chi density after simplification. The parameter k, representing degrees of freedom in both distributions, corresponds to the dimensionality underlying their constructions from standard normal variables. For the chi-squared distribution, Y is the sum of squares of k independent standard normal random variables, Y = \sum_{i=1}^k Z_i^2 where Z_i \sim N(0,1). Consequently, for the chi distribution, X = \sqrt{Y} = \left( \sum_{i=1}^k Z_i^2 \right)^{1/2} represents the Euclidean norm (or length) of the same k-dimensional standard normal vector. This connection extends to the noncentral case: if Y \sim \chi^2(k, \lambda) follows a noncentral chi-squared distribution with k degrees of freedom and noncentrality parameter \lambda > 0, then X = \sqrt{Y} follows a noncentral chi distribution with parameters k and \lambda. The central case (\lambda = 0) is the primary focus here, reducing to the standard chi and chi-squared distributions. In statistics, the chi-squared distribution underpins chi-squared tests for goodness-of-fit, independence, and variance, leveraging its asymptotic properties under the null hypothesis. In contrast, the chi distribution emerges in contexts involving root-mean-square errors or magnitudes of normal errors, such as signal processing or measurement precision analysis, where the square-root transformation converts aggregated squared errors back to the original scale.
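Both constructions, the square root of a chi-squared variate and the norm of a standard normal vector, can be verified by simulation with a Kolmogorov–Smirnov test against the chi CDF (sample size and seed here are arbitrary illustrative choices):

```python
import numpy as np
from scipy.stats import chi, chi2, kstest

rng = np.random.default_rng(0)
k = 4

# Square roots of chi-squared draws should follow chi(k)
y = chi2.rvs(k, size=20000, random_state=rng)
stat1, p1 = kstest(np.sqrt(y), chi(k).cdf)
assert p1 > 0.001

# Equivalently, Euclidean norms of k-dimensional standard normal vectors
z = rng.standard_normal((20000, k))
stat2, p2 = kstest(np.linalg.norm(z, axis=1), chi(k).cdf)
assert p2 > 0.001
```

Failing to reject at any reasonable level in both tests illustrates that the two constructions define the same distribution.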

Special Cases and Generalizations

The chi distribution with one degree of freedom is known as the half-normal distribution, which arises as the absolute value of a standard normal random variable, and its probability density function is given by f(x) = \sqrt{\frac{2}{\pi}} e^{-x^2/2}, \quad x \geq 0. For two degrees of freedom, the chi distribution corresponds to the Rayleigh distribution with unit scale, commonly applied in signal processing for modeling the magnitude of two-dimensional vectors with normally distributed components, with f(x) = x e^{-x^2/2}, \quad x \geq 0. The case of three degrees of freedom yields the Maxwell–Boltzmann distribution (up to a scale factor), which describes the speeds of particles in an ideal gas, and its probability density function is f(x) = \sqrt{\frac{2}{\pi}} x^2 e^{-x^2/2}, \quad x \geq 0. A key generalization is the non-central chi distribution, which extends the standard form to the Euclidean norm of a multivariate normal vector with non-zero mean, introducing a non-centrality parameter that shifts the distribution. The scaled chi distribution incorporates a scale parameter \sigma > 0 by multiplying the standard chi variable by \sigma, adjusting the distribution for varying variances in the underlying normals. For integer k, the cumulative distribution function of the chi distribution admits closed-form simplifications: even k values involve finite sums of exponential terms, while odd k values incorporate the error function alongside exponential terms.
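SciPy implements each special case as a named distribution, so the identifications for k = 1, 2, 3 can be confirmed directly by comparing densities:

```python
import numpy as np
from scipy.stats import chi, halfnorm, rayleigh, maxwell

xs = np.linspace(0.01, 5.0, 100)
assert np.allclose(chi.pdf(xs, 1), halfnorm.pdf(xs))   # k = 1: half-normal
assert np.allclose(chi.pdf(xs, 2), rayleigh.pdf(xs))   # k = 2: unit-scale Rayleigh
assert np.allclose(chi.pdf(xs, 3), maxwell.pdf(xs))    # k = 3: Maxwell-Boltzmann
```

In each comparison the named distribution is taken at its standard (unit-scale) parameterization, matching the standard chi form.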

Applications and Estimation

Applications in Statistics and Physics

In statistics, the chi distribution arises as the distribution of the sample standard deviation when sampling from a normal population: the scaled sample standard deviation follows a chi distribution with n-1 degrees of freedom, specifically \sqrt{n-1} \, s / \sigma \sim \chi_{n-1}, with s denoting the sample standard deviation and \sigma the population standard deviation. This connection enables the construction of confidence intervals for the variance using the related chi-squared distribution, as the square of the scaled sample standard deviation yields a chi-squared random variable, allowing bounds on \sigma^2 that transform directly into intervals for the standard deviation. In physics, the chi distribution with three degrees of freedom describes the Maxwell–Boltzmann distribution of speeds of particles in an ideal gas at thermal equilibrium, where the probability density for molecular speeds matches the chi form scaled by temperature and mass parameters. In wireless communications, the Rayleigh distribution, a special case of the chi distribution with two degrees of freedom, models the envelope of received signals in fading channels, capturing amplitude variations due to multipath propagation without a dominant line-of-sight path. In engineering applications, the chi distribution characterizes root-mean-square (RMS) noise levels in systems with Gaussian noise, as the RMS value of a Gaussian process follows a scaled chi distribution, aiding noise predictions and system performance analysis. For radar systems, detection thresholds are set using the chi distribution to model signal envelopes in Gaussian clutter, particularly for fluctuating targets, where the probability of detection is computed from chi-distributed amplitudes to maintain constant false alarm rates. Historically, the chi distribution relates indirectly to Karl Pearson's chi-squared test, introduced in 1900 for assessing goodness-of-fit, where the square root of the chi-squared statistic provides a measure of discrepancy magnitude in early 20th-century statistical practice.
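The sample-standard-deviation connection can be demonstrated by simulation: drawing many normal samples of size n, scaling each sample's s by \sqrt{n-1}/\sigma, and testing the result against \chi_{n-1} (sample size, seed, and \sigma below are illustrative choices):

```python
import numpy as np
from scipy.stats import chi, kstest

rng = np.random.default_rng(42)
n, sigma = 10, 2.0

# 20000 independent normal samples of size n
samples = rng.normal(0.0, sigma, size=(20000, n))
s = samples.std(axis=1, ddof=1)          # sample standard deviation
scaled = np.sqrt(n - 1) * s / sigma      # should follow chi with n-1 dof

stat, pvalue = kstest(scaled, chi(n - 1).cdf)
assert pvalue > 0.001

# The mean of the scaled statistic matches the chi mean, which is why
# s is a (slightly) biased estimator of sigma
assert abs(scaled.mean() - chi(n - 1).mean()) < 0.02
```

The last check makes the well-known bias of s visible: chi(n-1).mean() is below \sqrt{n-1}, so E[s] < \sigma.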

Parameter Estimation Methods

The maximum likelihood estimator (MLE) for the degrees-of-freedom parameter k of the chi distribution is obtained by maximizing the log-likelihood function \ell(k) = \sum_{i=1}^n \log f(x_i; k), where f(x; k) is the probability density function of the chi distribution. This leads to a transcendental equation involving the digamma function \psi: \psi\left(\frac{k}{2}\right) = \frac{1}{n} \sum_{i=1}^n \log(x_i^2) - \ln 2. Since no closed-form solution exists, numerical methods such as Newton–Raphson iteration are required to solve for \hat{k}. The method of moments provides an alternative approach by equating a theoretical moment of the chi distribution to the corresponding sample moment. The mean is given by \mathbb{E}[X] = \sqrt{2} \frac{\Gamma((k+1)/2)}{\Gamma(k/2)}, so the estimating equation is \bar{x} = \sqrt{2} \frac{\Gamma((k+1)/2)}{\Gamma(k/2)}, which must be solved iteratively for \hat{k}, often starting from an initial approximation such as \hat{k} \approx \bar{x}^2. Alternatively, the second moment can be matched, leveraging the fact that \mathbb{E}[X^2] = k, yielding \hat{k} = \frac{1}{n} \sum_{i=1}^n x_i^2, which is unbiased but less efficient than the MLE for small samples. The MLE \hat{k} is biased downward for small sample sizes n, with the bias becoming negligible as n increases; an approximate bias correction can be applied using higher-order expansions involving the polygamma functions. The asymptotic variance of the MLE is \frac{4}{n \, \psi'(k/2)}, where \psi' is the trigamma function, derived from the inverse Fisher information, providing standard errors for inference. For large k, \psi'(k/2) \approx 2/k, so this variance approximates \frac{2k}{n}. To assess the fit of the estimated chi distribution, goodness-of-fit tests such as the Kolmogorov–Smirnov (KS) test can be employed. The KS test compares the empirical cumulative distribution function \hat{F}_n(x) of the sample to the theoretical CDF F(x; \hat{k}) of the chi distribution via the statistic D_n = \sup_x | \hat{F}_n(x) - F(x; \hat{k}) |, which under the null hypothesis converges to the Kolmogorov distribution for large n; rejection occurs if D_n exceeds the corresponding critical values, though the standard critical values are only approximate when \hat{k} is estimated from the same data.
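Both estimators are straightforward to implement; the sketch below recovers k from simulated data, solving the digamma equation by bracketing root-finding (brentq) rather than Newton–Raphson, with the bracket [0.1, 100] and tolerances chosen for illustration:

```python
import numpy as np
from scipy.stats import chi
from scipy.special import digamma
from scipy.optimize import brentq

rng = np.random.default_rng(1)
k_true = 5.0
x = chi.rvs(k_true, size=5000, random_state=rng)

# Method of moments on the second moment: E[X^2] = k
k_mom = np.mean(x ** 2)

# MLE: solve psi(k/2) = mean(log(x^2)) - ln 2; digamma is increasing,
# so a simple bracketing root-finder suffices
target = np.mean(np.log(x ** 2)) - np.log(2)
k_mle = brentq(lambda k: digamma(k / 2) - target, 0.1, 100.0)

assert abs(k_mom - k_true) < 0.3
assert abs(k_mle - k_true) < 0.3
```

With n = 5000 both estimates land close to the true k = 5; at much smaller n the spread widens roughly like the asymptotic variance 4/(n \psi'(k/2)).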
