
Noncentral t-distribution

The noncentral t-distribution is a continuous probability distribution that serves as a generalization of Student's t-distribution, incorporating a noncentrality parameter to model scenarios where the underlying normal variable has a nonzero mean, such as when a statistical null hypothesis is false. It is defined as the distribution of the ratio T = \frac{Z + \delta}{\sqrt{W / \nu}}, where Z follows a standard normal distribution N(0, 1), W follows a central chi-squared distribution with \nu degrees of freedom, \delta is the noncentrality parameter (a real number representing the shift in the normal mean), and \nu > 0 is the degrees of freedom, with Z and W independent. When \delta = 0, the distribution reduces to the central Student's t-distribution. The probability density function of the noncentral t-distribution is given by
f(t; \nu, \delta) = \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{\nu \pi}\, \Gamma\left(\frac{\nu}{2}\right)} \left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}} e^{-\delta^2/2} \left\{ {}_1F_1\!\left(\frac{\nu+1}{2}; \frac{1}{2}; \frac{\delta^2 t^2}{2(\nu + t^2)}\right) + \frac{\sqrt{2}\, \delta t}{\sqrt{\nu + t^2}}\, \frac{\Gamma\left(\frac{\nu}{2}+1\right)}{\Gamma\left(\frac{\nu+1}{2}\right)}\, {}_1F_1\!\left(\frac{\nu}{2}+1; \frac{3}{2}; \frac{\delta^2 t^2}{2(\nu + t^2)}\right) \right\},
where {}_1F_1 denotes the confluent hypergeometric function; the density has no simpler closed form and is typically evaluated numerically. The cumulative distribution function involves the standard normal distribution function and regularized incomplete beta functions and is also computed algorithmically. For moments, the mean exists when \nu > 1 and equals \delta \sqrt{\nu / 2} \cdot \Gamma((\nu - 1)/2) / \Gamma(\nu / 2), while the variance exists when \nu > 2 and is \frac{\nu (1 + \delta^2)}{\nu - 2} - \mu^2, where \mu is the mean; for large \nu, the distribution is approximately normal with mean \delta and variance 1.
Key properties include unimodality, with the mode increasing in \delta and generally decreasing in \nu, and the distribution being skewed to the right for positive \delta. It is closely related to other noncentral distributions, such as the noncentral F-distribution (the square of a noncentral t random variable follows a noncentral F with 1 and \nu degrees of freedom), and arises naturally in quadratic forms of normal variables. In statistical applications, the noncentral t-distribution is fundamental for computing the power of one- and two-sample t-tests under alternative hypotheses, where the noncentrality parameter \delta = \sqrt{n} (\mu - \mu_0) / \sigma quantifies the standardized effect size. It also supports the construction of tolerance intervals for normal populations, confidence bounds on process capability indices like C_{pk}, and variables sampling plans in acceptance sampling (e.g., MIL-STD-414). Numerical evaluation of its distribution functions is implemented in statistical software libraries, enabling practical use in hypothesis testing and power analysis.

Introduction and Definition

Parameters and Support

The noncentral t-distribution is a continuous probability distribution that generalizes the central Student's t-distribution by introducing a noncentrality parameter representing a shift in the location of the distribution. This parameter accounts for scenarios where the underlying normal mean is not zero, such as in hypothesis testing under an alternative hypothesis. The standard parameterization involves two parameters: the degrees of freedom \nu > 0, which controls the shape and tails of the distribution similarly to the central t, and the noncentrality parameter \mu \in \mathbb{R}, which determines the extent of the shift. The support of the distribution is the entire set of real numbers \mathbb{R}. The noncentral t-distribution was first introduced into the statistical literature by Neyman and Tokarska in 1936, in the context of calculating type II errors for Student's t-test. Computational advancements, including derivations of its moments, were provided by Hogben, Pinkham, and Wilk around 1961 to facilitate practical applications. Common notation denotes a random variable following this distribution as T \sim t(\nu, \mu) or T \sim \text{NCT}(\nu, \mu). When \mu = 0, the distribution coincides with the central t-distribution.

Stochastic Representation

The noncentral t-distribution arises as the distribution of a random variable defined by the ratio of a shifted standard normal variate to the square root of an independent chi-squared variate scaled by its degrees of freedom. Specifically, if Z \sim N(0, 1) and V \sim \chi^2_\nu are independent, then the random variable T = \frac{Z + \mu}{\sqrt{V / \nu}} follows a noncentral t-distribution with degrees of freedom \nu > 0 and noncentrality parameter \mu \in \mathbb{R}. This representation highlights the independence between the numerator (a noncentral normal variate) and the denominator (a scaled central chi-squared variate), which is essential for deriving the distribution's properties. When \mu = 0, the representation reduces to the central t-distribution, as the numerator becomes a standard normal variate, recovering the classical Student's t under the null hypothesis of zero noncentrality. In this sense, the noncentral t generalizes the central case by incorporating a nonzero shift \mu in the numerator, reflecting deviations from centrality. This stochastic form originates in hypothesis testing scenarios, particularly one-sample t-tests, where the test statistic under the alternative hypothesis that the population mean differs from a hypothesized value follows a noncentral t-distribution. Here, \mu represents a standardized measure of the deviation, often expressed as \mu = \sqrt{n} \cdot \delta / \sigma, with n the sample size, \delta the true mean shift, and \sigma the population standard deviation.
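
The representation above can be checked directly by simulation. The following sketch (assuming NumPy and SciPy are available; parameter values, the seed, and the sample size are illustrative) generates variates from the ratio definition and compares their empirical distribution with scipy.stats.nct.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
nu, mu = 8.0, 1.5                          # illustrative degrees of freedom and noncentrality

z = rng.standard_normal(100_000)           # Z ~ N(0, 1)
v = rng.chisquare(df=nu, size=100_000)     # V ~ chi^2_nu, independent of Z
t_samples = (z + mu) / np.sqrt(v / nu)     # T = (Z + mu) / sqrt(V / nu)

# A Kolmogorov-Smirnov comparison against the library CDF should give a large p-value.
ks = stats.kstest(t_samples, lambda x: stats.nct.cdf(x, nu, mu))
print(ks.statistic, ks.pvalue)
```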

Density and Distribution Functions

Probability Density Function

The probability density function (PDF) of the noncentral t-distribution with degrees of freedom \nu > 0 and noncentrality parameter \delta \in \mathbb{R} is given by the infinite series f(t; \nu, \delta) = \frac{\nu^{\nu/2} e^{-\delta^2/2}}{\sqrt{\pi} \Gamma(\nu/2)} \frac{1}{(\nu + t^2)^{(\nu + 1)/2}} \sum_{i=0}^{\infty} \frac{\Gamma\left[(\nu + i + 1)/2\right]}{i!} \left( \frac{t \delta \sqrt{2}}{\sqrt{\nu + t^2}} \right)^i, where t \in \mathbb{R} and \Gamma(\cdot) denotes the gamma function. This series representation arises from the mixture interpretation of the noncentral normal variate in the numerator and reduces to the central t-distribution PDF when \delta = 0. For \delta > 0, the PDF shifts to the right relative to the central case (\delta = 0) and exhibits positive skewness, with the mode typically exceeding zero; the tails asymptotically resemble those of the central t-distribution, decaying as |t|^{-(\nu + 1)}.
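
As a sketch of how this series can be evaluated in practice (assuming NumPy/SciPy; the helper nct_pdf_series and the truncation length are illustrative choices, not a library routine), the sum can be computed in log space and compared against scipy.stats.nct.pdf:

```python
import numpy as np
from scipy import stats
from scipy.special import gammaln

def nct_pdf_series(t, nu, delta, terms=200):
    """Truncated series evaluation of the noncentral t density at a scalar t."""
    log_c = (0.5 * nu * np.log(nu) - 0.5 * delta**2 - 0.5 * np.log(np.pi)
             - gammaln(0.5 * nu) - 0.5 * (nu + 1.0) * np.log(nu + t**2))
    base = t * delta * np.sqrt(2.0) / np.sqrt(nu + t**2)
    if base == 0.0:
        # only the i = 0 term survives (central case or t = 0)
        return np.exp(log_c + gammaln(0.5 * (nu + 1.0)))
    i = np.arange(terms)
    log_mag = gammaln(0.5 * (nu + i + 1.0)) - gammaln(i + 1.0) + i * np.log(abs(base))
    signs = np.sign(base) ** i          # odd-order terms flip sign when t * delta < 0
    return np.exp(log_c) * np.sum(signs * np.exp(log_mag))

nu, delta = 5.0, 2.0
for t in (-1.0, 0.5, 3.0):
    print(nct_pdf_series(t, nu, delta), stats.nct.pdf(t, nu, delta))
```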

Cumulative Distribution Function

The cumulative distribution function (CDF) of the noncentral t-distribution with degrees of freedom \nu > 0 and noncentrality parameter \delta \in \mathbb{R} is defined as F(t; \nu, \delta) = P(T \leq t), where T follows the noncentral t-distribution, denoted T \sim t'(\nu, \delta). This function gives the probability that the random variable T does not exceed the value t \in \mathbb{R}. The noncentral t-distribution exhibits a symmetry property in its CDF: F(t; \nu, \delta) = 1 - F(-t; \nu, -\delta). This relation allows computation for negative t by leveraging values for positive arguments with the negated noncentrality parameter, reducing redundant calculations. An explicit representation for the CDF when t \geq 0 involves the standard normal CDF \Phi(\cdot) and the regularized incomplete beta function I_x(a, b), expressed as a Poisson-weighted mixture: F(t; \nu, \delta) = \Phi(-\delta) + \frac{1}{2} \sum_{i=0}^{\infty} \left[ P_i \, I_x\left(i + \frac{1}{2}, \frac{\nu}{2}\right) + \frac{\delta}{\sqrt{2}} \, Q_i \, I_x\left(i + 1, \frac{\nu}{2}\right) \right], where x = t^2 / (\nu + t^2), P_i = e^{-\delta^2/2} (\delta^2/2)^i / i!, and Q_i = e^{-\delta^2/2} (\delta^2/2)^i / \Gamma(i + 3/2). The series converges rapidly for moderate \delta, with terms diminishing after approximately \delta^2/2 + 10 iterations. For t < 0, the symmetry property is applied. This form arises from the stochastic representation of the noncentral t as a ratio involving a noncentral normal numerator and a central chi-squared denominator, leading to a discrete Poisson-weighted mixture representation. No simple closed-form expression exists for the CDF, necessitating series expansions, numerical integration, or recursive algorithms for evaluation. Early computational efforts, including tables of percentage points and moments to facilitate approximations, were advanced by Hogben et al. (1961), who derived exact moments up to the fourth order to support initial numerical implementations. Modern software libraries implement the above series or equivalent methods for precise computation.
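
A minimal sketch of the Poisson-weighted incomplete-beta series above (assuming NumPy/SciPy; the truncation length and the helper name nct_cdf_series are illustrative) is shown below, with scipy.stats.nct.cdf used as a reference:

```python
import numpy as np
from scipy import stats
from scipy.special import betainc, gammaln

def nct_cdf_series(t, nu, delta, terms=200):
    """Series CDF; uses the symmetry F(t; nu, delta) = 1 - F(-t; nu, -delta) for t < 0."""
    if delta == 0.0:
        return stats.t.cdf(t, nu)            # central case
    if t < 0.0:
        return 1.0 - nct_cdf_series(-t, nu, -delta, terms)
    x = t**2 / (nu + t**2)
    i = np.arange(terms)
    log_w = -0.5 * delta**2 + i * np.log(0.5 * delta**2)
    p = np.exp(log_w - gammaln(i + 1.0))     # Poisson(delta^2 / 2) weights P_i
    q = np.exp(log_w - gammaln(i + 1.5))     # Q_i weights with Gamma(i + 3/2)
    series = (p * betainc(i + 0.5, 0.5 * nu, x)
              + (delta / np.sqrt(2.0)) * q * betainc(i + 1.0, 0.5 * nu, x))
    return stats.norm.cdf(-delta) + 0.5 * np.sum(series)

nu, delta = 10.0, 1.5
for t in (-1.0, 0.0, 2.5):
    print(nct_cdf_series(t, nu, delta), stats.nct.cdf(t, nu, delta))
```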

Properties

Moments

The mean of a noncentral t-distributed random variable T with degrees of freedom \nu > 1 and noncentrality parameter \delta is given by \mathbb{E}[T] = \delta \sqrt{\frac{\nu}{2}} \frac{\Gamma\left(\frac{\nu-1}{2}\right)}{\Gamma\left(\frac{\nu}{2}\right)}, where Γ denotes the gamma function; the mean is undefined for \nu ≤ 1. This expression arises from the stochastic representation of T and the moments of the chi-squared denominator. The variance exists for \nu > 2 and is \mathrm{Var}(T) = \frac{\nu (1 + \delta^2)}{\nu - 2} - \frac{\delta^2 \nu}{2} \left[ \frac{\Gamma\left(\frac{\nu-1}{2}\right)}{\Gamma\left(\frac{\nu}{2}\right)} \right]^2. When \delta = 0, this simplifies to the variance of the central t-distribution, \nu / (\nu - 2); for nonzero \delta, the variance exceeds this value, indicating greater spread due to the shift in location. Skewness and kurtosis of the noncentral t-distribution are expressed in terms of ratios of gamma functions and depend on both \nu and \delta, with skewness increasing in magnitude as |\delta| grows, reflecting greater asymmetry relative to the central case. Higher-order moments follow a general form involving confluent hypergeometric functions but exist only conditionally; specifically, the r-th moment is finite only when \nu > r, so every fixed-order moment exists for sufficiently large \nu. Nonzero \delta introduces a nonzero mean (bias) absent in the central t-distribution while amplifying overall variability across moments.
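
The mean and variance formulas can be verified numerically; the sketch below (assuming SciPy, with illustrative parameter values) evaluates them via log-gamma functions and compares with scipy.stats.nct.stats:

```python
import numpy as np
from scipy import stats
from scipy.special import gammaln

def nct_mean_var(nu, delta):
    # c = sqrt(nu/2) * Gamma((nu-1)/2) / Gamma(nu/2), defined for nu > 1
    c = np.sqrt(nu / 2.0) * np.exp(gammaln((nu - 1.0) / 2.0) - gammaln(nu / 2.0))
    mean = delta * c                                    # requires nu > 1
    var = nu * (1.0 + delta**2) / (nu - 2.0) - mean**2  # requires nu > 2
    return mean, var

nu, delta = 7.0, 1.2
print(nct_mean_var(nu, delta))
print(stats.nct.stats(nu, delta, moments="mv"))
```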

Mode and Asymmetry

The noncentral t-distribution with degrees of freedom \nu > 0 and noncentrality parameter \delta \in \mathbb{R} is strictly unimodal, possessing a single mode where the probability density function (PDF) achieves its global maximum. This unimodality holds regardless of the value of \delta, distinguishing the distribution's bell-shaped form from potential multimodal behaviors in other noncentral cases, and follows from log-concavity properties of the underlying density derived via total positivity arguments. The mode m is located by solving the equation obtained from setting the first derivative of the PDF to zero, which yields a unique root because the density strictly increases up to the mode and decreases thereafter. The mode's location is strictly increasing in \delta and has the same sign as \delta, with the symmetry m(-\delta) = -m(\delta). As \nu \to \infty, the distribution converges to a normal distribution with mean \delta and variance 1, so the mode approaches \delta. The noncentral t-distribution exhibits asymmetry, with positive skewness when \delta > 0, manifesting as a heavier right tail compared to the left, while negative skewness occurs for \delta < 0. This skewness is quantified by the standardized third central moment, \gamma_1 = \mu_3 / \sigma^3, where \mu_3 is the third central moment and \sigma^2 is the variance; explicit expressions for these moments are provided by Hogben et al. (1961). As \nu \to \infty, the skewness \gamma_1 \to 0, recovering the symmetry of the limiting normal distribution. Qualitatively, for fixed \nu, increasing |\delta| shifts the mode away from zero and amplifies the tail imbalance, enhancing the overall asymmetry.
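
Because the density is unimodal, the mode can be located by a bounded one-dimensional search on the negative PDF. The following sketch (assuming SciPy; the search bounds are an illustrative heuristic) shows the mode moving toward \delta as \nu grows:

```python
from scipy import stats
from scipy.optimize import minimize_scalar

def nct_mode(nu, delta):
    # The density is unimodal, so a bounded scalar search on -pdf finds the mode.
    res = minimize_scalar(lambda t: -stats.nct.pdf(t, nu, delta),
                          bounds=(delta - 10.0, delta + 10.0), method="bounded")
    return res.x

for nu in (3, 10, 100):
    print(nu, nct_mode(nu, 2.0))   # the mode moves toward delta = 2 as nu grows
```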

Special and Limiting Cases

When the noncentrality parameter \delta = 0, the noncentral t-distribution with \nu degrees of freedom reduces to the central Student's t-distribution with the same \nu. As the degrees of freedom \nu \to \infty, the noncentral t-distribution converges in distribution to a normal distribution N(\delta, 1). In the special case where \delta = 0 and \nu \to \infty, the noncentral t-distribution converges to the standard normal distribution N(0,1). For small degrees of freedom, such as \nu = 1, the noncentral t-distribution corresponds to a noncentral Cauchy distribution, which inherits the heavy tails and lack of finite moments from the central Cauchy case. Specifically, no mean or variance exists for \nu = 1, regardless of \delta. As the absolute value of the noncentrality parameter |\delta| \to \infty, the distribution shifts its location indefinitely away from zero, with the cumulative distribution function at any fixed point t approaching 0 for \delta \to +\infty and 1 for \delta \to -\infty, while retaining a spread characteristic of the t-distribution scaled by the degrees of freedom.
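
The normal limit can be illustrated numerically; the sketch below (assuming SciPy, with illustrative values) measures the maximum gap between the noncentral t CDF and the N(\delta, 1) CDF as \nu increases:

```python
import numpy as np
from scipy import stats

delta = 1.5
x = np.linspace(-2.0, 6.0, 9)
for nu in (5, 50, 500):
    gap = np.max(np.abs(stats.nct.cdf(x, nu, delta) - stats.norm.cdf(x, loc=delta, scale=1.0)))
    print(nu, gap)   # the maximum CDF gap shrinks as nu grows
```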

Connections to Other Distributions

The noncentral t-distribution with degrees of freedom \nu and noncentrality parameter \delta is stochastically represented as T = \frac{Z + \delta}{\sqrt{\chi^2_\nu / \nu}}, where Z \sim N(0, 1) is independent of the central chi-squared random variable \chi^2_\nu. The numerator Z + \delta follows a normal distribution N(\delta, 1), and thus (Z + \delta)^2 follows a noncentral chi-squared distribution with 1 degree of freedom and noncentrality parameter \delta^2. This relationship highlights how the noncentral chi-squared enters as a building block in the construction of the noncentral t, facilitating derivations of moments and tail probabilities through properties of the chi-squared family. Furthermore, the square T^2 of a noncentral t random variable follows a noncentral F-distribution with 1 and \nu degrees of freedom and noncentrality parameter \delta^2. This transformation links the noncentral t directly to the noncentral F, which is itself defined as the ratio of a noncentral chi-squared to an independent central chi-squared, each scaled by its degrees of freedom. Such connections are instrumental in extending univariate results to ratio-based statistics in broader statistical testing frameworks. In higher dimensions, the noncentral multivariate t-distribution generalizes the univariate noncentral t to p-dimensional random vectors, where the location parameter is a nonzero mean vector \boldsymbol{\delta} \in \mathbb{R}^p and the scale matrix reflects a positive definite covariance structure. This distribution arises as the marginal of a noncentral multivariate normal random vector when its covariance matrix is integrated over an inverse-Wishart distribution (equivalently, a Wishart-distributed precision matrix).
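
The relation between T^2 and the noncentral F-distribution can be checked by simulation. The sketch below (assuming SciPy; sample size, seed, and parameters are illustrative) applies a Kolmogorov–Smirnov test of squared noncentral t variates against the noncentral F distribution with 1 and \nu degrees of freedom and noncentrality \delta^2:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
nu, delta = 6.0, 1.8
t_samples = stats.nct.rvs(nu, delta, size=50_000, random_state=rng)

# KS test of T^2 against the noncentral F distribution with dfn=1, dfd=nu, nc=delta^2.
ks = stats.kstest(t_samples**2, lambda q: stats.ncf.cdf(q, 1, nu, delta**2))
print(ks.statistic, ks.pvalue)
```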

Computation and Approximations

Numerical Methods

The cumulative distribution function (CDF) of the noncentral t-distribution can be computed using recursive algorithms based on recurrence relations that express the CDF as a finite sum or continued product, avoiding direct evaluation of complex integrals for improved numerical stability. One widely adopted method, Lenth's Algorithm AS 243, employs forward and backward recurrence relations to calculate the CDF by iterating over terms involving ratios of gamma functions and Poisson probabilities, which is particularly efficient for moderate degrees of freedom and noncentrality parameters. This approach, implemented in various statistical libraries, reduces computational overflow risks compared to naive series expansions. The probability density function (PDF) is typically evaluated through numerical quadrature of its integral representation, which integrates the joint density of a normal and a chi-squared random variable over the appropriate region. Gauss-Laguerre or adaptive quadrature methods are suitable for this, as the integrand exhibits exponential decay for large arguments, ensuring convergence with a moderate number of evaluation points. Such techniques are preferred when direct series summation is unstable, particularly for non-integer degrees of freedom.

Software implementations provide robust, optimized routines for evaluating the PDF, CDF, and quantiles of the noncentral t-distribution. In R, the functions dt(x, df, ncp) for the PDF, pt(x, df, ncp) for the CDF, and qt(p, df, ncp) for quantiles are available in the base stats package, using C implementations of recurrence-based methods accurate to near machine precision. Python's SciPy library offers analogous functions in scipy.stats.nct, including pdf, cdf, ppf for quantiles, and rvs for random variates, built on similar numerical algorithms with support for vectorized computation. MATLAB's Statistics and Machine Learning Toolbox includes nctpdf, nctcdf, and nctinv, which employ mixture representations for efficient evaluation across a wide parameter range.

Random variates from the noncentral t-distribution are generated primarily through its stochastic representation: draw Z from a normal distribution with mean equal to the noncentrality parameter \mu and unit variance, independently draw a chi-squared variate V with \nu degrees of freedom, and compute T = Z / \sqrt{V / \nu}. This direct method is exact in distribution and efficient, requiring only standard generators for normal and chi-squared variates. Acceptance-rejection sampling, using proposals from the central t-distribution scaled to bound the noncentral PDF, serves as an alternative, though it is less common given the reliability of the representation-based approach. Numerical challenges arise particularly with large noncentrality parameters \mu or small degrees of freedom \nu, where the PDF and CDF can exhibit extreme skewness and near-degeneracy, leading to overflow in intermediate computations or loss of precision in tail probabilities. For instance, large \mu shifts the mass far from zero, amplifying conditioning errors in recurrence relations, while small \nu increases variability in the denominator, exacerbating instability in quadrature. These issues motivate parameter transformations, such as scaling by \mu, and specialized algorithms like saddlepoint approximations for extreme tails to maintain accuracy.
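
A brief usage sketch of the SciPy routines named above, together with the representation-based generator (parameter values and the seed are illustrative):

```python
import numpy as np
from scipy import stats

nu, mu = 12.0, 2.5
print(stats.nct.pdf(1.0, nu, mu))        # density at t = 1
print(stats.nct.cdf(3.0, nu, mu))        # P(T <= 3)
print(stats.nct.ppf(0.95, nu, mu))       # 95th percentile

# Representation-based variates: a normal shifted by mu over sqrt(chi^2_nu / nu).
rng = np.random.default_rng(42)
direct = (rng.standard_normal(10_000) + mu) / np.sqrt(rng.chisquare(nu, 10_000) / nu)
print(direct.mean(), stats.nct.mean(nu, mu))   # sample mean vs. exact mean
```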

Asymptotic Approximations

For large degrees of freedom \nu, the noncentral t-distribution with noncentrality parameter \mu can be approximated by a normal distribution N(\mu, 1). This arises from the stochastic representation T = (Z + \mu) / \sqrt{\chi^2_\nu / \nu}, where Z \sim N(0,1) is independent of the \chi^2_\nu random variable; as \nu \to \infty, the denominator converges in probability to 1 by the law of large numbers. A second-order refinement using the delta method on the denominator, combined with corrections for the central component, yields an approximate variance of 1 + \frac{2 + \frac{1}{2} \mu^2}{\nu}, accounting for the variability in \sqrt{\chi^2_\nu / \nu} \approx 1 + \frac{1}{2\nu} from the Taylor expansion around its mean of 1 and the higher moments of the chi-squared denominator. Higher-order corrections to this normal approximation are provided by Edgeworth expansions, which incorporate cumulants beyond the first two to improve accuracy in the central and tail regions of the cumulative distribution function (CDF). These expansions express the CDF as \Phi(z) + \phi(z) \sum_{k=1}^m p_k(z) \nu^{-k/2}, where \Phi and \phi are the standard normal CDF and PDF, z = (x - \mu)/\sqrt{1 + \mu^2/(2\nu)}, and the polynomials p_k depend on the cumulants of the standardized noncentral t. Such expansions are particularly useful for moderate \nu, where the plain normal approximation underperforms in the tails, and they have been implemented for computational efficiency in statistical software for noncentral distributions.

Recent developments include new asymptotic representations derived in 2023, generalizing classical approximations for the Student's t-distribution to the noncentral case. These use integral representations and Watson's lemma to obtain expansions in terms of elementary functions, the complementary error function, and the incomplete gamma function, valid for large \nu (bounded or scaled with \sqrt{\nu}) and large |\mu|. For instance, the CDF for large \nu with bounded \mu and x is approximated as F_\nu(x; \mu) \sim \frac{1}{2} \operatorname{erfc}\left( \frac{\mu - x}{\sqrt{2}} \right) + B(x; \mu) \sum_{k=1}^\infty c_k \nu^{-k}, where B(x; \mu) = x \sqrt{2/\pi} \exp\left( -\frac{1}{2}(\mu - x)^2 \right) and the coefficients c_k are explicit polynomials (e.g., c_1 = \frac{1}{8}(x\mu - x^2 - 1)). Numerical validation shows relative errors as low as 10^{-15} with six terms.

Saddlepoint approximations offer high accuracy for tail probabilities, essential in power analysis contexts where exact computation is challenging. For the noncentral t, these approximations to the PDF and CDF involve solving for the saddlepoint t_0 from the cumulant generating function and yield expressions like p_\nu(x; \mu) \sim \frac{1}{\sqrt{2\pi \hat{\sigma}^2}} \exp\left( -\frac{(x - \hat{\mu})^2}{2\hat{\sigma}^2} \right) with refined \hat{\mu}, \hat{\sigma}, achieving relative errors below 10^{-4} even for small \nu \geq 5 and moderate \mu. When the denominator noncentrality is zero, the doubly noncentral t saddlepoint reduces to the singly noncentral case. Overall, these approximations exhibit improving relative accuracy as \nu \to \infty or |\mu| grows large, with error bounds scaling as O(1/\nu) for the normal and Edgeworth types and exponentially small relative tail errors for saddlepoint methods; they provide analytical insight and computational speed over numerical integration for large-scale applications.
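
The sketch below (assuming SciPy; parameters are illustrative) compares the exact CDF with the plain N(\mu, 1) approximation and with a version that uses the refined variance 1 + (2 + \mu^2/2)/\nu quoted above, printing the maximum CDF gap for each:

```python
import numpy as np
from scipy import stats

nu, mu = 30.0, 2.0
x = np.linspace(mu - 4.0, mu + 4.0, 9)

exact = stats.nct.cdf(x, nu, mu)
plain = stats.norm.cdf(x, loc=mu, scale=1.0)
corrected = stats.norm.cdf(x, loc=mu, scale=np.sqrt(1.0 + (2.0 + 0.5 * mu**2) / nu))

print(np.max(np.abs(exact - plain)), np.max(np.abs(exact - corrected)))
```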

Applications

Power Analysis in Hypothesis Testing

The noncentral t-distribution plays a central role in power analysis for t-tests by describing the sampling distribution of the test statistic under the alternative hypothesis, allowing researchers to quantify the probability of correctly rejecting a false null hypothesis. In hypothesis testing, power is defined as 1 minus the type II error rate β, representing the test's sensitivity to detect a true effect of specified magnitude. For the one-sample t-test of H₀: μ = μ₀ versus Hₐ: μ ≠ μ₀, the test statistic follows a noncentral t-distribution with degrees of freedom ν = n - 1 and noncentrality parameter δ = √n (μ - μ₀)/σ when the true mean μ differs from μ₀ by an amount scaled by the standard deviation σ. For a two-sided test at significance level α, the power is given by \text{Power} = 1 - F_{\nu, \delta}(t_{1 - \alpha/2; \nu}) + F_{\nu, \delta}(-t_{1 - \alpha/2; \nu}), where F_{\nu, \delta} denotes the cumulative distribution function (CDF) of the noncentral t-distribution with parameters ν and δ, and t_{1 - \alpha/2; \nu} is the (1 - α/2) quantile of the central t-distribution with ν degrees of freedom. This expression accounts for rejection in either tail of the distribution, with the second term typically negligible when δ > 0 because of the positive shift in the noncentral t. The type II error β is then 1 minus this power value, obtained by integrating the noncentral t density over the non-rejection region under the alternative. Sample size determination inverts this framework to find the smallest n such that the power meets or exceeds a desired level (e.g., 0.8) for a given effect size, α, and σ (often assumed known or estimated). The effect size is commonly expressed as Cohen's d = (μ - μ₀)/σ, yielding δ = d √n, and a numerical solution (via software or iteration) is required since no closed-form expression for n exists. For instance, in a one-sample two-sided t-test with n = 30 (ν = 29), α = 0.05, and d = 0.5 (so δ ≈ 2.74), the power is approximately 0.75, as computed using the noncentral t CDF via the pt() function in R. The use of the noncentral t for power calculations in t-tests, including extensions to ANOVA and regression, became computationally feasible with the publication of efficient algorithms for the noncentral t CDF, notably Lenth's 1989 method, which underpins implementations in widely used statistical software.
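
The power formula and the sample-size search can be written compactly; the following sketch (assuming SciPy; the helper t_test_power is illustrative) reproduces the n = 30, d = 0.5 example and searches for the smallest n reaching 80% power:

```python
import numpy as np
from scipy import stats

def t_test_power(n, d, alpha=0.05):
    nu = n - 1
    delta = d * np.sqrt(n)                       # noncentrality for a one-sample test
    tcrit = stats.t.ppf(1.0 - alpha / 2.0, nu)   # central t critical value
    return 1.0 - stats.nct.cdf(tcrit, nu, delta) + stats.nct.cdf(-tcrit, nu, delta)

print(t_test_power(30, 0.5))                     # roughly 0.75 for d = 0.5, n = 30

n = 2
while t_test_power(n, 0.5) < 0.80:               # smallest n giving at least 80% power
    n += 1
print(n)
```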

Tolerance Intervals and Other Uses

The noncentral t-distribution plays a key role in constructing exact one-sided tolerance intervals for normally distributed populations, where the interval must contain at least a specified proportion of the population with a given confidence level. For a one-sided lower tolerance limit based on a sample of size n from a normal distribution with unknown mean and variance, the limit is given by
L = \bar{X} - t_{\gamma; \nu, \delta} \frac{s}{\sqrt{n}},
where \bar{X} is the sample mean, s is the sample standard deviation, \nu = n-1 is the degrees of freedom, and t_{\gamma; \nu, \delta} is the \gamma-quantile of the noncentral t-distribution with noncentrality parameter \delta, which accounts for the required coverage proportion; \gamma and \delta are determined so that the limit achieves the specified coverage and confidence level. This approach provides an exact solution by choosing the noncentrality parameter so that the interval covers at least the specified proportion \beta of the population with confidence 1 - \alpha, often via numerical methods or tables derived from the distribution's properties.
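One common exact construction takes \delta = z_p \sqrt{n}, where z_p is the standard normal quantile for the coverage proportion p, and sets \gamma equal to the confidence level. The sketch below (assuming SciPy; the data and parameter values are illustrative) computes the corresponding lower tolerance limit:

```python
import numpy as np
from scipy import stats

def lower_tolerance_limit(x, p=0.90, conf=0.95):
    n = len(x)
    delta = stats.norm.ppf(p) * np.sqrt(n)           # noncentrality for coverage proportion p
    k = stats.nct.ppf(conf, n - 1, delta) / np.sqrt(n)
    return np.mean(x) - k * np.std(x, ddof=1)        # L = xbar - k * s

rng = np.random.default_rng(3)
sample = rng.normal(loc=100.0, scale=5.0, size=25)   # illustrative data
print(lower_tolerance_limit(sample))                 # limit below ~90% of the population
                                                     # with 95% confidence
```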
In quality control, the noncentral t-distribution is used to establish confidence bounds on process capability indices such as C_{pk}, which measures how well a process meets specification limits relative to its variability. For example, lower confidence bounds for the one-sided capability indices C_L or C_U are derived from the probability statement P(T_{\nu, \delta} \leq k \sqrt{n}) = \gamma, where k relates to the capability estimate, ensuring the bound holds with confidence \gamma. The distribution also underlies variables sampling plans, such as those in MIL-STD-414, for acceptance sampling from normal distributions, where acceptance criteria of the form \bar{X} - k s \geq L incorporate noncentral t quantiles to control producer and consumer risks (e.g., \alpha = 0.05, \beta = 0.10).

In recent applications, the noncentral t-distribution arises in sequential testing frameworks for t-statistics, enabling anytime-valid p-values through supermartingale-based methods that allow continuous monitoring without inflating error rates. For instance, sequential one-sided t-tests under normality assumptions use the distribution of the t-statistic under alternatives, which follows a noncentral t, to construct adaptive stopping rules that preserve validity across peeking times. Evidential analysis offers an alternative to traditional p-value-based hypothesis testing in linear models by quantifying evidence for or against hypotheses, directly incorporating the noncentral t-distribution to model test statistics under alternative hypotheses. In this framework, the posterior distribution of the noncentrality parameter updates evidential measures, such as the probability that the effect size exceeds zero, providing a continuous scale of evidence that avoids dichotomous decisions. Bayesian approaches to testing partial correlations have leveraged Bayes factors formulated to bypass direct reliance on noncentral t assumptions, instead using test statistics like the t-statistic for partial correlations while specifying priors on effect sizes to compute evidence ratios. These Bayes factor functions evaluate support for partial correlation hypotheses by integrating over alternative priors, offering robustness against distributional sensitivities inherent in noncentral t-based classical tests.

In multivariate modeling, the noncentral skew t (NCST) distribution extends the univariate noncentral t to accommodate skewed and heavy-tailed data, constructed by scaling a multivariate skew-normal vector with an inverse gamma-distributed scalar for tail flexibility. This distribution is particularly useful for applications such as modeling tumor data, where it captures skewness and heavy tails in high-dimensional data through a noncentrality parameter that shifts location while preserving interpretability.

References

  1. [1]
    Noncentral T Distribution - Boost
    The distribution of tν(δ) = X/S is called a noncentral t distribution with degrees of freedom ν and noncentrality parameter δ.
  2. [2]
    [PDF] Applications of the Noncentral t–Distribution
    The noncentral t–distribution is intimately tied to statistical inference procedures for samples from normal populations. For simple random samples from a ...
  3. [3]
    Noncentral t Distribution - MATLAB & Simulink - MathWorks
    The noncentral t distribution is a more general case of Student's t distribution, used to calculate the power of the t test.
  4. [4]
    Noncentral t Distribution: Introduction - Rust - Wiley Online Library
    Sep 29, 2014 · The noncentral distribution provides the power function for a t test, and other applications concern tolerance intervals and confidence ...
  5. [5]
    NoncentralStudentTDistribution - Wolfram Language Documentation
    NoncentralStudentTDistribution[ν,δ] represents a continuous statistical distribution defined and supported over the set of real numbers and parametrized by a ...
  6. [6]
    [PDF] Applications of the Noncentral t–Distributionc O
    May 7, 2008 · The noncentral t–distribution is intimately tied to statistical inference procedures for samples from normal populations.
  7. [7]
    nctpdf - Noncentral t probability density function - MATLAB
    Johnson, N. L., and S. Kotz. Distributions in Statistics: Continuous Univariate Distributions-2. Hoboken, NJ: John Wiley & Sons, Inc., 1970, pp. 201–219.
  8. [8]
    New asymptotic representations of the noncentral $t$-distribution
    Jun 11, 2023 · Using new integral representations, we give new asymptotic expansions for large values of the noncentrality parameter but also for large values ...
  9. [9]
    Noncentral T Distribution - Boost
    The noncentral T distribution is a generalization of the Student's t-distribution, defined as t_ν(δ) = X/S, where X has a normal distribution with mean δ and ...
  10. [10]
    [PDF] Computing discrete mixtures of continuous distributions: noncentral ...
    The main purpose of this article is to provide easy reference to computational algorithms for computing the distribution functions of the noncentral t, ...
  11. [11]
    nctstat - Noncentral t mean and variance - MATLAB - MathWorks
    This MATLAB function returns the mean of and variance for the noncentral t pdf with NU degrees of freedom and noncentrality parameter DELTA.
  12. [12]
  13. [13]
    [PDF] on the unimodality and the bell-shape of noncentral distributions
    For all three distributions the strict unimodality is obtained as a consequence of a sharper result saying that each of the corresponding densities is bell-shaped ...
  14. [14]
    Student's t Distribution - MATLAB & Simulink - MathWorks
    If x is a random sample of size n from a normal distribution with mean μ, then the statistic t = (x̄ − μ)/(s/√n), where x̄ is the sample mean and s is the ...
  15. [15]
    [PDF] Lecture 13: Noncentral χ
    The distribution of T = X/√(U/n) is the noncentral t-distribution with degrees of freedom n and noncentrality parameter δ. The t-distribution previously defined ...
  16. [16]
    [PDF] Non-Central Multivariate & Distributions - University of Cape Town
    Apr 7, 2023 · The distribution of the upper non-central matrix T and the non-central inverted matrix T are given in integral form.
  17. [17]
  18. [18]
    Algorithm AS 243 - jstor
    A new algorithm for computing cumulative probabilities of the non-central t distribution is given. It is a useful alternative to algorithm AS 5 (Cooper ...
  19. [19]
    Cumulative Distribution Function of the Noncentral T Distribution
    Russell V. Lenth; Cumulative Distribution Function of the Noncentral T Distribution, Journal of the Royal Statistical Society Series C: Applied Statistics,
  20. [20]
    [PDF] arXiv:1306.5294v2 [stat.CO] 4 Sep 2013
    Sep 4, 2013 · In this paper we suggest an alternative approach for computing the cumulative distribution function (CDF) of the noncentral t-distribution which ...
  21. [21]
    The Student t Distribution - R
    The general non-central t with parameters (\nu, \delta) = (df, ncp) is defined as the distribution of T_{\nu}(\delta) := (U + \delta)/\sqrt{V/\nu} ...
  22. [22]
    A Note on Computing Extreme Tail Probabilities of the Noncentral T ...
    Jun 22, 2013 · In this paper we suggest an alternative approach for computing the cumulative distribution function (CDF) of the noncentral t-distribution.
  23. [23]
    Saddlepoint approximations for the doubly noncentral t distribution
    Mar 1, 2007 · Random variable T is said to follow a doubly noncentral (Student's) t distribution with n degrees of freedom, numerator noncentrality parameter ...
  24. [24]
    Sample Size and Power Calculations using the Noncentral t ...
    Expected power using the normal approximation was calculated by the sampsi command, and expected power using the noncentral t-distribution was calculated ...
  25. [25]
    Statistical Power of t tests | Real Statistics Using Excel
    Describes how to use the noncentral t distribution to compute the power of t tests. Examples and Excel add-in software are provided.
  26. [26]
    Summary of Power Calculation Examples for One-Sample t-Test Using Noncentral t
  27. [27]
    7.2.6.3. Tolerance intervals for a normal distribution
    The disadvantage of the non-central \(t\) method is that it depends on the inverse cumulative distribution function for the non-central \(t\) distribution. This ...
  28. [28]
    Parametric Empirical Bayes Tolerance Intervals: Technometrics
    Mar 23, 2012 · The degrees of freedom and noncentrality parameters are obtained by matching the first two moments of the PEB conditional t distribution to ...
  29. [29]
    An Alternative to Hypothesis Testing in Normal Linear Models
    Abstract page for arXiv paper 2408.11672: Evidential Analysis: An Alternative to Hypothesis Testing in Normal Linear Models.
  30. [30]
    Bayes factor functions for testing partial correlation coefficients - arXiv
    In this article, we present Bayes Factor Functions (BFFs) for assessing the presence of partial correlation. BFFs represent Bayes factors ...
  31. [31]
    Flexible Modeling of Multivariate Skewed and Heavy-Tailed Data via ...
    Jul 14, 2025 · We propose a flexible formulation of the multivariate non-central skew t (NCST) distribution, defined by scaling skew-normal random vectors with independent ...