Cochran's theorem

Cochran's theorem is a key result in statistics that specifies the conditions under which a set of quadratic forms in a multivariate normal random vector are independent and each follows a chi-squared distribution. Formulated by William Gemmell Cochran in 1934, the theorem addresses the decomposition of the sum of squares of independent standard normal random variables into orthogonal quadratic forms. Specifically, if X_1, \dots, X_n are i.i.d. N(0,1) and \sum_{i=1}^n X_i^2 = \sum_{j=1}^k Q_j, where each Q_j = X^T A_j X is a quadratic form with symmetric matrix A_j of rank r_j and \sum_{j=1}^k r_j = n, then the Q_j are independent and Q_j \sim \chi^2(r_j). This extends to non-central cases and general covariance structures, with non-centrality parameters determined by the mean vector.

The theorem's proof relies on the properties of orthogonal transformations and the invariance of the normal distribution under linear changes, often demonstrated via spectral decomposition or simultaneous diagonalization of the matrices involved. It builds on earlier work on quadratic forms but provides a clean criterion for independence via rank additivity.

In practice, Cochran's theorem underpins many inferential procedures in linear models and analysis of variance (ANOVA). For instance, it justifies the independence between the sample mean and sample variance in normal samples, as the total sum of squares decomposes into mean-related and variance-related components with additive degrees of freedom. In one-way ANOVA, it ensures that the between-group and within-group sums of squares are independent chi-squared random variables (scaled by the error variance), enabling F-tests for equality of means. Similarly, in multiple linear regression, the theorem supports the distribution of the regression sum of squares and error sum of squares, forming the basis for testing overall model significance and lack of fit. Cochran originally applied it to the analysis of covariance, highlighting its role in partitioning variability under covariate adjustments. Extensions of the theorem appear in modern contexts, such as elliptically contoured distributions and high-dimensional data, but the classical version remains central to hypothesis testing assuming normality.

Overview and Statement

Historical Background

The formal statement of Cochran's theorem emerged from the work of William G. Cochran, a Scottish statistician who developed it during his graduate studies at Cambridge University and his subsequent position at the Rothamsted Experimental Station. In 1934, Cochran published the theorem in his paper titled "The distribution of quadratic forms in a normal system, with applications to the analysis of covariance" in the Proceedings of the Cambridge Philosophical Society. This publication provided a mathematical foundation for the independence and chi-squared distributions of certain quadratic forms under normality assumptions, addressing gaps in earlier statistical methodologies.

Preceding ideas on quadratic forms in normal variables trace back to William Sealy Gosset, publishing under the pseudonym "Student," who in 1923 explored variance partitioning in experimental designs assuming normal errors. In his paper "On testing varieties of cereals," Gosset applied concepts akin to quadratic forms to analyze experimental errors and variety differences in agricultural trials, laying groundwork for later distributional results.

Cochran's motivation stemmed from the need to rigorously justify the use of chi-squared distributions in variance analysis, particularly for extending analysis of variance techniques to include covariates. His theorem facilitated this by establishing conditions under which sums of squares behave independently as chi-squared variables, enhancing the reliability of tests in experimental settings. Initial applications of these ideas appeared in early 20th-century work on analysis of variance and experimental design, notably at Rothamsted under R. A. Fisher's influence, where Cochran contributed to practical statistical methods for agricultural research. These developments marked a shift toward more robust inference in designed experiments, influencing the evolution of modern statistics.

Formal Statement

Cochran's theorem provides a fundamental result on the distribution and independence of quadratic forms in multivariate normal random variables. Let U = (U_1, \dots, U_N)^T be an N-dimensional random vector whose components U_i are independent and identically distributed standard normal random variables, i.e., U_i \sim N(0, 1) for each i = 1, \dots, N. Consider quadratic forms Q_i = U^T B^{(i)} U for i = 1, \dots, k, where each B^{(i)} is an N \times N symmetric matrix. Suppose these quadratic forms satisfy \sum_{i=1}^k Q_i = U^T U, which implies \sum_{i=1}^k B^{(i)} = I_N, the N \times N identity matrix, and let r_i = \rank(B^{(i)}) be such that \sum_{i=1}^k r_i = N. Under the conditions that each B^{(i)} is idempotent, i.e., B^{(i)} B^{(i)} = B^{(i)}, and the matrices are pairwise orthogonal, i.e., B^{(i)} B^{(j)} = 0 for all i \neq j, the theorem states that each Q_i follows a chi-squared distribution with r_i degrees of freedom, Q_i \sim \chi^2_{r_i}, and the Q_i are mutually independent. In fact, given \sum_{i=1}^k B^{(i)} = I_N, the rank condition \sum_{i=1}^k r_i = N already implies both idempotence and pairwise orthogonality, so rank additivity alone suffices.
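A quick numerical check of the statement can be instructive. The following simulation is a minimal sketch (not from the source): it splits the columns of a random orthogonal matrix into two blocks to build projections B_1 and B_2 summing to the identity, then verifies that the resulting quadratic forms are uncorrelated and match the moments of the claimed chi-squared distributions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
N, r1 = 5, 2

# Split the columns of a random orthogonal matrix into two blocks,
# giving projections B1 (rank r1) and B2 (rank N - r1) with B1 + B2 = I_N.
P, _ = np.linalg.qr(rng.standard_normal((N, N)))
B1 = P[:, :r1] @ P[:, :r1].T
B2 = P[:, r1:] @ P[:, r1:].T
assert np.allclose(B1 @ B1, B1) and np.allclose(B1 @ B2, 0)
assert np.allclose(B1 + B2, np.eye(N))

# Simulate Q1 = U^T B1 U and Q2 = U^T B2 U for U ~ N(0, I_N).
U = rng.standard_normal((100_000, N))
Q1 = np.einsum('ij,jk,ik->i', U, B1, U)
Q2 = np.einsum('ij,jk,ik->i', U, B2, U)

print(np.corrcoef(Q1, Q2)[0, 1])                    # ~0, consistent with independence
print(Q1.mean(), Q1.var())                          # ~r1 and ~2*r1: chi^2_{r1} moments
print(stats.kstest(Q1, 'chi2', args=(r1,)).pvalue)  # p-value not small: fits chi^2_{r1}
```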

Formulations and Variants

Original Formulation

Cochran's original formulation of the theorem addresses the distribution of quadratic forms constructed from standard normal random variables. Consider Z_1, \dots, Z_n as independent random variables each distributed as N(0,1). The theorem states that if quadratic forms Q_1, \dots, Q_k satisfy \sum_{i=1}^k Q_i = \sum_{j=1}^n Z_j^2, where each Q_i can be expressed as the sum of r_i squares of linear combinations of the Z_j's, these linear combinations are mutually orthogonal across the different Q_i's, and \sum_{i=1}^k r_i = n, then the Q_i are mutually independent, and each Q_i follows a chi-squared distribution with r_i degrees of freedom. This decomposition ensures that the total sum of squares, which itself follows a chi-squared distribution with n degrees of freedom, is partitioned into independent components whose degrees of freedom add up to n. The orthogonality condition on the linear combinations guarantees the independence of the resulting squared terms, allowing each Q_i to inherit the chi-squared distribution directly from the properties of independent standard normals.
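As a concrete worked example (a standard textbook illustration, not from Cochran's paper): take n = 2 and set Q_1 = \left((Z_1 + Z_2)/\sqrt{2}\right)^2 and Q_2 = \left((Z_1 - Z_2)/\sqrt{2}\right)^2. Expanding the squares gives Q_1 + Q_2 = Z_1^2 + Z_2^2, the coefficient vectors (1, 1)/\sqrt{2} and (1, -1)/\sqrt{2} are orthogonal, and r_1 = r_2 = 1 with r_1 + r_2 = 2. The theorem then yields Q_1 and Q_2 independent with Q_1, Q_2 \sim \chi^2_1, which can be verified directly because (Z_1 + Z_2)/\sqrt{2} and (Z_1 - Z_2)/\sqrt{2} are themselves independent N(0, 1) variables.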

Matrix Formulation

The matrix formulation of Cochran's theorem extends the result to quadratic forms defined via symmetric idempotent matrices, providing a framework particularly useful for analyzing linear statistical models and multivariate normal distributions. Consider an n-dimensional random vector Y following a multivariate normal distribution Y \sim N_n(0, \sigma^2 I_n), where \sigma^2 > 0 is a scalar variance parameter and I_n is the n \times n identity matrix. Let A_1, \dots, A_k be symmetric n \times n matrices satisfying \sum_{i=1}^k A_i = I_n. Each matrix A_i is idempotent, meaning A_i^2 = A_i, and has rank r_i, with the ranks satisfying \sum_{i=1}^k r_i = n. Additionally, the matrices are pairwise orthogonal in the sense that A_i A_j = 0 for all i \neq j. Under these conditions, the quadratic forms Y^T A_i Y / \sigma^2 are independently distributed as chi-squared random variables with degrees of freedom equal to the respective ranks: Y^T A_i Y / \sigma^2 \sim \chi^2_{r_i} for each i = 1, \dots, k. This formulation leverages the properties of idempotent matrices, where the rank r_i equals the trace of A_i, to decompose the total sum of squares Y^T Y / \sigma^2 \sim \chi^2_n into independent components, facilitating hypothesis testing in settings like analysis of variance and linear regression. The orthogonality condition ensures the independence of the quadratic forms, as it implies that the corresponding projection subspaces are mutually orthogonal. This matrix-based approach generalizes the original vector formulation to handle complex linear hypotheses in higher dimensions.
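The hypotheses of this formulation are straightforward to verify numerically. The helper below is an illustrative sketch (the function name check_cochran_conditions is our own, not standard terminology); it checks symmetry, idempotence, pairwise orthogonality, summation to the identity, and rank additivity for a list of matrices, using the fact that the rank of an idempotent matrix equals its trace.

```python
import numpy as np

def check_cochran_conditions(mats, tol=1e-10):
    """Check the hypotheses of the matrix form of Cochran's theorem."""
    n = mats[0].shape[0]
    ok = np.allclose(sum(mats), np.eye(n), atol=tol)       # sum to I_n
    ranks = []
    for i, A in enumerate(mats):
        ok &= np.allclose(A, A.T, atol=tol)                # symmetric
        ok &= np.allclose(A @ A, A, atol=tol)              # idempotent
        ranks.append(round(np.trace(A)))                   # rank = trace
        for B in mats[i + 1:]:
            ok &= np.allclose(A @ B, 0, atol=tol)          # pairwise orthogonal
    return ok and sum(ranks) == n, ranks

# Example: the centering decomposition used for the sample mean and variance.
n = 6
J = np.ones((n, n)) / n
print(check_cochran_conditions([np.eye(n) - J, J]))  # (True, [5, 1])
```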

Examples and Applications

Sample Mean and Sample Variance

Consider a random sample X_1, X_2, \dots, X_n drawn independently and identically from a normal distribution N(\mu, \sigma^2). Cochran's theorem provides a framework to decompose the total sum of squared deviations from the mean into components associated with the sample mean and sample variance, establishing their respective chi-squared distributions and mutual independence. The total sum of squared deviations is given by \sum_{i=1}^n (X_i - \mu)^2, which can be partitioned as \sum_{i=1}^n (X_i - \bar{X})^2 + n (\bar{X} - \mu)^2, where \bar{X} = n^{-1} \sum_{i=1}^n X_i is the sample mean. Normalizing by the variance yields the quadratic forms Q_1 = \sigma^{-2} \sum_{i=1}^n (X_i - \bar{X})^2 and Q_2 = n \sigma^{-2} (\bar{X} - \mu)^2. Under the assumptions of normality and independence, Q_1 \sim \chi^2_{n-1} and Q_2 \sim \chi^2_1. This decomposition aligns with the conditions of Cochran's theorem, as the sum Q_1 + Q_2 = \sigma^{-2} \sum_{i=1}^n (X_i - \mu)^2 \sim \chi^2_n, and the ranks of the associated quadratic forms add to n. The theorem guarantees that Q_1 and Q_2 are independent.

In matrix notation, let \mathbf{X} = (X_1, \dots, X_n)^T and \mathbf{1} be the n \times 1 vector of ones. The projection matrices are A_1 = I_n - n^{-1} \mathbf{1} \mathbf{1}^T (idempotent, rank n-1) and A_2 = n^{-1} \mathbf{1} \mathbf{1}^T (idempotent, rank 1), satisfying A_1 + A_2 = I_n and A_1 A_2 = 0. For \mu = 0, the forms become Q_1 = \mathbf{X}^T A_1 \mathbf{X} / \sigma^2 and Q_2 = \mathbf{X}^T A_2 \mathbf{X} / \sigma^2, with the orthogonality ensuring independence. The general case follows by centering \mathbf{X} - \mu \mathbf{1}, since A_1 \mathbf{1} = 0.

The sample variance is s^2 = (n-1)^{-1} \sum_{i=1}^n (X_i - \bar{X})^2 = \sigma^2 (n-1)^{-1} Q_1. Because A_1 and A_2 project onto orthogonal subspaces, the residual vector A_1 \mathbf{X}, which determines s^2, is independent of \bar{X} = n^{-1} \mathbf{1}^T \mathbf{X}; it follows that \bar{X} and s^2 are independent random variables. This result is pivotal for inference procedures, such as constructing confidence intervals for \mu that do not depend on the unknown \sigma^2.
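A short simulation makes the decomposition tangible. This is a minimal sketch (not from the source) that draws many normal samples, forms Q_1 and Q_2 as defined above, and checks the moments of the two chi-squared components and the near-zero correlation between \bar{X} and s^2.

```python
import numpy as np

rng = np.random.default_rng(1)
n, mu, sigma = 10, 2.0, 3.0

X = rng.normal(mu, sigma, size=(200_000, n))
xbar = X.mean(axis=1)
s2 = X.var(axis=1, ddof=1)

# Q1 = (n-1) s^2 / sigma^2 ~ chi^2_{n-1};  Q2 = n (xbar - mu)^2 / sigma^2 ~ chi^2_1
Q1 = (n - 1) * s2 / sigma**2
Q2 = n * (xbar - mu) ** 2 / sigma**2

print(np.corrcoef(xbar, s2)[0, 1])   # ~0 under normality
print(Q1.mean(), Q1.var())           # ~ n-1 and ~ 2(n-1)
print(Q2.mean(), Q2.var())           # ~ 1 and ~ 2
```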

Analysis of Variance

Cochran's theorem plays a central role in the analysis of variance (ANOVA) by justifying the decomposition of the total sum of squares into independent components whose distributions are known under normality assumptions. In the one-way ANOVA setup with k groups and n total observations, the data are modeled as y_{ij} = \mu_i + \epsilon_{ij} for i = 1, \dots, k and j = 1, \dots, n_i, where \epsilon_{ij} \sim N(0, \sigma^2) independently, and the null hypothesis asserts equality of the group means, \mu_1 = \dots = \mu_k. The total sum of squares SST = \sum_{i=1}^k \sum_{j=1}^{n_i} (y_{ij} - \bar{y}_{..})^2 is partitioned into the between-group sum of squares SSA = \sum_{i=1}^k n_i (\bar{y}_{i.} - \bar{y}_{..})^2 and the within-group sum of squares SSE = \sum_{i=1}^k \sum_{j=1}^{n_i} (y_{ij} - \bar{y}_{i.})^2, such that SST = SSA + SSE.

This partition corresponds to orthogonal projections in the vector space of observations. The between-group component SSA arises from the projection onto the subspace spanned by the group indicator vectors, which has rank k-1 after centering to account for the overall mean constraint. The within-group component SSE is the projection onto the orthogonal complement, the space of deviations within groups, with rank n - k. These projection matrices A_{SSA} and A_{SSE} satisfy A_{SSA} + A_{SSE} = I_n - n^{-1} \mathbf{1} \mathbf{1}^T (the centering matrix, so that adding the rank-one projection onto the overall mean recovers the identity) and A_{SSA} A_{SSE} = 0, ensuring orthogonality.

Under the normality assumption, Cochran's theorem implies that SSA / \sigma^2 \sim \chi^2_{k-1} and SSE / \sigma^2 \sim \chi^2_{n-k} when the null hypothesis holds (i.e., equal means), and these two quadratic forms are independent due to the orthogonality of the projections. The theorem guarantees this because the ranks of the projections (including the rank-one mean projection) sum to the dimension n, and the projections are idempotent and mutually orthogonal. This distributional result directly yields the F-statistic for testing the null hypothesis: F = \frac{SSA / (k-1)}{SSE / (n-k)} \sim F_{k-1, n-k}, where the numerator estimates \sigma^2 plus a non-centrality term under the alternative, but reduces to the central F under the null. The independence ensures the validity of this ratio as an F-distributed statistic, enabling p-value computation and confidence intervals for variance components in experimental designs.
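The following sketch (not from the source; it uses numpy and scipy) computes SSA, SSE, and the F-statistic by hand for simulated data with equal group means and confirms agreement with scipy.stats.f_oneway.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
groups = [rng.normal(0.0, 1.0, size=m) for m in (8, 10, 12)]  # equal means (null true)

k = len(groups)
n = sum(len(g) for g in groups)
grand = np.concatenate(groups).mean()

SSA = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)   # between, k-1 df
SSE = sum(((g - g.mean()) ** 2).sum() for g in groups)        # within, n-k df

F = (SSA / (k - 1)) / (SSE / (n - k))
p = stats.f.sf(F, k - 1, n - k)
print(F, p)
print(stats.f_oneway(*groups))  # agrees with the manual computation
```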

Variance Estimation in Regression

In the linear regression model Y = X\beta + \epsilon, where Y is the n \times 1 response vector, X is the n \times p full-rank design matrix (including an intercept column), \beta is the p \times 1 parameter vector, and \epsilon \sim N(0, \sigma^2 I_n), Cochran's theorem establishes the distributions of the sums of squares central to inference. The projection (hat) matrix is P = X(X^T X)^{-1} X^T, which is idempotent with rank p. The residual vector is e = (I - P)Y, and the error sum of squares is SSE = Y^T (I - P) Y = e^T e. Since I - P is idempotent with rank n - p, Cochran's theorem implies that SSE / \sigma^2 \sim \chi^2_{n-p}, a result holding for any true \beta due to the normality of the errors. This justifies the unbiased estimator of the error variance, \hat{\sigma}^2 = SSE / (n - p). The scaling by the degrees of freedom n - p ensures E[\hat{\sigma}^2] = \sigma^2, as the mean of a chi-squared distribution with \nu degrees of freedom is \nu. Moreover, (n - p) \hat{\sigma}^2 / \sigma^2 \sim \chi^2_{n-p}, providing the basis for confidence intervals and hypothesis tests involving \sigma^2. This result is pivotal in regression diagnostics and inference, such as standard error calculations for \hat{\beta}.

Under the null hypothesis that the slope parameters are zero (i.e., the model reduces to an intercept-only model), the regression sum of squares SSR = Y^T \left( P - n^{-1} \mathbf{1} \mathbf{1}^T \right) Y follows \sigma^2 \chi^2_{p-1}, independent of SSE. Cochran's theorem guarantees this independence because P - n^{-1} \mathbf{1} \mathbf{1}^T and I - P are orthogonal idempotents that sum to the centering matrix I - n^{-1} \mathbf{1} \mathbf{1}^T, partitioning the centered quadratic form Y^T \left( I - n^{-1} \mathbf{1} \mathbf{1}^T \right) Y. This setup enables the overall F-test for the significance of the regression, where the test statistic F = \frac{SSR / (p-1)}{SSE / (n-p)} follows an F-distribution with p-1 and n - p degrees of freedom under the null, testing whether the full model significantly improves fit over the intercept-only model.
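The decomposition is easy to reproduce numerically. The sketch below (an illustration under the stated model, not from the source) simulates data satisfying the null, builds P and the centering projection, computes SSE and SSR, forms the overall F-statistic, and verifies that the two idempotents are orthogonal.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, p = 50, 3                      # intercept + 2 slopes
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta = np.array([1.0, 0.0, 0.0])  # null true: all slopes zero
Y = X @ beta + rng.standard_normal(n)

P = X @ np.linalg.solve(X.T @ X, X.T)   # hat matrix, rank p
J = np.ones((n, n)) / n                 # projection onto the intercept
SSE = Y @ (np.eye(n) - P) @ Y
SSR = Y @ (P - J) @ Y

F = (SSR / (p - 1)) / (SSE / (n - p))
print(F, stats.f.sf(F, p - 1, n - p))             # central F under the null
print(np.allclose((P - J) @ (np.eye(n) - P), 0))  # orthogonal idempotents
```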

Proof and Mathematical Foundations

Key Prerequisites

Understanding Cochran's theorem requires familiarity with several fundamental concepts from probability theory and linear algebra, particularly those involving chi-squared distributions and quadratic forms. A quadratic form is defined as Q = X^T A X, where X is a random vector following a multivariate normal distribution X \sim N(0, \Sigma), and A is a symmetric matrix. When \Sigma = I (the identity matrix) and A is idempotent, Q follows a central chi-squared distribution with degrees of freedom equal to the rank of A. In the more general case with non-zero mean \mu, the distribution becomes non-central chi-squared, with the non-centrality parameter determined by \mu^T A \mu.

Idempotence of a matrix A means that A^2 = A. For a symmetric idempotent matrix, the eigenvalues are either 0 or 1, and the trace of A, which is the sum of its eigenvalues, equals the rank of A. This property is crucial because the rank determines the degrees of freedom in the associated chi-squared distribution. If two symmetric matrices A and B are orthogonal in the sense that A B = 0, then the quadratic forms X^T A X and X^T B X are independent when X follows a standard multivariate normal distribution X \sim N(0, I). The chi-squared distribution with k degrees of freedom has characteristic function \phi(t) = (1 - 2 i t)^{-k/2}. This function is instrumental in deriving the distributions of sums of independent chi-squared random variables.
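The eigenvalue and trace facts are easy to confirm numerically; the sketch below (our own illustration) builds a random rank-r orthogonal projection and checks that its eigenvalues are 0 or 1, that its trace equals its rank, and that it is idempotent.

```python
import numpy as np

rng = np.random.default_rng(4)
n, r = 7, 3

# Random rank-r orthogonal projection: symmetric and idempotent by construction.
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
A = Q @ Q.T

eigvals = np.linalg.eigvalsh(A)
print(np.round(eigvals, 10))                  # only 0s and 1s
print(np.trace(A), np.linalg.matrix_rank(A))  # trace == rank == r
print(np.allclose(A @ A, A))                  # idempotent
```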

Detailed Proof

To prove Cochran's theorem, consider a random vector \mathbf{X} \sim N(\mathbf{0}, \sigma^2 I_n) and symmetric idempotent matrices A_1, \dots, A_k of ranks r_1, \dots, r_k such that \sum_{i=1}^k A_i = I_n and \sum_{i=1}^k r_i = n, with quadratic forms Q_i = \mathbf{X}' A_i \mathbf{X} for i = 1, \dots, k. The goal is to show that each Q_i / \sigma^2 \sim \chi^2_{r_i}, that the Q_i are mutually independent, and that \sum_{i=1}^k Q_i / \sigma^2 \sim \chi^2_n.

First, establish the marginal distribution of each Q_i / \sigma^2. Since A_i is symmetric and idempotent (A_i^2 = A_i), it admits an eigendecomposition: there exists an orthogonal matrix P_i such that P_i^T A_i P_i = \diag(I_{r_i}, 0_{n - r_i}). Let \mathbf{Y}_i = P_i^T \mathbf{X}; then \mathbf{Y}_i \sim N(\mathbf{0}, \sigma^2 I_n) because orthogonal transformations preserve the multivariate normal distribution with identity covariance (up to scaling by \sigma^2). Thus, \frac{Q_i}{\sigma^2} = \mathbf{Y}_i^T \diag(I_{r_i}, 0_{n - r_i}) \mathbf{Y}_i / \sigma^2 = \sum_{j=1}^{r_i} (Y_{i,j} / \sigma)^2, where each Y_{i,j} / \sigma \sim N(0, 1) independently. The sum of r_i independent squared standard normals follows a \chi^2_{r_i} distribution.

Next, prove the independence of the Q_i. The conditions imply that the A_i are mutually orthogonal projections: A_i A_j = 0 for i \neq j, as idempotence and the sum-to-identity condition ensure the column spaces are mutually orthogonal subspaces that together span \mathbb{R}^n. Consequently, there exists a single orthogonal matrix P that simultaneously diagonalizes all the A_i, partitioning the basis into orthogonal eigenspaces corresponding to each A_i: P^T A_i P = \diag(0, \dots, I_{r_i}, \dots, 0), where the identity blocks occupy disjoint sets of coordinates whose sizes r_1, \dots, r_k sum to n. Let \mathbf{Y} = P^T \mathbf{X} \sim N(\mathbf{0}, \sigma^2 I_n); the components of \mathbf{Y} are independent. Then, \frac{Q_i}{\sigma^2} = \sum_{j \in J_i} (Y_j / \sigma)^2, where J_i is the index set of size r_i for the i-th block, and the J_i are disjoint. Since the sums involve disjoint sets of independent normals, the Q_i / \sigma^2 are mutually independent. Moreover, \sum Q_i / \sigma^2 = \sum_{j=1}^n (Y_j / \sigma)^2 \sim \chi^2_n, confirming the additivity.
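The simultaneous diagonalization step can be made concrete numerically. The sketch below (our own construction, not from the source) builds mutually orthogonal projections A_i summing to I_n, assembles a single orthogonal P from orthonormal bases of their column spaces (the eigenvectors with eigenvalue 1), and verifies that P diagonalizes every A_i with the expected block structure.

```python
import numpy as np

rng = np.random.default_rng(5)
n, ranks = 6, (3, 2, 1)

# Build mutually orthogonal projections A_i summing to I_n.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
idx_blocks = np.split(np.arange(n), np.cumsum(ranks)[:-1])
As = [Q[:, idx] @ Q[:, idx].T for idx in idx_blocks]
assert np.allclose(sum(As), np.eye(n))

# One orthogonal P diagonalizes every A_i: for each A_i, take the
# eigenvectors with eigenvalue 1 (its column space) as a block of columns.
cols = []
for A in As:
    w, V = np.linalg.eigh(A)
    cols.append(V[:, w > 0.5])
P = np.hstack(cols)

print(np.allclose(P.T @ P, np.eye(n)))  # P is orthogonal
for i, (A, r) in enumerate(zip(As, ranks)):
    D = P.T @ A @ P
    print(i, np.allclose(D, np.diag(np.diag(D))), round(np.trace(D)))  # diagonal, trace r_i
```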

Extensions and Generalizations

To Non-Normal Distributions

Cochran's theorem extends beyond the multivariate normal distribution to the broader class of elliptically contoured distributions, which encompass distributions with elliptical symmetry such as the multivariate t-distribution and multivariate Cauchy distribution. Under these distributions, if a random vector \mathbf{x} follows an elliptically contoured law with generator function \phi and the associated matrices A and B for quadratic forms \mathbf{x}^\top A \mathbf{x} and \mathbf{x}^\top B \mathbf{x} satisfy orthogonality (AB = 0), idempotence (A^2 = A, B^2 = B), and appropriate ranks k and l, then the quadratic forms are independent, with marginal distributions following a generalized form G_2(k, n-k; \phi) and joint distribution G_3(k, l, n-k-l; \phi), respectively. For the multivariate normal case, \phi(t) = \exp(-t/2), this reduces to the classical chi-squared and F distributions. The key condition enabling this preservation is the elliptical (or spherical, after standardization) symmetry of the distribution, which maintains chi-squared-like properties for idempotent quadratic forms, allowing the degrees of freedom to add up appropriately even under non-normal tails. This generalization, developed by Anderson and Fang, highlights how the theorem's core structure—decomposition into independent components—holds without strict normality, provided the symmetry is elliptical.

However, for distributions lacking this symmetry, such as skewed distributions, the independence of orthogonal quadratic forms fails to hold in general. A prominent counterexample involves the sample mean \bar{X} = n^{-1} \sum X_i and sample variance S^2 = (n-1)^{-1} \sum (X_i - \bar{X})^2, whose associated projection matrices are orthogonal (AB = 0), yet which exhibit non-zero correlation under skewed non-normal distributions like the chi-squared or lognormal, violating the independence required by the theorem. This limitation underscores that elliptical symmetry is essential for the theorem's distributional results. These extensions find applications in robust statistics, where normality assumptions are relaxed to accommodate outliers or heavier tails; for instance, using multivariate t-distributions in analysis of variance allows inference under elliptically contoured errors, improving reliability in contaminated datasets.
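The counterexample is easy to observe in simulation. The sketch below (our own illustration) compares the correlation between the sample mean and sample variance across a normal and two skewed distributions; only the normal case shows the near-zero correlation the theorem's independence result would predict.

```python
import numpy as np

rng = np.random.default_rng(6)
n, reps = 10, 200_000

for name, sampler in [("normal", rng.standard_normal),
                      ("lognormal", lambda size: rng.lognormal(size=size)),
                      ("chi2(3)", lambda size: rng.chisquare(3, size=size))]:
    X = sampler(size=(reps, n))
    r = np.corrcoef(X.mean(axis=1), X.var(axis=1, ddof=1))[0, 1]
    print(f"{name:10s} corr(mean, var) = {r:+.3f}")
# Only the normal case gives a correlation near zero; the skewed distributions
# show clear dependence despite the identical orthogonal projection structure.
```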

Recent Revisions and Modern Applications

In the 2000s, matrix formulations of Cochran's theorem were revisited to unify idempotence conditions under more general algebraic settings. For instance, a 2005 study established a further algebraic version that extends the classical conditions for the distribution and independence of quadratic forms using partial orderings, providing characterizations for partitioned sums of squares in multivariate normal vectors. This work contributed to subsequent matrix-based developments in linear models. A 2014 revisit by Khattree and coauthors further refined several versions of the theorem, offering new equalities and inequalities relevant to estimability and inference in statistical models.

Modern applications of the theorem appear in contexts involving orthogonal projections for variance decomposition, such as principal component analysis (PCA) and variable selection for regression under normality assumptions. In PCA, quadratic forms associated with principal components follow chi-squared distributions, aiding variance decomposition for dimensionality reduction. Similarly, in variable selection for regression, the independence properties support testing procedures that separate signal and noise variances. These applications highlight the theorem's utility in algorithms where orthogonal decompositions maintain computational tractability, though high-dimensional challenges often require approximations.

Computationally, the theorem supports simulations of independent chi-squared random variables for hypothesis testing in linear models through orthogonal transformations that yield bases for the projection spaces, aligning with the degrees-of-freedom partitioning. This approach is particularly valuable in simulation studies for assessing test size and power.

Critiques of the theorem in high-dimensional settings center on its reliance on full-rank projections and multivariate normality, which break down when the dimension p greatly exceeds the sample size n, leading to dependent quadratic forms or non-chi-squared distributions. In such regimes, the theorem's exact results may overstate precision, prompting alternatives like the bootstrap for robust variance estimation and inference in high-dimensional regression. Bootstrap methods resample residuals to approximate the sampling distribution empirically, offering consistent performance without strict normality assumptions and proving reliable even when p/n approaches a non-zero constant.
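As a minimal sketch of the residual-bootstrap alternative mentioned above (our own illustration, assuming a low-dimensional linear model for clarity), the code below refits the regression on resampled residuals to obtain empirical standard errors for the coefficient estimates without invoking the chi-squared distribution theory.

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 40, 5
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
Y = X @ rng.standard_normal(p) + rng.standard_normal(n)

beta_hat = np.linalg.lstsq(X, Y, rcond=None)[0]
fitted, resid = X @ beta_hat, Y - X @ beta_hat

# Residual bootstrap: refit on Y* = fitted + resampled residuals.
B = 2000
boot = np.empty((B, p))
for b in range(B):
    Y_star = fitted + rng.choice(resid, size=n, replace=True)
    boot[b] = np.linalg.lstsq(X, Y_star, rcond=None)[0]

print(boot.std(axis=0))  # empirical SEs of beta_hat, no chi-squared result needed
```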
