
Cross-correlation

Cross-correlation is a fundamental concept in statistics and signal processing that measures the degree of similarity between two signals or time series as a function of the temporal displacement, or lag, between them. For two continuous-time signals f(t) and g(t), the cross-correlation is mathematically defined as (f \star g)(\tau) = \int_{-\infty}^{\infty} f^*(t) g(t + \tau) \, dt, where ^* denotes the complex conjugate and \tau represents the lag; this formulation generalizes to discrete-time signals as \sum_n f^*[n] g[n + l] for lags l. Unlike autocorrelation, which assesses a signal's similarity to itself, cross-correlation evaluates relationships between distinct signals, providing insights into their statistical dependencies without assuming commutativity. In spectral analysis, the sample cross-correlation serves as an estimator for the correlation structure of stationary random processes, often computed via the fast Fourier transform (FFT) for efficiency, yielding the cross-spectral density estimate \hat{R}_{xy}(\omega_k) = \frac{\overline{X(\omega_k)} Y(\omega_k)}{N}. It is essential for tasks such as detecting periodicities, aligning signals, and filtering noise, with applications extending to fields like radar and sonar, where it helps identify echoes or delays. In statistics, particularly time series analysis, the sample cross-correlation function (CCF) quantifies lagged linear associations between variables, aiding in model identification for processes like autoregressive moving averages (ARMA). Cross-correlation also plays a key role in neuroscience for characterizing interactions between neural spike trains, such as computing spike-triggered averages to infer stimulus-response relationships, and in physics for analyzing correlations in experimental data. Its origins trace back to early 20th-century statistical theory, with significant advancements by Norbert Wiener in the 1940s through his work on stochastic processes and filtering, which formalized its use in signal processing. Modern implementations, such as MATLAB's xcorr function, facilitate its computation for practical research across disciplines.
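
The discrete form is available in common numerical environments. The following is a minimal sketch in Python (NumPy assumed, values illustrative), analogous in spirit to MATLAB's xcorr; note that np.correlate conjugates and lags its second argument, so the operands are swapped relative to the definition above.

```python
import numpy as np

# Minimal sketch of the discrete cross-correlation
# (f * g)[n] = sum_m conj(f[m]) g[m + n].
# np.correlate(a, v)[k] = sum_n a[n + k] * conj(v[n]), so we pass (g, f)
# to match the definition above.
f = np.array([1.0, 2.0, 3.0])            # illustrative signals
g = np.array([0.0, 1.0, 0.5])

r = np.correlate(g, f, mode="full")      # (f * g)[n] over all overlapping lags
lags = np.arange(-(len(f) - 1), len(g))  # lag axis for 'full' mode
for lag, value in zip(lags, r):
    print(f"lag {lag:+d}: {value:.2f}")
```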

Deterministic Signals

Definition

Cross-correlation measures the similarity between two deterministic signals as a function of the displacement of one relative to the other. For continuous-time complex-valued signals f(t) and g(t), the cross-correlation function is defined mathematically as (f \star g)(\tau) = \int_{-\infty}^{\infty} f^*(t) \, g(t + \tau) \, dt, where f^*(t) denotes the complex conjugate of f(t). For real-valued signals the conjugate has no effect, so this formulation generalizes the real-valued case. The corresponding definition for discrete-time signals f and g is (f \star g)[n] = \sum_{m=-\infty}^{\infty} f^*[m] \, g[m + n]. These expressions compute the integral or sum over all time, yielding the degree of overlap or alignment between the signals at lag \tau or n. Intuitively, the cross-correlation arises from the inner product in the space of square-integrable signals, where one signal is shifted relative to the other; the magnitude indicates similarity, with the lag corresponding to the shift that maximizes alignment. Cross-correlation originated in statistics in the early 20th century as a tool for time series analysis. As a special case, when f = g, the cross-correlation reduces to the autocorrelation function.
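
As a concrete illustration, here is a sketch in Python (NumPy assumed) that evaluates the discrete definition by direct summation over all lags at which two finite sequences overlap; the helper name cross_correlation is hypothetical.

```python
import numpy as np

def cross_correlation(f, g):
    """Evaluate (f * g)[n] = sum_m conj(f[m]) g[m + n] by direct summation,
    over every lag n at which the finite sequences overlap."""
    f, g = np.asarray(f), np.asarray(g)
    lags = np.arange(-(len(f) - 1), len(g))
    out = np.zeros(len(lags), dtype=np.result_type(f, g, float))
    for i, n in enumerate(lags):
        for m in range(len(f)):
            if 0 <= m + n < len(g):
                out[i] += np.conj(f[m]) * g[m + n]
    return lags, out

# Matches np.correlate(g, f, mode="full"), which computes the same sum.
lags, r = cross_correlation([1, 2, 3], [0, 1, 0.5])
print(dict(zip(lags.tolist(), np.round(r, 2).tolist())))
```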

Properties

The cross-correlation operation for deterministic signals is linear in each argument. Specifically, for real scalar constants a and b, and signals f_1, f_2, and g, it holds that (a f_1 + b f_2) \star g = a (f_1 \star g) + b (f_2 \star g), and analogously for the second argument: f \star (a g_1 + b g_2) = a (f \star g_1) + b (f \star g_2). This follows from the linearity of the underlying integral or sum defining the cross-correlation; for complex constants, the conjugation makes the operation conjugate-linear in the first argument.

For complex-valued signals, the cross-correlation satisfies conjugation symmetry: (f \star g)(\tau) = [(g \star f)(-\tau)]^*, where ^* denotes the complex conjugate. This property arises from the definition of cross-correlation as an inner product-like measure and ensures consistency in measuring similarity between complex signals.

The cross-correlation is closely related to the Fourier transform. If \mathcal{F}\{f\} and \mathcal{F}\{g\} denote the Fourier transforms of f and g, respectively, then the Fourier transform of the cross-correlation is given by \mathcal{F}\{f \star g\} = \mathcal{F}\{f\}^* \cdot \mathcal{F}\{g\}, where \cdot indicates pointwise multiplication. Consequently, the cross-correlation itself can be computed as the inverse Fourier transform: f \star g = \mathcal{F}^{-1}\{\mathcal{F}\{f\}^* \cdot \mathcal{F}\{g\}\}. This frequency-domain representation facilitates efficient computation via the fast Fourier transform algorithm.

A shift theorem applies to the cross-correlation. If one signal is a time-shifted version of the other, say g(t) = f(t - T), then (f \star g)(\tau) = (f \star f)(\tau - T), so the cross-correlation achieves its maximum value at \tau = T, with the shape of the autocorrelation shifted accordingly to reflect the delay. In general, time shifts in the input signals translate directly to shifts in the output cross-correlation function. For identical signals, f = g, the cross-correlation (f \star f)(\tau) reaches its maximum at \tau = 0, where the value equals the energy of the signal, \int |f(t)|^2 \, dt, quantifying the maximum overlap between the signal and itself. This peak at zero lag serves as a measure of the signal's total energy content.
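
The frequency-domain identity can be checked numerically. The following sketch (Python with NumPy; signal lengths and seed are illustrative) zero-pads before transforming so that circular correlation coincides with linear correlation, then compares against NumPy's direct implementation.

```python
import numpy as np

# Sketch of f * g = IFFT( conj(FFT(f)) . FFT(g) ), zero-padded to length
# len(f) + len(g) - 1 so circular correlation equals linear correlation.
rng = np.random.default_rng(0)
f = rng.standard_normal(64)
g = rng.standard_normal(64)

n = len(f) + len(g) - 1
spec = np.conj(np.fft.fft(f, n)) * np.fft.fft(g, n)
r = np.fft.ifft(spec).real                 # real signals -> real correlation

# Reorder FFT output so lags run from -(len(f)-1) to len(g)-1,
# then verify against the direct computation.
r = np.concatenate([r[-(len(f) - 1):], r[:len(g)]])
assert np.allclose(r, np.correlate(g, f, mode="full"))
print(r[:5])
```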

Random Variables and Vectors

Scalar Random Variables

The cross-correlation between two scalar random variables X and Y (possibly complex-valued) is defined as \rho_{XY} = E[XY^*], where E[\cdot] denotes the statistical expectation and ^* the complex conjugate. This quantity is the expected value of the product XY^*, serving as a fundamental measure of their joint statistical behavior in probability theory and signal processing. In contrast to cross-covariance, which centers the variables by subtracting their means and is given by E[(X - \mu_X)(Y - \mu_Y)^*] with \mu_X = E[X] and \mu_Y = E[Y], the cross-correlation \rho_{XY} incorporates these means directly, yielding \rho_{XY} = E[(X - \mu_X)(Y - \mu_Y)^*] + \mu_X \mu_Y^*. This inclusion makes cross-correlation sensitive to both linear relationships and location shifts in the distributions.

For jointly Gaussian random variables, \rho_{XY} can be computed explicitly via the joint probability density function (PDF). The bivariate Gaussian joint PDF is f_{XY}(x,y) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \gamma^2}} \exp\left( -\frac{1}{2(1 - \gamma^2)} \left[ \frac{(x - \mu_X)^2}{\sigma_X^2} + \frac{(y - \mu_Y)^2}{\sigma_Y^2} - \frac{2\gamma (x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} \right] \right), where \sigma_X^2 = \mathrm{Var}(X), \sigma_Y^2 = \mathrm{Var}(Y), and \gamma is the correlation coefficient. Integrating gives \rho_{XY} = \iint_{-\infty}^{\infty} xy \, f_{XY}(x,y) \, dx \, dy = \mu_X \mu_Y + \gamma \sigma_X \sigma_Y.

The normalized cross-correlation |E[XY^*]| / \sqrt{E[|X|^2] E[|Y|^2]} satisfies |E[XY^*]| / \sqrt{E[|X|^2] E[|Y|^2]} \leq 1 by the Cauchy-Schwarz inequality, with equality holding when X and Y are linearly dependent. This bound quantifies the maximum possible linear association. Cross-correlation thus measures linear dependence, offering insights into relationships that extend beyond tests for statistical independence, as uncorrelated variables (zero cross-correlation after centering) may still exhibit nonlinear dependencies.
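
The closed-form Gaussian result can be verified by simulation. The sketch below (Python with NumPy; all parameter values are illustrative) draws from a bivariate Gaussian and compares the sample mean of XY with \mu_X \mu_Y + \gamma \sigma_X \sigma_Y.

```python
import numpy as np

# Monte Carlo check of E[XY] = mu_X*mu_Y + gamma*sigma_X*sigma_Y
# for jointly Gaussian X, Y (real case, so no conjugate needed).
rng = np.random.default_rng(1)
mu = np.array([1.0, -2.0])                 # mu_X, mu_Y (illustrative)
sigma_x, sigma_y, gamma = 0.5, 2.0, 0.7
cov = np.array([[sigma_x**2, gamma * sigma_x * sigma_y],
                [gamma * sigma_x * sigma_y, sigma_y**2]])

x, y = rng.multivariate_normal(mu, cov, size=1_000_000).T
print(np.mean(x * y))                              # sample E[XY]
print(mu[0] * mu[1] + gamma * sigma_x * sigma_y)   # theoretical value
```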

Random Vectors

The cross-correlation matrix between two random column vectors \mathbf{X} and \mathbf{Y} is defined as \mathbf{R}_{XY} = E[\mathbf{X} \mathbf{Y}^H], where E[\cdot] denotes the expectation operator and ^H indicates the Hermitian (conjugate) transpose. This matrix encapsulates the second-order statistical relationships across all components of the vectors, enabling multivariate analysis of dependencies in fields such as array signal processing and communications.

For the auto-correlation matrix \mathbf{R}_{XX}, unique properties arise from its vector structure: it is Hermitian (\mathbf{R}_{XX}^H = \mathbf{R}_{XX}) and positive semi-definite, satisfying \mathbf{a}^H \mathbf{R}_{XX} \mathbf{a} \geq 0 for any complex vector \mathbf{a}, with equality only if \mathbf{a} lies in the null space of \mathbf{R}_{XX}. The trace of \mathbf{R}_{XX} gives the expected squared Euclidean norm of the vector, \operatorname{tr}(\mathbf{R}_{XX}) = E[\|\mathbf{X}\|^2], which quantifies the total second-moment energy. The diagonal elements of \mathbf{R}_{XX} are the auto-correlations E[|X_i|^2] of the individual components, while the off-diagonal elements E[X_i X_j^*] (for i \neq j) quantify the cross-dependencies between components.

In the bivariate case, where \mathbf{X} = \begin{pmatrix} X_1 \\ X_2 \end{pmatrix} and \mathbf{Y} = \begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix}, the cross-correlation matrix reduces to scalar blocks: its (i,j)-th element is E[X_i Y_j^*], linking directly to scalar cross-correlations. For example, consider jointly distributed random variables X and Y, uniform with joint density f_{XY}(x,y) = 2 on the triangular region 0 < x < y < 1, forming a bivariate vector \mathbf{Z} = \begin{pmatrix} X \\ Y \end{pmatrix}; the means are \mu_X = 1/3 and \mu_Y = 2/3, and the auto-correlation matrix is \mathbf{R}_{ZZ} = \begin{pmatrix} 1/6 & 1/4 \\ 1/4 & 1/2 \end{pmatrix}.

The relation to the cross-covariance matrix \mathbf{C}_{XY} = E[(\mathbf{X} - \boldsymbol{\mu}_X)(\mathbf{Y} - \boldsymbol{\mu}_Y)^H] follows by expanding the centered terms: \mathbf{R}_{XY} = \mathbf{C}_{XY} + \boldsymbol{\mu}_X \boldsymbol{\mu}_Y^H, where \boldsymbol{\mu}_X = E[\mathbf{X}] and \boldsymbol{\mu}_Y = E[\mathbf{Y}].
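
The triangular-density example can be checked by simulation. The sketch below (Python with NumPy) exploits the fact that the minimum and maximum of two independent uniforms on (0, 1) have exactly the joint density 2 on 0 < x < y < 1.

```python
import numpy as np

# Sketch: estimate R_ZZ = E[Z Z^T] for Z = (X, Y) uniform (density 2)
# on the triangle 0 < x < y < 1; (min, max) of two iid U(0,1) draws
# has exactly this density.
rng = np.random.default_rng(2)
u = rng.random((1_000_000, 2))
x, y = u.min(axis=1), u.max(axis=1)

z = np.stack([x, y])              # 2 x N matrix of samples
r_zz = z @ z.T / z.shape[1]       # sample E[Z Z^T]
print(np.round(r_zz, 3))          # approx [[1/6, 1/4], [1/4, 1/2]]
```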

Complex Case

For complex-valued random vectors \mathbf{X} and \mathbf{Y}, the cross-correlation matrix is defined as \mathbf{R}_{XY} = E[\mathbf{X} \mathbf{Y}^H], where ^H denotes the Hermitian (conjugate) transpose, adjusting the standard real-valued form to account for the complex nature of the vectors. This expectation captures the second-order statistical relationship between the vectors, assuming zero mean for simplicity; for non-zero means, the centered version E[(\mathbf{X} - E[\mathbf{X}]) (\mathbf{Y} - E[\mathbf{Y}])^H] yields the cross-covariance matrix.

A fundamental property of this cross-correlation matrix is Hermitian symmetry, given by \mathbf{R}_{XY}^H = \mathbf{R}_{YX}, which follows directly from the definition and ensures that the diagonal elements of \mathbf{R}_{XX} (the auto-correlations) are real-valued, representing power. This symmetry distinguishes the complex case from real-valued vectors, where simple transposition suffices, and guarantees that \mathbf{R}_{XX} is Hermitian positive semi-definite, meaning all eigenvalues are non-negative, a crucial condition for valid covariance structures in probabilistic models.

In the context of circularly symmetric complex Gaussian random vectors, the cross-correlation simplifies notably, as the pseudo-covariance E[\mathbf{X} \mathbf{Y}^T] (without conjugation) vanishes, implying no correlation between a vector and its conjugate. For instance, for a zero-mean circularly symmetric complex Gaussian vector \mathbf{X} \sim \mathcal{CN}(\mathbf{0}, \mathbf{Q}), the covariance \mathbf{Q} = E[\mathbf{X} \mathbf{X}^H] is Hermitian with a block structure in its real-imaginary representation that enforces rotational invariance, making \mathbf{R}_{XY} depend solely on the Hermitian form.

This formulation finds essential use in communications engineering, particularly for signal detection in complex baseband models of wireless channels, where cross-correlations between transmitted and received vectors model fading effects under additive Gaussian noise. In multiple-input multiple-output (MIMO) systems, for example, the channel matrix \mathbf{H} with circularly symmetric complex Gaussian entries relies on correlations of the form E[\mathbf{h} \mathbf{h}^H] to derive detection metrics such as maximum likelihood estimators.
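
The vanishing pseudo-covariance is easy to observe numerically. The sketch below (Python with NumPy; \mathbf{Q} taken as the identity purely for illustration) generates a circularly symmetric complex Gaussian vector and estimates both E[\mathbf{X}\mathbf{X}^H] and E[\mathbf{X}\mathbf{X}^T].

```python
import numpy as np

# Sketch: a zero-mean circularly symmetric complex Gaussian vector has a
# Hermitian covariance E[X X^H] but vanishing pseudo-covariance E[X X^T].
# Independent real and imaginary parts of equal variance give circularity.
rng = np.random.default_rng(3)
n, N = 3, 200_000
x = (rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))) / np.sqrt(2)

cov = x @ x.conj().T / N        # E[X X^H]: approx the identity here
pseudo = x @ x.T / N            # E[X X^T]: approx the zero matrix
print(np.round(cov, 2))
print(np.abs(pseudo).max())     # near 0
```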

Stochastic Processes

Cross-Correlation Function

The cross-correlation function for two stochastic processes X(t) and Y(t) is defined as R_{XY}(t_1, t_2) = \mathbb{E}[X(t_1) Y(t_2)^*], where \mathbb{E}[\cdot] denotes the expectation operator and ^* indicates the complex conjugate, applicable to both real- and complex-valued processes. This formulation captures the expected value of the product of the processes at arbitrary times t_1 and t_2, without assuming stationarity, and serves as a measure of their linear dependence across different time points. For jointly wide-sense stationary processes, where the means are constant and the correlation depends only on the time difference, the cross-correlation function simplifies to R_{XY}(\tau) = \mathbb{E}[X(t) Y(t + \tau)^*], with \tau = t_2 - t_1. At zero lag (\tau = 0), R_{XY}(0) reduces to the instantaneous correlation between X(t) and Y(t), providing insight into their contemporaneous linear relationship. This concept was formalized in the context of optimal filtering by Norbert Wiener during the 1930s and 1940s, particularly in his development of methods for signal prediction and noise reduction using correlation-based techniques.

For example, consider two dependent Poisson processes A(t) and B(t) with intensity functions \lambda_A and \lambda_B, where events in A trigger events in B after a delay; their cross-correlation R_{AB}(\tau) can be computed as \lambda_A \lambda_B + \lambda_A h(\tau), where h(\tau) models the triggering response for \tau > 0, illustrating delayed dependence in point processes. In autoregressive moving average (ARMA) models, the cross-correlation between a white-noise input X(n) with variance \sigma_X^2 and the output Y(n) of an AR(1) filter follows the recurrence R_{XY}[m] = \sigma_X^2 \delta[m] + a R_{XY}[m-1], where a is the AR coefficient and \delta[m] is the Kronecker delta, demonstrating how past correlations propagate through the recursion.
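
The AR(1) recurrence solves to R_{XY}[m] = \sigma_X^2 a^m for m \geq 0 (and 0 for m < 0, by causality), which the following sketch verifies by simulation (Python with NumPy and SciPy; the coefficient a = 0.8 and \sigma_X = 1 are illustrative).

```python
import numpy as np
from scipy.signal import lfilter

# Sketch: white noise x[n] drives the AR(1) filter y[n] = a*y[n-1] + x[n];
# the recurrence R_XY[m] = sigma^2*delta[m] + a*R_XY[m-1] gives
# R_XY[m] = sigma^2 * a**m for m >= 0.
rng = np.random.default_rng(4)
a, sigma, N = 0.8, 1.0, 1_000_000
x = sigma * rng.standard_normal(N)
y = lfilter([1.0], [1.0, -a], x)           # implements the AR(1) recursion

for m in range(5):
    r_hat = np.mean(x[: N - m] * y[m:])    # sample E[x[n] y[n+m]]
    print(m, round(float(r_hat), 3), round(sigma**2 * a**m, 3))
```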

Cross-Covariance Function

The cross-covariance function for two stochastic processes X(t) and Y(t) is defined as C_{XY}(t_1, t_2) = \mathbb{E}\left[ \left( X(t_1) - \mu_X(t_1) \right) \left( Y(t_2) - \mu_Y(t_2) \right)^* \right], where \mu_X(t_1) = \mathbb{E}[X(t_1)], \mu_Y(t_2) = \mathbb{E}[Y(t_2)], and ^* denotes the complex conjugate, applicable to complex-valued processes. This measure quantifies the centered linear dependence between the processes at times t_1 and t_2, removing the influence of their means to focus on deviations. For real-valued processes, the conjugate is omitted.

For jointly wide-sense stationary processes, where means are constant and the covariance depends only on the time difference \tau = t_2 - t_1, the cross-covariance simplifies to C_{XY}(\tau) = R_{XY}(\tau) - \mu_X \mu_Y^*, with R_{XY}(\tau) denoting the cross-correlation function. In the special case where X = Y, the cross-covariance reduces to the auto-covariance, whose value along the diagonal t_1 = t_2 gives the variance, measuring the process's own variability. The cross-covariance vanishes if the processes are uncorrelated at the respective time points, meaning \mathbb{E}[(X(t_1) - \mu_X(t_1))(Y(t_2) - \mu_Y(t_2))^*] = 0. For example, in white noise processes, the covariance is zero for all \tau \neq 0, reflecting their uncorrelated nature across non-simultaneous times. This property is exploited in whitening filters, where covariance estimates help decorrelate multivariate signals by inverting the covariance structure.
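
A sample version of the cross-covariance is straightforward to compute. The sketch below (Python with NumPy; the helper name cross_cov is hypothetical) centers each sequence and averages lagged products, showing near-zero values at every lag for two independent white noise sequences.

```python
import numpy as np

# Sketch: sample cross-covariance at integer lag tau,
# C_XY(tau) ~ mean of centered products x[n] and y[n + tau].
def cross_cov(x, y, tau):
    x, y = np.asarray(x, float), np.asarray(y, float)
    xs = x[: len(x) - tau] if tau >= 0 else x[-tau:]
    ys = y[tau:] if tau >= 0 else y[: len(y) + tau]
    return np.mean((xs - x.mean()) * (ys - y.mean()))

rng = np.random.default_rng(5)
x, y = rng.standard_normal(100_000), rng.standard_normal(100_000)
# Independent white noise: near zero at every lag.
print([round(cross_cov(x, y, t), 4) for t in (-2, -1, 0, 1, 2)])
```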

Wide-Sense Stationary Processes

Wide-sense stationary (WSS) processes represent a class of stochastic processes whose statistical properties relevant to second-order analysis remain invariant under time shifts. Specifically, a stochastic process X(t) is wide-sense stationary if its mean \mu_X(t) = E[X(t)] is constant for all t, and its autocorrelation function R_{XX}(t_1, t_2) = E[X(t_1) X(t_2)^*] depends solely on the time difference \tau = t_2 - t_1, denoted R_{XX}(\tau). This stationarity condition extends to pairs of processes: two processes X(t) and Y(t) are jointly wide-sense stationary if each is individually WSS and their cross-correlation function R_{XY}(t_1, t_2) = E[X(t_1) Y(t_2)^*] also depends only on \tau = t_2 - t_1. Under these conditions, the cross-correlation simplifies to a function of the lag \tau alone, facilitating analysis in applications such as signal detection and filtering.

For jointly WSS processes X(t) and Y(t), the cross-correlation function is defined as R_{XY}(\tau) = E\left[X(t) Y(t + \tau)^*\right], where the expectation holds for all t, and the asterisk denotes the complex conjugate to accommodate complex-valued processes common in communications and signal processing. This formulation captures the average similarity between X(t) and the time-shifted, conjugated version of Y(t). The cross-covariance function, which removes the influence of the means, is then given by C_{XY}(\tau) = R_{XY}(\tau) - \mu_X \mu_Y^*, where \mu_X = E[X(t)] and \mu_Y = E[Y(t)] are the constant means. For zero-mean processes (\mu_X = \mu_Y = 0), the cross-correlation and cross-covariance coincide.

The cross-correlation function for WSS processes exhibits several key properties derived from inner-product and inequality arguments. By the Cauchy-Schwarz inequality applied to the inner product defined by the expectation, \left| R_{XY}(\tau) \right| \leq \sqrt{R_{XX}(0) R_{YY}(0)} for all \tau, where R_{XX}(0) and R_{YY}(0) are the average powers of X(t) and Y(t), respectively; equality holds when X(t) and Y(t) are linearly related. This bound ensures that the cross-correlation cannot exceed the geometric mean of the individual powers, providing a measure of normalized dependence.

A representative example involves sinusoidal signals embedded in noise, illustrating the lag-dependent nature of cross-correlation under WSS assumptions. Consider two jointly WSS processes X(t) = A \cos(\omega t + \phi) and Y(t) = A \cos(\omega t + \phi + \theta), where \phi is a random phase uniformly distributed over [0, 2\pi) and \theta is a fixed phase offset, yielding zero means and joint WSS properties; the cross-correlation is R_{XY}(\tau) = \frac{A^2}{2} \cos(\omega \tau + \theta), which for matched phases (\theta = 0) simplifies to \frac{A^2}{2} \cos(\omega \tau). Adding independent zero-mean white noise to each process preserves the WSS nature and leaves the ensemble cross-correlation unchanged, since the noise terms are uncorrelated with the signals and with each other; sample estimates become noisier, but the sinusoidal \cos(\omega \tau) component remains identifiable, highlighting phase alignment via lag.
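
The sinusoidal result can be checked with a time-average estimate over one long realization. The sketch below (Python with NumPy; amplitude, frequency, phase offset, and sampling rate are all illustrative) compares sample averages against \frac{A^2}{2} \cos(\omega \tau + \theta).

```python
import numpy as np

# Sketch: time-average estimate of R_XY(tau) for X(t) = A cos(w t + phi),
# Y(t) = A cos(w t + phi + theta), compared with (A^2/2) cos(w tau + theta).
rng = np.random.default_rng(6)
A, w, theta = 2.0, 2 * np.pi * 5.0, np.pi / 4
fs, N = 1000.0, 200_000
t = np.arange(N) / fs
phi = rng.uniform(0, 2 * np.pi)            # single random phase realization

x = A * np.cos(w * t + phi)
y = A * np.cos(w * t + phi + theta)

for k in range(0, 60, 15):                 # lags in samples
    tau = k / fs
    r_hat = np.mean(x[: N - k] * y[k:])    # time average of X(t) Y(t + tau)
    print(round(tau, 3), round(float(r_hat), 3),
          round(A**2 / 2 * np.cos(w * tau + theta), 3))
```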

Normalization and Applications

Normalization Techniques

Normalization techniques for cross-correlation functions address the issue of scale dependency in raw cross-correlation measures, enabling fair comparisons between signals or processes with differing amplitudes or energies. These methods produce scale-invariant metrics, often bounded within [-1, 1], where values near 1 indicate strong similarity, -1 indicates strong anti-similarity, and 0 suggests no linear relationship.

One common approach is the normalized cross-correlation (NCC), particularly suited for wide-sense stationary stochastic processes assuming zero mean. It is defined as \rho_{XY}(\tau) = \frac{R_{XY}(\tau)}{\sqrt{R_{XX}(0) R_{YY}(0)}}, where R_{XY}(\tau) is the cross-correlation function, and R_{XX}(0) and R_{YY}(0) are the autocorrelations at zero lag, corresponding to the powers of the processes X and Y. This normalization ensures |\rho_{XY}(\tau)| \leq 1, a bound derived from the Cauchy-Schwarz inequality applied to the inner product in the Hilbert space of the processes. The NCC can be interpreted as the cosine similarity between the processes at lag \tau, measuring the angle between their shifted versions in L^2 space. For a perfect match at some lag, the NCC peaks at 1.

For scenarios involving non-zero means or deterministic signals with potential offsets, such as in image processing or template matching, the zero-normalized cross-correlation (ZNCC) is preferred for its robustness to constant biases. The ZNCC is given by \frac{R_{XY}(\tau) - \mu_X \mu_Y}{\sigma_X \sigma_Y}, where \mu_X and \mu_Y are the means of the respective signals or processes, and \sigma_X and \sigma_Y are their standard deviations. This form subtracts the mean contributions before normalizing, making it invariant to DC offsets and overall brightness changes while preserving the bounded range [-1, 1]. In practice, ZNCC is widely applied in template matching, where it achieves a peak of 1 for identical templates despite variations in brightness.
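
The contrast between the two normalizations shows up clearly on a scaled-and-offset copy of a signal. The sketch below (Python with NumPy; the helper names ncc and zncc are hypothetical, and the lag handling assumes \tau \geq 0 for brevity) demonstrates that the ZNCC, unlike the NCC, is unaffected by a DC offset.

```python
import numpy as np

# Sketch of the two normalizations at a single lag tau >= 0; both are
# bounded by 1 in magnitude (Cauchy-Schwarz).
def ncc(x, y, tau):
    xs, ys = x[: len(x) - tau], y[tau:]
    return np.sum(xs * ys) / np.sqrt(np.sum(x**2) * np.sum(y**2))

def zncc(x, y, tau):
    xs, ys = x[: len(x) - tau], y[tau:]
    xs, ys = xs - xs.mean(), ys - ys.mean()
    return np.sum(xs * ys) / np.sqrt(np.sum(xs**2) * np.sum(ys**2))

rng = np.random.default_rng(7)
x = rng.standard_normal(10_000)
y = 3.0 * x + 5.0                   # scaled copy with a DC offset
print(round(float(ncc(x, y, 0)), 3))    # degraded by the offset (~0.51)
print(round(float(zncc(x, y, 0)), 3))   # 1.0: invariant to offset and scale
```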

Time Delay Estimation

Time delay estimation (TDE) is a fundamental application of cross-correlation in signal processing, where the goal is to determine the temporal shift between two related signals, such as a transmitted signal and its delayed replica received after propagation through a medium. The standard method computes the cross-correlation function R_{XY}(\tau) between the two signals x(t) and y(t), and the delay estimate \hat{\tau} is obtained as the lag that maximizes its absolute value: \hat{\tau} = \arg\max_{\tau} |R_{XY}(\tau)|. This approach leverages the property that the cross-correlation peaks at the true delay under ideal conditions, assuming the signals are linearly related by a time shift. For enhanced robustness, particularly when signal energies vary, the normalized cross-correlation (NCC) can be used instead, where the estimate is \hat{\tau} = \arg\max_{\tau} \text{NCC}(\tau), with \text{NCC}(\tau) = \frac{R_{XY}(\tau)}{\sqrt{R_{XX}(0) R_{YY}(0)}}.

In noisy environments, additive noise can introduce bias and sidelobes in the correlation peak, degrading the accuracy of the delay estimate. To mitigate this, the zero-normalized cross-correlation (ZNCC) is applied, which first subtracts the mean from each signal segment and then normalizes by the standard deviation, yielding \text{ZNCC}(\tau) = \frac{\sum_i (x_i - \bar{x})(y_{i+\tau} - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2 \sum_i (y_{i+\tau} - \bar{y})^2}}. This preprocessing reduces the impact of noise offsets and slowly varying components, providing a more reliable peak location for TDE. The ZNCC is particularly effective in scenarios with low signal-to-noise ratios (SNRs), as it normalizes out slow-varying noise components while preserving the sharp correlation peak associated with the true delay.

A prominent example of TDE via cross-correlation is in radar systems for echo location. Here, the transmitted waveform is correlated with the received echo signal; the position of the correlation peak directly corresponds to the round-trip propagation delay, which is proportional to the target's range via \tau = \frac{2r}{c}, where r is the range and c is the speed of light. This method enables precise target detection and ranging, even in cluttered environments, by exploiting the high correlation between the reference and delayed signals.

For wide-sense stationary (WSS) processes, the cross-correlation-based TDE is an unbiased estimator at sufficiently high SNR, meaning the expected value of \hat{\tau} equals the true delay \tau. Furthermore, in the high-SNR regime, the variance of this estimator approaches the Cramér-Rao bound (CRB), given by \text{var}(\hat{\tau}) \geq \frac{1}{2\pi^2 \int f^2 |S(f)|^2 \, df / N_0}, where S(f) is the signal spectrum, N_0 is the noise power spectral density, and the integral captures the effective signal bandwidth; this bound establishes the fundamental limit on estimation precision for unbiased estimators under Gaussian assumptions. Despite its efficacy, cross-correlation TDE suffers from ambiguities when applied to periodic signals, as the correlation function exhibits multiple equally spaced peaks corresponding to each period, complicating identification of the correct delay without additional constraints such as prior knowledge of the signal period.
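
The peak-picking estimator is a few lines of code. The sketch below (Python with NumPy; the delay of 37 samples, signal length, and noise level are illustrative) recovers a known delay from a noisy, delayed copy of a white-noise reference.

```python
import numpy as np

# Sketch: estimate a time delay as the lag maximizing |R_XY|, using a
# noisy, delayed copy of a reference waveform.
rng = np.random.default_rng(8)
true_delay = 37                                   # samples (illustrative)
x = rng.standard_normal(1024)                     # reference signal
y = np.roll(x, true_delay) + 0.5 * rng.standard_normal(1024)  # delayed + noise

r = np.correlate(y, x, mode="full")               # R_XY over all lags
lags = np.arange(-(len(x) - 1), len(y))
tau_hat = lags[np.argmax(np.abs(r))]
print(tau_hat)                                    # expected: 37
```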

Nonlinear Systems

In nonlinear systems, traditional cross-correlation techniques, which rely on second-order statistics, fail to capture dependencies arising from nonlinear interactions, as they assume linear relationships and Gaussian processes. This limitation necessitates higher-order spectra, such as the bispectrum, to detect quadratic coupling and nonlinear distortions that second-order methods overlook. For instance, in systems with non-Gaussian inputs or feedback loops, cross-correlation may only reveal linear components, while bispectral analysis identifies the full extent of nonlinearity by preserving phase information.

To address these challenges, modified approaches incorporate Wiener series expansions, where cross-correlation is applied to estimate higher-order kernels that approximate nonlinear dynamics. In the Lee-Schetzen method, the first-order kernel is obtained via standard cross-correlation between input and output, while second- and higher-order kernels use cross-correlations with products of delayed inputs, enabling identification of nonlinearities up to a desired order. This technique has been refined to reduce estimation bias in noisy environments, improving accuracy for systems like bilinear models.

In neuroscience, cross-correlation of neural spike trains provides insights into nonlinear interactions, such as those mediated by synaptic nonlinearities or shared nonlinear inputs, by revealing peaks in correlograms that deviate from linear expectations. For quadratic nonlinearities specifically, second-order cross-correlation exhibits signatures of harmonics, as the quadratic term generates second-harmonic components in the output that correlate with the input at doubled frequencies.

A post-2000 development integrates cross-correlation with surrogate data tests in time series analysis to detect nonlinear coupling between systems, where surrogates preserve linear correlations but randomize nonlinear phases, allowing statistical rejection of independence under nonlinear dynamics. This approach has been applied to bivariate time series from coupled oscillators, confirming nonlinear interdependence when cross-correlations persist beyond surrogate null distributions.
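
The first-order step of the Lee-Schetzen idea can be sketched in a few lines. The example below (Python with NumPy; the linear kernel h and the quadratic distortion are hypothetical choices, not a specific published system) drives a cascade of a linear filter and a static quadratic nonlinearity with zero-mean white Gaussian noise; the first-order kernel estimate h_1[\tau] = E[y[n] x[n-\tau]] / \sigma^2 then recovers the linear stage, because the quadratic term contributes only an odd Gaussian moment, which vanishes.

```python
import numpy as np

# Sketch of first-order Wiener kernel estimation by cross-correlation:
# with zero-mean white Gaussian input, h1[tau] = E[y[n] x[n - tau]] / sigma^2.
rng = np.random.default_rng(9)
sigma, N = 1.0, 1_000_000
h = np.array([0.5, 1.0, -0.3])                # hypothetical linear kernel
x = sigma * rng.standard_normal(N)
z = np.convolve(x, h)[:N]                     # linear stage
y = z + 0.4 * z**2                            # static quadratic distortion

# E[z^2 x] = 0 for zero-mean Gaussians, so the estimate recovers h.
h1 = [np.mean(y[tau:] * x[: N - tau]) / sigma**2 for tau in range(5)]
print(np.round(h1, 3))                        # approx [0.5, 1.0, -0.3, 0, 0]
```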
