
Cox process

A Cox process, also known as a doubly stochastic Poisson process, is a point process in probability theory and statistics whose intensity measure is itself a random (stochastic) process, such that conditional on this random intensity, the process follows an inhomogeneous Poisson distribution. This construction generalizes the standard Poisson process by incorporating variability in the event rate, leading to phenomena like clustering and overdispersion, where the variance of the number of events exceeds the mean. Named after the British statistician Sir David R. Cox, who introduced the concept in his seminal 1955 paper on statistical methods for series of events occurring haphazardly in time or space, the Cox process provides a flexible framework for modeling dependent point patterns that cannot be captured by fixed-intensity models. Mathematically, for a Cox process X on \mathbb{R}^d, given a non-negative random intensity function \Lambda(u), X is Poisson with intensity \Lambda, and the unconditional intensity is \rho(u) = \mathbb{E}[\Lambda(u)], while higher-order properties like the pair correlation function g(u,v) = \mathbb{E}[\Lambda(u)\Lambda(v)] / [\rho(u)\rho(v)] \geq 1 reflect positive dependence and aggregation. Common subclasses include the log Gaussian Cox process (LGCP), where \log \Lambda(u) is Gaussian with mean \xi(u) and covariance c(u,v), yielding tractable product densities such as \log \rho(u) = \xi(u) + c(u,u)/2 and g(u,v) = \exp(c(u,v)); the shot-noise Cox process (SNCP), driven by a cluster-like intensity \Lambda(u) = \sum \gamma \, k(c,u) built from Poisson-distributed centers c with weights \gamma and kernel k; and the permanental Cox process, based on squared Gaussian fields. These models are closed under operations like independent thinning, random displacements, and superposition, facilitating simulation and inference despite the challenges posed by the latent intensity. Cox processes have found extensive applications across disciplines due to their ability to model irregular, clustered events influenced by unobserved heterogeneity. In spatial statistics, they describe aggregated point patterns in ecology, epidemiology (e.g., disease outbreaks), and environmental science (e.g., rainfall or wildfire occurrences).
In insurance and actuarial science, they underpin claim count modeling, catastrophe reinsurance pricing, and aggregate claim processes, as in shot-noise variants for dependent risks. Financial applications include simulating default events and jump-diffusion intensities for derivatives pricing, while in neuroscience, they provide models for neuronal spike trains. Advances in Bayesian computation, such as Markov chain Monte Carlo methods for LGCPs, have enhanced their inferential tractability for large datasets.

Background and Definition

Historical Development

The Cox process, originally termed the doubly stochastic Poisson process, was introduced by British statistician David R. Cox in his seminal 1955 paper, where he proposed it as a model for analyzing series of events with varying intensity. This work built on the foundational Poisson process, extending it to account for fluctuations in the rate parameter to better capture real-world variability in event occurrences. The development of the Cox process was motivated by challenges in mid-20th-century statistical modeling, particularly in queueing theory, where arrival rates might fluctuate unpredictably, and in renewal processes, which generalize inter-event times beyond the exponential assumption. Cox's interest stemmed from practical problems, such as those in operations research involving queue-like systems, prompting his exploration of stochastic processes with random intensities. By the 1960s, these ideas gained traction through Cox's monograph with H. D. Miller on the theory of stochastic processes, which further contextualized the process within broader stochastic frameworks. In the following decades, the concept evolved toward spatial applications, with Cox and collaborator Valerie Isham extending the framework to multidimensional settings in their 1980 book on point processes, enabling modeling of irregularly distributed events in space. This marked a key milestone in adapting the process beyond temporal sequences to handle geographic clustering and inhomogeneity. Around the same period, the term "Cox process" became standardized in point process literature, solidified by comprehensive treatments in works like Daley and Vere-Jones's 1988 introduction, which formalized its role in probabilistic modeling of random events.

Formal Definition

A point process is defined as a random counting measure \Phi on a measurable space S, where for each measurable set A \subseteq S, \Phi(A) represents the number of points in A, and \Phi is finite almost surely on bounded sets. This framework captures the distribution of randomly located points, such as events in time or space. A Poisson process is a special case where, for disjoint measurable sets A_1, \dots, A_k, the counts \Phi(A_i) are independent and each follows a Poisson distribution with mean given by a deterministic measure \mu(A_i). The Cox process, introduced by David R. Cox in his 1955 paper on statistical methods for series of events, generalizes the Poisson process by allowing the intensity measure to be random. Formally, a Cox process \Phi on a space S is a point process such that, conditional on a random positive measure \Lambda (the directing measure), \Phi is a Poisson process with intensity measure \Lambda. Unconditionally, the distribution of \Phi is that of a mixed Poisson process, where the mixing is induced by \Lambda, and \Lambda is independent of the underlying Poisson sampling mechanism. In its general construction, the points of a Cox process are generated by first sampling a realization of the random measure \Lambda from its distribution, and then sampling a Poisson process with mean measure \Lambda.

Mathematical Framework

Random Intensity Measure

The random intensity measure \Lambda underlying a Cox process is a random element in the space of locally finite positive measures on the state space, such as \mathbb{R}^d for spatial processes or [0, \infty) for temporal ones. This measure governs the expected number of points in any region, with \Lambda(B) denoting the random expected count in a set B. For the process to be well-defined, \Lambda must satisfy \Lambda(B) < \infty almost surely for all bounded measurable sets B, ensuring the associated Poisson process has finite intensity on compact regions. In practice, \Lambda is frequently specified through a stochastic intensity function \lambda, where d\Lambda(x) = \lambda(x) \, dx, and \lambda follows a prescribed random process. Common realizations include Lévy processes for one-dimensional temporal Cox processes, which allow for jumps and introduce variability in event rates over time, or Gaussian random fields for multidimensional spatial settings, capturing smooth spatial heterogeneity. These choices enable \Lambda to model non-stationary or clustered phenomena while maintaining the core doubly stochastic structure. A key requirement for \Lambda is that it remains almost surely non-negative, preserving the positivity essential for intensity measures in Poisson processes. For instance, in simple temporal models over a fixed interval [0, T], \Lambda([0, T]) may follow a gamma distribution with shape and rate parameters chosen to ensure integrability and positivity, leading to overdispersed count distributions compared to the Poisson case. Such specifications guarantee that the overall process remains a valid point process with finite moments where needed. The generation of a Cox process proceeds by first independently sampling \Lambda from its underlying probability distribution, followed by conditioning on \Lambda to produce an inhomogeneous Poisson process with mean measure \Lambda.
This two-stage mechanism, independent of any thinning or superposition steps, embeds randomness directly into the intensity, distinguishing Cox processes from homogeneous Poisson processes. Consequently, the variability in \Lambda induces positive dependence among points and spatial or temporal heterogeneity, enabling the modeling of real-world clustering not achievable with constant intensities.
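This two-stage generation can be sketched directly. The example below, a minimal illustration rather than a canonical model, assumes the rate is constant over the window but gamma-distributed across realizations (as in the \Lambda([0,T]) example above); all parameter values are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two-stage ("doubly stochastic") sampling on a window [0, T]: the intensity
# is constant within each realization but random across realizations.
T = 10.0
shape, scale = 3.0, 0.5                       # illustrative gamma parameters

lam = rng.gamma(shape, scale, size=200_000)   # stage 1: draw the random rate
counts = rng.poisson(lam * T)                 # stage 2: N([0,T]) | lam ~ Poisson

# Mixing over lam inflates the variance beyond the Poisson benchmark:
# E[N] = E[lam] * T = 15,  Var[N] = E[N] + Var[lam] * T^2 = 15 + 75 = 90.
print(counts.mean(), counts.var())
```

The overdispersion (variance well above the mean) is exactly the signature property that distinguishes the Cox counts from Poisson counts with the same mean.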

Conditional Structure

When conditioned on a specific realization of the random intensity measure \Lambda = \lambda, where \lambda is a non-random positive measure on the space, a Cox process reduces to a non-homogeneous Poisson process with intensity measure \lambda. In this setting, the points of the process are distributed according to the intensity \lambda, meaning that for any Borel set B, the number of points N(B) in B follows a Poisson distribution with mean \int_B \lambda(ds). Key properties of the Cox process under this conditioning mirror those of a standard Poisson point process. The counts N(B_i) for disjoint Borel sets B_1, \dots, B_k are independent, reflecting the lack of dependence introduced by the fixed intensity. The factorial moments of N(B) are given by E[(N(B))_k \mid \Lambda = \lambda] = \left( \int_B \lambda(ds) \right)^k for k = 1, 2, \dots, which follow directly from the Poisson structure. Additionally, the void probability, or the probability of no points in B, is \mathbb{P}(N(B) = 0 \mid \Lambda = \lambda) = \exp\left( -\int_B \lambda(ds) \right). The conditional intensity function, which governs the expected rate of points at a location t, is the Radon-Nikodym derivative of \Lambda with respect to Lebesgue measure (when it exists), providing the deterministic rate once \Lambda is realized. This contrasts with the unconditional Cox process, where the randomness in \Lambda induces dependence among points and over-dispersion relative to a Poisson process; conditioning eliminates this variability, yielding the familiar independent increments and exact Poisson marginals characteristic of non-homogeneous Poisson processes.
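These conditional properties can be checked by simulating the non-homogeneous Poisson process for one fixed realization of the intensity. The sketch below assumes the illustrative choice \lambda(t) = 2 + t on [0, 2] and uses Lewis-Shedler thinning to generate the points:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed (realized) intensity lambda(t) = 2 + t on [0, 2]; illustrative choice.
a, b, T = 2.0, 1.0, 2.0
lam = lambda t: a + b * t
lam_max = a + b * T                       # dominating rate for thinning
mean_measure = a * T + b * T**2 / 2       # integral of lambda over [0, T] = 6

def sample_inhom_poisson():
    """Lewis-Shedler thinning: propose at rate lam_max, keep w.p. lam/lam_max."""
    n = rng.poisson(lam_max * T)
    t = rng.uniform(0.0, T, size=n)
    return t[rng.uniform(0.0, lam_max, size=n) < lam(t)]

counts = np.array([len(sample_inhom_poisson()) for _ in range(100_000)])

# Given the intensity, the counts are exactly Poisson(6): mean equals variance,
# and the void probability P(N = 0) equals exp(-6) ~ 0.00248.
print(counts.mean(), counts.var())
print(np.mean(counts == 0), np.exp(-mean_measure))
```

The empirical mean, variance, and void frequency all match the Poisson predictions, confirming that conditioning removes every trace of the doubly stochastic structure.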

Key Properties

Transforms and Functionals

The characteristic functional of a Cox process \Phi, defined as G(f) = \mathbb{E}\left[\exp\left(i \int f(s) \, d\Phi(s)\right)\right] for a bounded continuous test function f, is derived by conditioning on the directing random measure \Lambda. Given \Lambda, \Phi is a Poisson process with intensity measure \Lambda, so the conditional characteristic functional is \exp\left(-\int (1 - e^{if(s)}) \Lambda(ds)\right). Taking the expectation over \Lambda yields the unconditional form G(f) = \mathbb{E}\left[\exp\left(-\int (1 - e^{if(s)}) \Lambda(ds)\right)\right]. The Laplace transform of the intensity measure \Lambda is given by \mathbb{E}\left[\exp\left(-\int u(s) \Lambda(ds)\right)\right] for a nonnegative test function u \geq 0. This follows directly from the definition of the Laplace functional of the random measure \Lambda, which captures its distributional properties under exponential weighting, and it serves as a building block for the Laplace functional of the point process itself, as the conditional structure links the two. The Laplace functional of the point process \Phi is L(u) = \mathbb{E}\left[ \exp\left( -\int u(s) \, d\Phi(s) \right) \right], which, by conditioning on \Lambda, simplifies to L(u) = \mathbb{E}\left[ \exp\left( -\int (1 - e^{-u(s)}) \Lambda(ds) \right) \right]. This expression arises because the conditional Laplace functional of a Poisson process with mean measure \Lambda is \exp\left( -\int (1 - e^{-u(s)}) \Lambda(ds) \right), and averaging over the distribution of \Lambda gives the unconditional version. These transforms are instrumental in simulation algorithms for Cox processes, where generating the directing measure \Lambda via its Laplace transform facilitates subsequent Poisson sampling, and in moment generation, as logarithmic derivatives of the functionals yield factorial moment measures of \Phi.
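The link between the two functionals can be verified numerically in the simplest tractable case: a constant test function u on a window [0, T] and a directing measure \Lambda = \lambda \, dt with a gamma-distributed rate (an illustrative choice), for which the right-hand side is the gamma Laplace transform in closed form:

```python
import numpy as np

rng = np.random.default_rng(1)

# Check L(u) = E[exp(-int (1 - e^{-u}) dLambda)] for constant u on [0, T]
# with Lambda = lam * dt and lam ~ Gamma(k, theta) (illustrative parameters).
k, theta, T, u = 2.0, 1.5, 4.0, 0.7

lam = rng.gamma(k, theta, size=300_000)
counts = rng.poisson(lam * T)             # Cox counts on [0, T]

# Left side: E[exp(-u * N)], since int u dPhi = u * N for constant u.
lhs = np.exp(-u * counts).mean()

# Right side: gamma Laplace transform at s = T * (1 - e^{-u}):
#   E[exp(-s * lam)] = (1 + theta * s)^(-k)
s = T * (1 - np.exp(-u))
rhs = (1 + theta * s) ** (-k)
print(lhs, rhs)   # the two agree up to Monte Carlo error
```

The same identity, evaluated at general test functions, is what makes the Laplace functional a complete description of the process.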

Moments and Densities

The first-moment measure of a Cox process \Phi, defined as \mathbb{E}[\Phi(A)] for a Borel set A, equals the expected value of the directing random intensity measure \Lambda(A), i.e., \mathbb{E}[\Phi(A)] = \mathbb{E}[\Lambda(A)]. This represents the expected number of points in A and serves as the intensity measure of the process, highlighting how the randomness in \Lambda propagates to the mean count. The second-moment measure, \mathbb{E}[\Phi(A) \Phi(B)], incorporates both the covariance of the random intensities and an intersection term: \mathbb{E}[\Phi(A) \Phi(B)] = \mathbb{E}[\Lambda(A) \Lambda(B)] + \mathbb{E}[\Lambda(A \cap B)]. This structure reveals additional variance beyond that of a homogeneous Poisson process, arising from the variability in \Lambda, which induces dependence between counts in overlapping or nearby regions. For disjoint sets A and B, the expression simplifies to \mathbb{E}[\Lambda(A)] \mathbb{E}[\Lambda(B)] + \mathrm{Cov}(\Lambda(A), \Lambda(B)), emphasizing the extra clustering effect due to the stochastic intensity. Factorial moment densities provide a refined characterization of the joint distribution for distinct points. For a Cox process on \mathbb{R}^d, the k-th order factorial moment density \rho^{(k)}(x_1, \dots, x_k) is the density of the k-th factorial moment measure and equals \mathbb{E}[\lambda(x_1) \cdots \lambda(x_k)], where \lambda is the random intensity function of \Lambda. For the second order, this gives \rho^{(2)}(s,t) = \mathbb{E}[\lambda(s) \lambda(t)]; the ordinary (non-factorial) second-moment density carries an additional diagonal atom, m^{(2)}(s,t) = \mathbb{E}[\lambda(s) \lambda(t)] + \delta(s-t) \mathbb{E}[\lambda(s)], where \delta is the Dirac delta function accounting for coincident points. These densities, derived by conditioning on \Lambda and leveraging the Poisson property, characterize the moment structure of the unconditional point distribution and facilitate moment-based inference.
The pair correlation function g(s,t) quantifies second-order dependence off the diagonal and is given by g(s,t) = \frac{\mathbb{E}[\lambda(s) \lambda(t)]}{\mathbb{E}[\lambda(s)] \mathbb{E}[\lambda(t)]}. For a Cox process, g(s,t) \geq 1, with equality only if \lambda(s) and \lambda(t) are uncorrelated, indicating inherent clustering from the random intensity; values greater than 1 signal aggregation, and integrating the function over a region reveals the overall variance inflation relative to a Poisson process.
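The second-moment identity can be checked by Monte Carlo in the special case A = B, where it reads \mathbb{E}[\Phi(A)^2] = \mathbb{E}[\Lambda(A)^2] + \mathbb{E}[\Lambda(A)]. The sketch below assumes an illustrative gamma-distributed \Lambda(A):

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo check of E[Phi(A)^2] = E[Lambda(A)^2] + E[Lambda(A)]
# using an illustrative gamma-distributed Lambda(A).
shape, scale = 2.5, 1.2
lam_A = rng.gamma(shape, scale, size=500_000)   # realizations of Lambda(A)
N_A = rng.poisson(lam_A)                         # Phi(A) | Lambda(A) ~ Poisson

lhs = np.mean(N_A.astype(float) ** 2)
rhs = np.mean(lam_A ** 2) + np.mean(lam_A)
print(lhs, rhs)   # agree up to Monte Carlo error

# The same mixing drives the variance inflation behind g >= 1:
# Var[N(A)] = E[N(A)] + Var[Lambda(A)] exceeds the Poisson variance E[N(A)].
print(N_A.var(), N_A.mean())
```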

Specific Models

Log-Gaussian Cox Process

The log-Gaussian Cox process (LGCP) is a prominent class of Cox processes defined by specifying the random intensity measure \Lambda such that \log \Lambda is a Gaussian random field or process, resulting in a log-normal intensity function. This construction arises within the general doubly stochastic framework, where the intensity is random, but the LGCP imposes a Gaussian structure on the logarithm to capture environmental heterogeneity driving point patterns. In the spatial domain over \mathbb{R}^d, the intensity function takes the form \log \lambda(x) = \mu(x) + Z(x), where \mu(x) is a known mean function and Z(x) is a zero-mean Gaussian random field with a specified covariance kernel, such as the Matérn class for modeling smooth spatial dependence. The resulting intensity \lambda(x) = \exp(\mu(x) + Z(x)) is strictly positive and log-normally distributed, which inherently promotes clustering in the point pattern due to the multiplicative effect of the Gaussian fluctuations. A key property of the LGCP is its ability to model aggregation through the pair correlation function g(u) = \exp(\mathrm{Cov}(Z(x), Z(x+u))), which exceeds 1 wherever the covariance is positive, indicating attraction between points at those scales, while approaching 1 at distances where correlations decay. This makes the LGCP particularly suited for representing overdispersed patterns without explicit pairwise interactions, distinguishing it from determinantal point processes, which enforce repulsion. Simulation of an LGCP proceeds in two steps: first, sample the latent Gaussian field Z using methods like Cholesky decomposition on a discretized grid or spectral approximations for efficiency; then, conditional on \lambda = \exp(\mu + Z), generate points from an inhomogeneous Poisson process with that intensity.
For inference, the intractable likelihood due to the latent field is typically addressed via Markov chain Monte Carlo (MCMC) methods, such as Metropolis-Hastings, or variational approximations to enable scalable Bayesian estimation of parameters like the kernel hyperparameters.
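The two-step simulation recipe can be sketched on a discretized one-dimensional grid; the exponential covariance kernel and all parameter values below are illustrative assumptions, not part of the general definition:

```python
import numpy as np

rng = np.random.default_rng(3)

# Discretized LGCP on [0, 1]: log lambda(x) = mu + Z(x) with Z a zero-mean
# Gaussian field, sampled via Cholesky factorization of its covariance matrix.
n, sigma2, ell, mu = 200, 1.0, 0.1, 2.0
x = np.linspace(0.0, 1.0, n)
cell = 1.0 / n                                   # cell width for grid counts

C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))    # jitter for numerical stability

reps = 5000
Z = L @ rng.standard_normal((n, reps))           # one latent field per column
lam = np.exp(mu + Z)                             # log-normal intensity, positive
totals = rng.poisson(lam.sum(axis=0) * cell)     # total count per realization

# Mean total ~ exp(mu + sigma2/2) ~ 12.2, while the variance exceeds the mean
# because the shared latent field induces clustering.
print(totals.mean(), totals.var())
```

In two dimensions the same recipe applies cell by cell on a grid; for fine grids the O(n^3) Cholesky step is usually replaced by circulant embedding or other spectral methods.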

Other Variants

Beyond the log-Gaussian Cox process, which serves as a benchmark for Gaussian-driven models, several non-Gaussian variants extend the framework by incorporating different random measures for the intensity process. The shot-noise Cox process constructs the random intensity measure \Lambda(A) as a sum \sum_{i} g(X_i, A), where \{X_i\} are points from a homogeneous or inhomogeneous Poisson process, and g is a non-negative kernel function that determines the contribution of each point to the intensity over the set A. This model captures clustering through the superposition of influence kernels, often used to represent phenomena like earthquakes or neural spikes where intensity builds from discrete events. In gamma process-driven Cox processes, the intensity measure \Lambda follows a gamma random measure, with independent gamma-distributed increments. This construction yields marginal counts N(A) that follow a negative binomial distribution, providing overdispersion relative to Poisson while maintaining interpretability for applications in reliability or epidemiology. Lévy-driven Cox processes generalize further by constructing the random intensity as a kernel smoothing of a Lévy basis, which introduces jumps and heavy-tailed behaviors into the intensity. These models are particularly suited for non-stationary temporal dynamics, where the intensity evolves with abrupt changes modeled via the Lévy basis. The permanental Cox process is driven by a permanental random measure, constructed from squared Gaussian fields and linked to the permanent of a positive definite kernel matrix, analogous to determinantal processes but promoting clustering rather than repulsion due to the non-negative nature of permanents. Hybrid variants, such as marked Cox processes, combine the random intensity with independent marks attached to points, enabling multivariate extensions where marks represent additional covariates like types or sizes.
This allows modeling of multi-species interactions or heterogeneous events within the same doubly stochastic framework.
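A shot-noise Cox process admits a simple cluster-based simulation: conditional on the Poisson centers, the process is a superposition of independent Poisson clusters, each center contributing a Poisson number of points scattered by its kernel. A one-dimensional sketch with Gaussian kernels (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Shot-noise Cox process on the line: Lambda(u) = sum_i gamma * k(c_i, u) with
# Gaussian kernels k centered at Poisson-distributed centers c_i. Given the
# centers, each contributes Poisson(gamma) offspring scattered by the kernel.
T, rate_c, gamma, bw = 50.0, 0.2, 5.0, 0.5

n_centers = rng.poisson(rate_c * T)
centers = rng.uniform(0.0, T, size=n_centers)

clusters = [rng.normal(c, bw, size=rng.poisson(gamma)) for c in centers]
points = np.sort(np.concatenate(clusters)) if clusters else np.array([])
print(len(points))    # one realization; expected count ~ rate_c * T * gamma = 50

# Sanity check of the mean count over many realizations: given n centers,
# the total offspring count is Poisson(n * gamma).
totals = rng.poisson(gamma * rng.poisson(rate_c * T, size=20_000))
print(totals.mean())  # ~ 50
```

This is the same construction used by Neyman-Scott cluster models, which shows why shot-noise Cox processes are natural models for clustered event data.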

Applications and Extensions

Spatial Statistics and Ecology

Cox processes provide a flexible framework for modeling spatial point patterns in ecology, where observed events exhibit clustering driven by unobserved environmental heterogeneity. In species distribution modeling, the random intensity measure captures dependencies on covariates such as soil quality or topography, enabling the analysis of tree locations in forested areas where spatial aggregation reflects microhabitat variations. For instance, spatial Cox processes have been applied to quantify variance components in the distribution of rain forest trees, decomposing sources of variation into environmental and demographic factors. Log-Gaussian Cox processes (LGCPs), a prominent variant, are commonly used as a tool for such analyses due to their ability to incorporate Gaussian random fields for the intensity. A key advantage of Cox processes over homogeneous Poisson processes lies in their capacity to account for over-dispersion, where the variance of point counts exceeds the mean, a frequent feature in ecological data arising from unmeasured heterogeneity. This is particularly relevant for modeling animal habitats or spatial patterns in epidemic outbreaks, where Poisson assumptions fail to capture extra variability from clustered events. Cox processes address this by conditioning on a stochastic intensity, allowing for realistic dependence structures without assuming independence. In ecological contexts, this overdispersion modeling improves fit for aggregated patterns influenced by stochastic environmental factors, such as in wildlife distributions. Spatial inference for Cox processes, especially LGCPs, often relies on likelihood-free methods like approximate Bayesian computation (ABC) for parameter estimation, which is valuable in complex models for disease mapping where exact likelihoods are intractable. These methods facilitate posterior inference by simulating data and matching summaries to observations, aiding applications in environmental epidemiology. 
For example, ABC has been adapted for Bayesian computation in LGCPs to handle spatial point data in health-related ecological studies. Case studies from the 1990s onward highlight practical applications in forestry inventories and invasive species management. In forestry, hierarchical log Gaussian Cox processes model tree regeneration patterns in uneven-aged stands, linking parent tree locations to offspring distributions while accounting for spatial covariates like canopy cover. Similarly, multi-type log Gaussian Cox processes have been used to analyze the spread of weeds (often invasive in agricultural settings) on organic fields, predicting intensity surfaces for two weed species based on spatio-temporal data from experiments. Other examples include log-Gaussian Cox process modeling of gorilla nesting sites to capture clustering in primate habitats. Recent extensions apply fully non-separable spatio-temporal log-Gaussian Cox processes to model forest fires, improving predictions of spatio-temporal intensity patterns influenced by environmental covariates.

Temporal Processes and Finance

In finance, temporal Cox processes, also known as doubly stochastic Poisson processes on the time domain, are employed to model point events whose intensities vary randomly over time, capturing clustering and dependence not accounted for by homogeneous Poisson processes. These processes define the cumulative intensity as a stochastic integral \Lambda(t) = \int_0^t \lambda(s) \, ds, where \lambda(s) is a positive random intensity function, allowing for the modeling of irregular event arrivals influenced by unobserved factors such as market volatility or economic shocks. This framework is particularly valuable in financial applications where event timing exhibits overdispersion and temporal dependence, enabling more accurate pricing and risk assessment. A primary application is in credit risk modeling, where Cox processes represent default times \tau for bonds or portfolios, with the default intensity \lambda(t) depending on observable covariates like interest rates or credit ratings, introducing stochastic credit spreads. The survival probability is given by P(\tau > t) = E\left[\exp\left(-\int_0^t \lambda(X_s) \, ds\right)\right], where X_s denotes a state variable, facilitating the valuation of defaultable securities through risk-neutral expectations. This approach generalizes reduced-form models, accommodating correlations between default risk and market factors, as developed in the intensity-based framework. Seminal work by Jarrow, Lando, and Turnbull (1997) integrated Cox processes into a Markov model for credit spreads, while Lando (1998) extended it to pure Cox settings for pricing derivatives like default swaps. In insurance and actuarial science, Cox processes model temporal claim arrivals, where the intensity \lambda(t) incorporates jumps from catastrophe events decaying exponentially, \lambda(t) = \sum_{i=1}^{N(t)} Y_i e^{-\beta (t - T_i)}, with N(t) a Poisson process, T_i the event times, and Y_i the jump (claim) sizes. This shot-noise structure captures mean reversion and event-driven spikes, aiding in catastrophe reinsurance pricing and risk assessments.
Dassios and Jang (2003) demonstrated its use for catastrophe reinsurance and derivatives, deriving closed-form pricing via piecewise deterministic Markov processes, highlighting tractability for temporal aggregation. Temporal Cox processes also underpin market microstructure models, simulating non-homogeneous order flows in limit order books, where arrivals of market orders, limit orders, and cancellations form multidimensional point processes with intensities modulated by recent trades or quotes. This allows for the analysis of microstructure effects, such as liquidity provision, by incorporating stochastic intensities to reflect intraday clustering. For instance, Bacry et al. (2016) applied multidimensional Cox processes to high-frequency data, showing improved fit for order book imbalances over Hawkes alternatives in certain regimes.
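The survival-probability formula above can be illustrated with the simplest possible stochastic intensity: a rate that is flat in time but gamma-distributed across scenarios (purely an illustrative assumption), for which P(\tau > t) = (1 + \theta t)^{-k} in closed form:

```python
import numpy as np

rng = np.random.default_rng(5)

# Intensity-based default-time sketch: conditional on the intensity, tau is
# the first event of a Poisson process, so P(tau > t) = E[exp(-int_0^t lam ds)].
# Illustrative choice: flat random intensity lam ~ Gamma(k, theta), giving
# the closed form P(tau > t) = (1 + theta * t)^(-k).
k, theta, t = 2.0, 0.05, 5.0

lam = rng.gamma(k, theta, size=500_000)
tau = rng.exponential(1.0 / lam)          # tau | lam ~ Exponential(rate lam)

empirical = np.mean(tau > t)
analytic = (1 + theta * t) ** (-k)
print(empirical, analytic)   # both ~ 0.64
```

Realistic credit models replace the flat random rate with a positive stochastic process (e.g., CIR-type dynamics for \lambda(X_s)), in which case the expectation is evaluated by transform methods or path simulation rather than in closed form.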

References

  1. Cox Process - an overview. ScienceDirect Topics.
  2. Properties of spatial Cox process models.
  3. A review on Poisson, Cox, Hawkes, shot-noise Poisson and ... (2020).
  4. Bayesian Computation for Log-Gaussian Cox Processes. PubMed.
  5. Cox, D. R. (1955). Some Statistical Methods Connected with Series of Events. Journal of the Royal Statistical Society, Series B.
  6. Reid, N. (2008). A Conversation with Sir David Cox. Statistical Science.
  7. Cox, D. R., and Miller, H. D. The Theory of Stochastic Processes. (Review, JSTOR.)
  8. Cox, D. R. Point Processes. Routledge.
  9. Daley, D. J., and Vere-Jones, D. An Introduction to the Theory of Point Processes. Springer.
  10. Cox, D. R. (1955). Some Statistical Methods Connected with Series of Events.
  11. On doubly stochastic Poisson processes (2008).
  12. Forecasting a class of doubly stochastic Poisson processes.
  13. Moment estimation methods for stationary spatial Cox processes.
  14. Møller, J., Syversveen, A. R., and Waagepetersen, R. P. (1998). Log Gaussian Cox Processes.
  15. Spatial and Spatio-Temporal Log-Gaussian Cox Processes. arXiv.
  16. Quick inference for log Gaussian Cox processes with non-stationary ... (2024).
  17. Shot Noise Cox Processes. JSTOR.
  18. Statistics for Inhomogeneous Space-Time Shot-Noise Cox Processes (2013).
  19. Negative Binomial Distributions of Individuals and Spatio-Temporal ...
  20. Decomposition of Variance for Spatial Cox Processes. PMC, NIH.
  21. Spatial and Spatio-Temporal Log-Gaussian Cox Processes.
  22. Cox Processes Associated with Spatial Copula Observed through ... (2021).
  23. Log Gaussian Cox processes. Arizona Math.
  24. Bayesian Computation for Log-Gaussian Cox Processes. arXiv:1701.00857 (2017).
  25. Hierarchical log Gaussian Cox process for regeneration in uneven ... (2021).
  26. Space-time Multi Type Log Gaussian Cox Processes with a View to ...
  27. Fitting Log-Gaussian Cox Processes Using Generalized Additive ...
  28. LGCPs in R-INLA: A case study with type 1 diabetes in Switzerland.
  29. Markov Model for the Term Structure of Credit Risk Spreads.
  30. Pricing of catastrophe reinsurance and derivatives using the Cox ...
  31. Modeling High-Frequency Non-Homogeneous Order Flows by ... (2016).