Adapted process

In probability theory and the theory of stochastic processes, an adapted process is a stochastic process \{X_t\}_{t \in T} defined on a filtered probability space (\Omega, \mathcal{F}, P, \{\mathcal{F}_t\}_{t \in T}) such that, for every t \in T, the random variable X_t is \mathcal{F}_t-measurable. This measurability condition ensures that the process value at time t depends only on the information accumulated up to that time, as captured by the filtration \{\mathcal{F}_t\}, which is a non-decreasing family of sigma-algebras representing the evolution of available information. Adapted processes form the foundational framework for non-anticipating models in stochastic systems, where future information does not influence present outcomes.

The concept of adaptedness is central to stochastic calculus, enabling the construction of stochastic integrals where the integrand must be adapted (and often predictable) to avoid lookahead bias in the integration process. For instance, in the Itô integral, adapted integrands ensure that the resulting integral process is a martingale under suitable integrability conditions, which is crucial for modeling continuous-time phenomena like asset prices in mathematical finance. Adapted processes also underpin the definition of martingales, which are integrable adapted processes satisfying \mathbb{E}[X_t \mid \mathcal{F}_s] = X_s for s \leq t, providing tools for optional sampling and change-of-measure techniques in risk-neutral pricing. Beyond core theory, adapted processes appear in diverse applications, including mathematical finance, filtering, and stochastic control under partial observations, where the filtration models incomplete data revelation over time. Extensions like progressively measurable or optional processes build on adaptedness to handle path properties and stopping times, facilitating advanced results in semimartingale decomposition and stochastic differential equations. This structure allows rigorous analysis of randomness while respecting temporal causality, making adapted processes indispensable in modern probabilistic modeling.
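As a brief illustration, using notation introduced here only for this example, consider a discrete-time coin-tossing model: let \xi_1, \xi_2, \dots be independent random variables taking the values \pm 1, set S_0 = 0 and S_n = \xi_1 + \cdots + \xi_n, and let \mathcal{F}_n = \sigma(\xi_1, \dots, \xi_n). The partial-sum process (S_n)_{n \geq 0} is adapted, since each S_n is a function of the first n tosses, whereas the shifted process Y_n = S_{n+1} is not adapted: Y_n depends on \xi_{n+1}, which is not \mathcal{F}_n-measurable, so Y "anticipates" the future in exactly the sense ruled out above.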

Fundamentals

Definition

An adapted process, also known as a non-anticipating (or non-anticipative) process, is a fundamental concept in probability theory and stochastic calculus. Formally, consider a probability space (\Omega, \mathcal{F}, P) equipped with a filtration (\mathcal{F}_t)_{t \geq 0}, which is an increasing family of sub-\sigma-algebras of \mathcal{F} representing the cumulative information available up to time t. A stochastic process (X_t)_{t \geq 0} with values in a measurable space is adapted to this filtration if, for every t \geq 0, the random variable X_t: \Omega \to \mathbb{R} (or more generally into the state space) is \mathcal{F}_t-measurable. This measurability condition intuitively ensures that the process does not "anticipate" future information; at any time t, the value X_t(\omega) for each outcome \omega \in \Omega is determined solely by the events in \mathcal{F}_t, reflecting the information accumulated by time t. In the discrete-time setting, where the process is indexed by n \in \mathbb{N}_0 = \{0, 1, 2, \dots\} and the filtration is (\mathcal{F}_n)_{n \in \mathbb{N}_0}, adaptedness requires that X_n be \mathcal{F}_n-measurable for each n. The continuous-time case extends this to indices t \in \mathbb{R}^+, often assuming the filtration satisfies standard regularity conditions such as right-continuity, meaning \mathcal{F}_t = \bigcap_{s > t} \mathcal{F}_s for each t \geq 0, to handle path properties and integration effectively.
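A convenient equivalent formulation, recorded here as a supplementary remark, phrases adaptedness as a comparison of filtrations. Writing \mathcal{F}_t^X = \sigma(X_s : s \leq t) for the natural filtration of X, the process X is adapted to (\mathcal{F}_t)_{t \geq 0} if and only if \mathcal{F}_t^X \subseteq \mathcal{F}_t for every t \geq 0: the inclusion makes each X_t measurable with respect to the larger \sigma-algebra \mathcal{F}_t, while conversely adaptedness means every X_s with s \leq t is \mathcal{F}_t-measurable, so the generated \sigma-algebra \sigma(X_s : s \leq t) must lie inside \mathcal{F}_t.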

Filtrations

In probability theory, a filtration \{\mathcal{F}_t\}_{t \geq 0} on a probability space (\Omega, \mathcal{F}, P) is defined as a family of sub-σ-algebras of \mathcal{F} indexed by time t \geq 0, satisfying the monotonicity condition \mathcal{F}_s \subseteq \mathcal{F}_t whenever s \leq t. This increasing structure ensures that the information encoded in the σ-algebras grows non-decreasingly over time. Typically, \mathcal{F}_0 is taken to contain all P-null sets, and the filtration is assumed to be right-continuous, meaning \mathcal{F}_t = \bigcap_{u > t} \mathcal{F}_u for each t \geq 0. Right-continuity captures the idea that the information at time t includes all details available immediately after t, refining the filtration to avoid discontinuities in the information flow. The natural filtration generated by a stochastic process X = (X_t)_{t \geq 0} is the smallest filtration making X adapted, constructed as \mathcal{F}_t = \sigma(X_s : s \leq t) for each t \geq 0, where \sigma(\cdot) denotes the σ-algebra generated by the indicated random variables. This filtration is the minimal one containing all information revealed by the process up to time t. To address issues of indistinguishability between processes that agree almost surely, augmented filtrations are employed. An augmented filtration is both complete (meaning each \mathcal{F}_t includes all P-null sets) and right-continuous; the minimal augmented filtration for a process X is obtained by first forming \mathcal{F}^0_t = \sigma(\{X_s : s \leq t\} \cup \mathcal{N}), where \mathcal{N} is the collection of P-null sets, and then setting \mathcal{F}_t = \bigcap_{\epsilon > 0} \mathcal{F}^0_{t + \epsilon}. Filtrations play a central role in modeling the evolution of information in stochastic systems, where \mathcal{F}_t represents the information accumulated up to time t, enabling the analysis of how uncertainty resolves over time. A stochastic process X is adapted to a filtration if X_t is \mathcal{F}_t-measurable for each t \geq 0.
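As a concrete worked example, with the sample space below introduced purely for illustration, take \Omega = \{H, T\}^2 for two coin tosses and let X_n denote the number of heads among the first n tosses, n = 0, 1, 2. The natural filtration is \mathcal{F}_0 = \{\emptyset, \Omega\}, \mathcal{F}_1 = \sigma(\{HH, HT\}, \{TH, TT\}) (the information carried by the first toss), and \mathcal{F}_2 = 2^{\Omega} (full information). Each \mathcal{F}_n is contained in the next, each X_n is \mathcal{F}_n-measurable, and no smaller filtration makes X adapted, so this family is exactly the natural filtration of X.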

Properties

Adaptedness conditions

A stochastic process X = (X_t)_{t \geq 0} defined on a probability space (\Omega, \mathcal{F}, P) equipped with a filtration (\mathcal{F}_t)_{t \geq 0} is adapted to the filtration if, for every t \geq 0, the random variable X_t is \mathcal{F}_t-measurable. For real-valued processes, this measurability condition is equivalent to requiring that, for all t \geq 0 and all real numbers x, the set \{\omega \in \Omega : X_t(\omega) \leq x\} belongs to \mathcal{F}_t. The condition ensures that the value of X_t is fully determined by the information available up to time t, as captured by the sigma-algebra \mathcal{F}_t.

In the context of right-continuous filtrations, where \mathcal{F}_t = \bigcap_{s > t} \mathcal{F}_s for each t \geq 0, adaptedness is preserved under pointwise limits and under augmentation of the filtration. Specifically, if a process X is adapted to (\mathcal{F}_t) and the filtration is right-continuous, then X remains adapted to the usual augmentation (completion) of the filtration, which adjoins all P-null sets. This property is crucial for maintaining adaptedness when modifying the filtration to include events of probability zero without altering the process's informational structure.

For left-continuous processes, which have sample paths that are continuous from the left (with possible right limits), adaptedness extends to their left limits under suitable conditions. An adapted process with left-continuous paths is progressively measurable, meaning the map (s, \omega) \mapsto X_s(\omega) for s \in [0, t] is measurable with respect to the product sigma-algebra \mathcal{B}([0, t]) \otimes \mathcal{F}_t; a sketch of this argument appears after the classification of process classes below. Moreover, the left-limit process X_{t-} = \lim_{s \uparrow t} X_s is measurable with respect to the left-continuous filtration \mathcal{F}_{t-} = \sigma(\bigcup_{s < t} \mathcal{F}_s), so adaptedness carries over to the left-limit process across jumps.

Verification of adaptedness for constructed processes often relies on the monotone class theorem or properties of infinitesimal generators. The functional monotone class theorem can confirm adaptedness by showing that a class of simple adapted processes (e.g., indicators or their linear combinations) is closed under pointwise limits, extending the property to more complex processes such as exponentials or solutions to stochastic equations. In Markov process settings, the infinitesimal generator \mathcal{L} of the transition semigroup can assist such verification: if the process satisfies \mathbb{E}[f(X_t) \mid \mathcal{F}_s] = (e^{(t-s)\mathcal{L}} f)(X_s) for bounded measurable functions f and s < t, then the right-hand side is \mathcal{F}_s-measurable whenever X_s is, consistent with the process being adapted to the filtration. These techniques allow rigorous confirmation without exhaustive pathwise checks.

Adaptedness is the minimal measurability condition in stochastic analysis, requiring only that X_t be \mathcal{F}_t-measurable for each t \geq 0 on a filtered probability space (\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \geq 0}, P). More restrictive classes impose additional structure on the joint measurability of the process over time intervals, enabling applications such as stochastic integration and optional sampling. These include predictable, optional, and progressively measurable processes, each defined via a specific \sigma-algebra on \Omega \times [0, \infty).
Predictable processes are those measurable with respect to the predictable \sigma-algebra \mathcal{P}, generated by left-continuous adapted processes (or, equivalently, by stochastic intervals [[0, \tau]] for stopping times \tau and sets of the form (s, t] \times F with F \in \mathcal{F}_s). This class ensures that the process value at time t depends only on information strictly before t, making predictable processes suitable as integrands in stochastic integrals with respect to semimartingales, where they guarantee the integral's well-definedness and adaptedness. Left-continuous adapted processes are automatically predictable, but not every predictable process has left-continuous paths.

Optional processes are measurable with respect to the optional \sigma-algebra \mathcal{O}, generated by right-continuous adapted processes with left limits (càdlàg processes). This structure captures processes that can be evaluated at stopping times without anticipating future information, playing a key role in the optional sampling theorem for martingales, where the process stopped at a stopping time \tau remains a martingale. Right-continuous adapted processes are optional; predictability further restricts this class to processes whose values are foreseeable from the strict past, as with left-continuous versions.

Progressively measurable processes satisfy the condition that, for every t > 0, the map (s, \omega) \mapsto X_s(\omega) for s \in [0, t] is \mathcal{B}([0, t]) \otimes \mathcal{F}_t-measurable, where \mathcal{B}([0, t]) is the Borel \sigma-algebra on [0, t]. Such a process is automatically adapted and has paths that are jointly measurable with respect to the product \sigma-algebra \mathcal{B}([0, \infty)) \otimes \mathcal{F}; conversely, an adapted, jointly measurable process admits a progressively measurable modification. Progressively measurable processes ensure that operations like pathwise integration or stopping yield adapted results, and continuous adapted processes fall into this class.

The classes form a hierarchy: all predictable processes are optional and progressively measurable, all optional processes are progressively measurable, and all progressively measurable processes are adapted, but the inclusions are strict in general. For instance, the indicator process \mathbf{1}_{[[\tau, \infty))} of the stochastic interval starting at a totally inaccessible (hence non-predictable) stopping time \tau is optional and progressively measurable but not predictable. This hierarchy reflects increasing levels of regularity, with predictability offering the strongest form of "foreseeability" for applications in stochastic integration.
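A short argument, sketched here for orientation, shows why a left-continuous adapted process is progressively measurable. Fix t > 0 and define discretized processes X^n by X^n_0 = X_0 and X^n_s = X_{(k-1)t/n} for s \in ((k-1)t/n, kt/n], k = 1, \dots, n. Each X^n is a finite sum of terms of the form X_{(k-1)t/n}(\omega) \mathbf{1}_{((k-1)t/n, \, kt/n]}(s), hence \mathcal{B}([0, t]) \otimes \mathcal{F}_t-measurable, and left-continuity gives X^n_s(\omega) \to X_s(\omega) pointwise as n \to \infty, so the limit map (s, \omega) \mapsto X_s(\omega) is \mathcal{B}([0, t]) \otimes \mathcal{F}_t-measurable as well.

A concrete instance tying the classes together is the Poisson process treated later in this article: its paths are càdlàg and the process is adapted to its natural filtration, so it is optional and progressively measurable, yet it is not predictable, because its jump times are totally inaccessible; its left-limit process N_{t-}, by contrast, has left-continuous adapted paths and is therefore predictable.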

Applications

Stochastic integration

In stochastic integration, adapted processes serve as integrands for defining integrals with respect to semimartingales, ensuring the resulting processes remain non-anticipating with respect to the underlying filtration. The Itô integral \int_0^t H_s \, dX_s is constructed for an adapted stochastic process H = (H_t) integrated against a semimartingale X, where the integral is defined as a limit (in probability, or in L^2 in the classical theory) of Riemann-type sums using left-endpoint evaluations of H. Adaptedness of H is essential, as it guarantees, together with suitable integrability conditions, that the integral qualifies as a local martingale whenever X is a local martingale, preserving the martingale property under the given filtration. The non-anticipating property inherent to adapted integrands prevents the use of future information in the construction of the integral, which is crucial for applications in mathematical finance, where trading strategies must rely solely on information available up to the current time to ensure fair pricing and avoid arbitrage opportunities. This requirement aligns with the foundational principles of non-anticipative stochastic calculus, where predictability or adaptedness enforces causal dependence on past and present observations. In the classical L^2 theory, the adapted process H must additionally satisfy the square-integrability condition \mathbb{E}\left[ \int_0^t H_s^2 \, d\langle X \rangle_s \right] < \infty, where \langle X \rangle denotes the quadratic variation process of X; this ensures the L^2 boundedness necessary for the approximating sums to converge. The Stratonovich integral, an alternative formulation using midpoint evaluations in the approximating sums, similarly requires adapted integrands to maintain the non-anticipating framework, though it differs from the Itô integral by a correction term involving the quadratic covariation. The framework extends naturally to general semimartingales, where adapted (more precisely, predictable) processes enable integration against processes with jumps by incorporating the compensator in the decomposition; this generalization, developed through the theory of predictable processes, allows the stochastic integral to handle both continuous and discontinuous paths while preserving key properties like the L^2 isometry for martingale parts.
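To make the left-endpoint construction concrete, consider as an illustrative special case a simple adapted integrand H_s = \sum_{i=0}^{n-1} \eta_i \mathbf{1}_{(t_i, t_{i+1}]}(s), where 0 = t_0 < t_1 < \cdots < t_n = t and each \eta_i is a bounded \mathcal{F}_{t_i}-measurable random variable. The stochastic integral then reduces to the finite sum \int_0^t H_s \, dX_s = \sum_{i=0}^{n-1} \eta_i \, (X_{t_{i+1}} - X_{t_i}). When X is a square-integrable martingale, each summand has zero conditional mean given \mathcal{F}_{t_i}, since \eta_i is known at time t_i and \mathbb{E}[X_{t_{i+1}} - X_{t_i} \mid \mathcal{F}_{t_i}] = 0; this is exactly where the adaptedness (indeed predictability) of H enters, and it is why the integral of such an integrand is again a martingale.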

Martingale theory

In martingale theory, an adapted stochastic process (M_t)_{t \geq 0} with respect to a filtration (\mathcal{F}_t)_{t \geq 0} is defined as a martingale if it satisfies the integrability condition \mathbb{E}[|M_t|] < \infty for all t \geq 0 and the martingale property \mathbb{E}[M_t \mid \mathcal{F}_s] = M_s almost surely for all 0 \leq s \leq t. This conditional expectation equality relies fundamentally on the adaptedness of the process, which ensures that each M_s is \mathcal{F}_s-measurable, so the conditioning incorporates only the information available up to time s without anticipating future revelations. A prominent example is Doob's martingale, constructed from a fixed integrable random variable X as the family of conditional expectations M_t = \mathbb{E}[X \mid \mathcal{F}_t]. This process is inherently adapted to the filtration (\mathcal{F}_t) because each M_t is \mathcal{F}_t-measurable by the definition of conditional expectation, and it satisfies the martingale property since \mathbb{E}[M_t \mid \mathcal{F}_s] = \mathbb{E}[\mathbb{E}[X \mid \mathcal{F}_t] \mid \mathcal{F}_s] = \mathbb{E}[X \mid \mathcal{F}_s] = M_s for s \leq t. Doob's construction highlights how adaptedness ensures the process captures progressively refined expectations of X. Extensions of martingales include submartingales and supermartingales, which relax the equality in the conditional expectation. A submartingale satisfies \mathbb{E}[M_t \mid \mathcal{F}_s] \geq M_s almost surely for s \leq t, with the same integrability conditions, while a supermartingale satisfies the reverse inequality \mathbb{E}[M_t \mid \mathcal{F}_s] \leq M_s. Adaptedness remains essential for these definitions, as it guarantees the measurability required for the conditional expectations to be well-defined with respect to the filtration. The optional sampling theorem further underscores the role of adaptedness in preserving martingale properties under stopping. For a right-continuous martingale (M_t) and a bounded stopping time \tau with respect to the filtration, the theorem states that (M_{\tau \wedge t})_{t \geq 0} is also a martingale. This preservation holds because the adaptedness and right-continuity of the process ensure that M_\tau is integrable and the conditional expectations align properly at stopped times.
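As a simple worked example, with notation introduced here only for illustration, let \xi_1, \xi_2, \dots be independent and identically distributed with \mathbb{E}[|\xi_i|] < \infty and \mathbb{E}[\xi_i] = 0, and set S_n = \xi_1 + \cdots + \xi_n with \mathcal{F}_n = \sigma(\xi_1, \dots, \xi_n). Then (S_n) is adapted and \mathbb{E}[S_{n+1} \mid \mathcal{F}_n] = S_n + \mathbb{E}[\xi_{n+1} \mid \mathcal{F}_n] = S_n + \mathbb{E}[\xi_{n+1}] = S_n, where the first equality pulls S_n out of the conditional expectation precisely because S_n is \mathcal{F}_n-measurable; without adaptedness this step, and hence the martingale property, would fail.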

Examples

Brownian motion

Brownian motion, also known as the Wiener process, provides a fundamental example of an adapted stochastic process, illustrating key concepts in the theory of filtrations and process adaptation. A standard one-dimensional Brownian motion \{W_t\}_{t \geq 0} is defined on a probability space (\Omega, \mathcal{F}, P) as a stochastic process with W_0 = 0 almost surely, independent increments, and such that the increment W_t - W_s follows a Gaussian distribution \mathcal{N}(0, t-s) for all 0 \leq s < t. Additionally, it possesses continuous sample paths with probability 1. By definition, the Brownian motion \{W_t\}_{t \geq 0} is adapted to its natural filtration \mathcal{F}_t^W = \sigma(W_s : 0 \leq s \leq t), the \sigma-algebra generated by the process up to time t, since W_t is \mathcal{F}_t^W-measurable for each t \geq 0. This natural filtration captures the information revealed by the process's history and, once completed with the P-null sets, is right-continuous, meaning the completed filtration satisfies \mathcal{F}_t^W = \bigcap_{u > t} \mathcal{F}_u^W. The sample paths of Brownian motion exhibit remarkable irregularity: while they are continuous almost surely, they are nowhere differentiable almost surely, reflecting the intrinsic roughness of the process despite its continuity. This non-differentiability holds at every point in [0, \infty) with probability 1. To satisfy the usual conditions in stochastic analysis, the natural filtration is often augmented by the P-null sets, ensuring completeness (every P-null set, and every subset of one, is measurable) while preserving right-continuity. This augmented filtration \tilde{\mathcal{F}}_t^W maintains the adaptedness of \{W_t\} and facilitates applications such as stochastic integration, where Brownian motion acts as the driving noise.
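A short computation shows how adaptedness and independent increments combine to make Brownian motion a martingale with respect to its natural filtration. Fix 0 \leq s < t and write W_t = W_s + (W_t - W_s). Since W_s is \mathcal{F}_s^W-measurable and the increment W_t - W_s is independent of \mathcal{F}_s^W with mean zero, \mathbb{E}[W_t \mid \mathcal{F}_s^W] = W_s + \mathbb{E}[W_t - W_s] = W_s; the first term is pulled out of the conditional expectation exactly because of adaptedness.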

Poisson processes

A Poisson process is a counting process \{N_t\}_{t \geq 0} that starts at N_0 = 0, has independent increments, and satisfies N_t - N_s \sim \mathrm{Poisson}(\lambda (t - s)) for 0 \leq s < t, where \lambda > 0 is the constant intensity parameter. The paths of N_t are right-continuous with left limits, increasing only at jump times by increments of size 1, modeling the cumulative number of events occurring randomly over time. The process is adapted to its natural filtration \mathcal{F}_t^N = \sigma(N_s : s \leq t), meaning N_t is \mathcal{F}_t^N-measurable for each t, which captures the information revealed by the process up to time t. The jumps occur at stopping times that are totally inaccessible with respect to this filtration, ensuring that the exact timing of future jumps cannot be predicted from the past trajectory, a property arising from the memoryless exponential distribution of the interarrival times. The compensator of the Poisson process is the deterministic process A_t = \lambda t, which represents the expected cumulative number of jumps up to time t. The compensated process M_t = N_t - A_t is then a martingale with respect to \mathcal{F}_t^N, preserving the adaptedness of N_t while centering its increments to have mean zero. The natural filtration generated by the process does not anticipate jump times, meaning it carries no advance information about future discontinuities, consistent with the total inaccessibility of the jump times noted above.
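A brief verification of the martingale property makes the role of adaptedness explicit. Fix 0 \leq s < t; by adaptedness, N_s (and hence M_s = N_s - \lambda s) is \mathcal{F}_s^N-measurable, while the increment N_t - N_s is independent of \mathcal{F}_s^N with mean \lambda(t - s), so \mathbb{E}[M_t \mid \mathcal{F}_s^N] = N_s + \mathbb{E}[N_t - N_s] - \lambda t = N_s + \lambda(t - s) - \lambda t = N_s - \lambda s = M_s.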