In stochastic analysis, a predictable process is a stochastic process that is measurable with respect to the predictable σ-algebra \mathcal{P} on [0, \infty) \times \Omega, which is generated by all left-continuous processes adapted to a given filtration (\mathcal{F}_t)_{t \geq 0}.[1] This σ-algebra captures events and processes that can be "predicted" from information available strictly before each time t, formalized by the predictable rectangles (s,t] \times A with A \in \mathcal{F}_s, or in discrete time by sets of the form (n-1, n] \times A where A \in \mathcal{F}_{n-1}.[1] Predictable processes encompass a broad class of stochastic phenomena, including all continuous adapted processes, which are predictable by left-continuity at every point, and left-continuous adapted processes themselves, such as indicator processes of the form X_t = 1_{\{t > \tau\}} where \tau is a stopping time.[2][3]

Predictable processes are distinguished from optional processes, which are measurable with respect to the optional σ-algebra generated by right-continuous adapted processes; every predictable process is both progressive and optional, ensuring the measurability and adaptation properties essential for pathwise analysis.[1] For increasing processes, predictability is equivalent to being "natural," meaning the process has no jumps at unpredictable times, a property that characterizes the compensator in the Doob-Meyer decomposition of submartingales.[1] Examples include the simple predictable processes used in the initial construction of stochastic integrals, namely step functions constant on intervals (s,t] with values measurable at time s, which extend to more general predictable integrands via limits.[1]

The concept is fundamental to stochastic integration theory, where the integrand must be predictable to define the Itô or Stratonovich integral with respect to a semimartingale, ensuring the resulting process is a local martingale or semimartingale with controlled variation.[1] This predictability requirement rules out anticipating future information, preserving the martingale property and enabling applications in mathematical finance, such as pricing derivatives under risk-neutral measures, and in filtering theory for partially observed systems.[2] In continuous-time models driven by Brownian motion, predictable processes underpin the martingale representation theorem, which expresses adapted square-integrable martingales as stochastic integrals against the Brownian motion.[3]
Preliminaries
Filtered probability spaces
A filtered probability space is a quadruple (\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P), where (\Omega, \mathcal{F}, P) is a complete probability space and (\mathcal{F}_t)_{t \geq 0} is a filtration, an increasing family of sub-\sigma-algebras of \mathcal{F} satisfying the usual conditions: completeness, meaning all P-null sets belong to \mathcal{F}_0, and right-continuity, meaning \mathcal{F}_t = \bigcap_{u > t} \mathcal{F}_u for each t \geq 0. These conditions ensure that the filtration gives a stable and exhaustive representation of information as time progresses, without pathological discontinuities.

Filtrations play a central role in probability theory by modeling the gradual accumulation and revelation of information over time, with \mathcal{F}_t the \sigma-algebra generated by all events observable up to time t.[4] This structure allows the analysis of how uncertainty resolves dynamically, providing the foundational setup for studying time-dependent stochastic phenomena. In financial modeling or physical processes, for instance, the filtration reflects knowledge accumulating from observations up to the current instant.

In the discrete-time analogue, a filtered probability space carries a filtration (\mathcal{F}_n)_{n \in \mathbb{N}}, an increasing sequence of sub-\sigma-algebras of \mathcal{F} with all P-null sets in \mathcal{F}_0; right-continuity is automatic for a discrete time index. This setup mirrors the continuous case but is tailored to processes indexed by the natural numbers, as in sequential decision-making or time-series analysis.
Adapted stochastic processes
A stochastic process X = (X_t)_{t \geq 0} defined on a filtered probability space (\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P) is said to be adapted to the filtration (\mathcal{F}_t)_{t \geq 0} if, for every t \geq 0, the random variable X_t: \Omega \to \mathbb{R} is \mathcal{F}_t-measurable.[2] This measurability condition ensures that the value of the process at time t depends solely on the information accumulated up to and including time t, as encoded by the \sigma-algebra \mathcal{F}_t. The filtration itself is an increasing family of \sigma-algebras modeling the evolution of available information over time.[5]

In the discrete-time setting, a stochastic process X = (X_n)_{n \in \mathbb{N}_0} is adapted to a discrete filtration (\mathcal{F}_n)_{n \in \mathbb{N}_0} if X_n is \mathcal{F}_n-measurable for each n \geq 0.[6] This parallels the continuous case, with the index set restricted to the non-negative integers and the filtration stepping forward at each integer time.

Adapted processes form the foundational class for modeling systems in which outcomes at any time are consistent with the prevailing information structure, as in financial modeling or risk assessment, where X_t might represent an asset price known at time t.[5] Mere adaptedness, however, does not ensure that the process value at t can be anticipated from information strictly prior to t: it allows sudden changes, such as jumps occurring exactly at t, that are not foreseeable beforehand. This distinction marks adapted processes as a prerequisite for the more refined classes that impose stronger foreseeability conditions.
Definition
Discrete-time processes
In the context of a filtered probability space (\Omega, \mathcal{F}, (\mathcal{F}_n)_{n \geq 0}, P), where (\mathcal{F}_n)_{n \geq 0} is a filtration, a discrete-time stochastic process (X_n)_{n \in \mathbb{N}_0} is predictable if X_0 is \mathcal{F}_0-measurable and X_n is \mathcal{F}_{n-1}-measurable for all n \geq 1. This condition ensures that the value of the process at each step n \geq 1 is fully determined by the information available up to the previous time step n-1.

Equivalently, a predictable process is a sequence of random variables in which each X_n (for n \geq 1) is measurable with respect to \mathcal{F}_{n-1}, so the value at time n is "foreseeable" from the history up to the immediate past. Predictable processes form a strict subclass of adapted processes, which only require X_n to be \mathcal{F}_n-measurable at each n.

The collection of all predictable processes is a vector space closed under pointwise limits. The \mathcal{F}_{n-1}-measurability condition is the discrete analogue of the continuous-time requirement that predictable processes be generated by left-continuous adapted processes: in both cases the value at each time is determined by the history from the left.
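The discrete-time definition can be made concrete on the coin-toss filtration, where \mathcal{F}_n is generated by the first n tosses. The sketch below is illustrative (the two processes and the exhaustive check over toss extensions are assumptions, not from the source): it verifies that a process reading only the previous toss is constant over every extension of the history up to time n-1, which is exactly \mathcal{F}_{n-1}-measurability, while a process reading the current toss is not.

```python
import itertools

# Illustrative sketch: the coin-toss filtration F_n = sigma(xi_1, ..., xi_n).
# A path is a tuple of tosses (xi_1, ..., xi_n) with values +-1.

def X(n, path):
    # Predictable candidate: looks only at toss n-1, known at time n-1.
    return 1 if n >= 2 and path[n - 2] == 1 else 0

def Y(n, path):
    # Adapted but not predictable: looks at toss n, revealed only at time n.
    return 1 if path[n - 1] == 1 else 0

n = 3
# F_{n-1}-measurability check: X_n must take a single value on every
# extension of the first n-1 tosses by a final toss.
for prefix in itertools.product([-1, 1], repeat=n - 1):
    vals_X = {X(n, prefix + (e,)) for e in (-1, 1)}
    assert len(vals_X) == 1   # value fixed by the history up to n-1

# Y_n genuinely depends on the n-th toss, so it fails the same check.
vals_Y = {Y(n, (1, 1) + (e,)) for e in (-1, 1)}
assert len(vals_Y) == 2
```

The check over all extensions is a finite stand-in for conditioning on \mathcal{F}_{n-1}: a random variable is \mathcal{F}_{n-1}-measurable precisely when it is constant on each atom of the first n-1 tosses.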
Continuous-time processes
In continuous-time stochastic processes, predictability extends the discrete-time concept to the uncountable time index [0, \infty), addressing the challenges of dense time through measurability with respect to a specialized \sigma-algebra rather than the step-by-step verification available in discrete settings.[7] This framework ensures that the process's values are "announced" by the filtration just prior to each time, which matters in stochastic integration, where foresight into infinitesimal future increments must be precluded.

A stochastic process X = (X_t)_{t \geq 0} defined on a filtered probability space (\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P) is predictable if the map (\omega, t) \mapsto X_t(\omega) from \Omega \times [0, \infty) to \mathbb{R} is measurable with respect to the predictable \sigma-algebra \mathcal{P}.[7] The \sigma-algebra \mathcal{P} is generated by the stochastic rectangles of the form (s, t] \times A, where 0 \leq s < t < \infty and A \in \mathcal{F}_s.[7] These generators capture sets whose membership is decided by information available strictly before time t, aligning predictability with the filtration's progressive structure.

Equivalently, \mathcal{P} is the smallest \sigma-algebra with respect to which every left-continuous adapted process is measurable; in particular, any pointwise limit of left-continuous adapted processes is predictable.[8] Left-continuity means the value at each time t is determined by the limit from the left, and this measurability is preserved under pointwise limits.
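One way to see why continuous adapted processes are predictable is to approximate them by left-continuous step processes frozen at the left endpoints of a time grid, which are simple predictable processes; refining the grid drives the uniform error to zero. The simulation below is a hedged sketch under assumed choices (a sampled Brownian path, dyadic grids, a fixed seed), not part of the source.

```python
import math
import random

random.seed(1)

# Illustrative sketch: sample a Brownian path on a fine grid, then
# approximate it by the left-continuous step processes
#   H_t = B_{t_i}  for t in (t_i, t_{i+1}]
# on coarser grids -- these step processes are simple predictable.

T, fine = 1.0, 2 ** 12
dt = T / fine
B = [0.0]
for _ in range(fine):
    B.append(B[-1] + random.gauss(0.0, math.sqrt(dt)))

def sup_error(coarse):
    """Uniform gap between the path and its left-endpoint step version."""
    step = fine // coarse
    err = 0.0
    for i in range(fine):
        left = (i // step) * step       # value frozen at the left endpoint
        err = max(err, abs(B[i + 1] - B[left]))
    return err

errors = [sup_error(2 ** k) for k in (2, 4, 6, 8)]
# Continuity of the sampled path makes the error shrink as the grid refines.
assert errors[-1] < errors[0]
```

The left-endpoint rule is the key design point: each step process uses only information available at the start of its interval, so the approximating sequence stays inside the predictable class.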
Properties
Predictable sigma-algebra
The predictable sigma-algebra \mathcal{P}, defined on the product space \Omega \times [0,\infty), is the smallest \sigma-algebra with respect to which all adapted left-continuous stochastic processes are measurable. Equivalently, \mathcal{P} is generated by the sets (s,t] \times A with A \in \mathcal{F}_s, together with the sets \{0\} \times A with A \in \mathcal{F}_0. It can also be generated by the stochastic intervals \llbracket 0, T \rrbracket = \{(\omega, t) : 0 \leq t \leq T(\omega)\}, where T ranges over stopping times with respect to the filtration (\mathcal{F}_t)_{t \geq 0}. This construction ensures that sets in \mathcal{P} capture events whose occurrence can be foreseen from information available strictly before each time t.

A key property of \mathcal{P} is its inclusion in the progressive \sigma-algebra \mathcal{M}, generated by all progressively measurable processes; in general \mathcal{P} \subsetneq \mathcal{M}, reflecting that predictable sets carry a left-limit structure in their measurability. Predictable sets are "announced": for example, the stochastic interval of a predictable stopping time T is approximated from within by the intervals of an announcing sequence of stopping times T_n \uparrow T with T_n < T on \{T > 0\}. This anticipatory nature distinguishes \mathcal{P} from broader classes in which measurability does not guarantee prior foreseeability.

In the discrete-time setting, with time indexed by \mathbb{N} = \{0, 1, 2, \dots\}, the predictable \sigma-algebra reduces to the \sigma-algebra generated by the sets A \times \{n\} with A \in \mathcal{F}_{n-1} for n \geq 1, together with A \times \{0\} for A \in \mathcal{F}_0, so predictability coincides with measurability with respect to the immediately preceding stage of the filtration. This discrete analogue simplifies the continuous-time generators: left-continuity is vacuous in finite steps, and stochastic intervals collapse to announcements at integer times.
Relations to other process classes
Predictable processes constitute a strict subclass of optional processes. An optional process is measurable with respect to the optional \sigma-algebra \mathcal{O}, generated by the càdlàg (right-continuous with left limits) adapted processes.[9] The predictable \sigma-algebra \mathcal{P}, by contrast, is generated by the left-continuous adapted processes, so every predictable process is optional, while the reverse inclusion fails in general: there exist optional processes that are not predictable.[9] Optional processes may lack the "foreseeable" quality of predictable ones, namely the accessibility of their values from information just prior to each time t.[9]

All predictable processes are also progressive. A progressive process is measurable with respect to the progressive \sigma-algebra \mathcal{M}, which properly contains \mathcal{P} and consists of the adapted processes whose restriction to [0,t] \times \Omega is \mathcal{B}([0,t]) \otimes \mathcal{F}_t-measurable for each t \geq 0.[9] The inclusion \mathcal{P} \subset \mathcal{M} guarantees that predictable processes inherit the joint measurability of progressive processes, facilitating their use with time-dependent filtrations.[9]

In stochastic calculus, càdlàg adapted processes are optional (hence progressive), but predictability is typically required of integrands in stochastic integrals with respect to semimartingales, ensuring that the integral is well defined and is again a semimartingale.[10] This requirement exploits the left-continuous nature of the predictable generators to avoid issues with jumps at unpredictable times.[10]
Examples
Predictable examples
Constant processes serve as fundamental examples of predictable processes. A constant stochastic process X_t = c for all t \geq 0, where c is an \mathcal{F}_0-measurable random variable, is adapted to the filtration and left-continuous in time, and therefore measurable with respect to the predictable \sigma-algebra. Such processes are predictable because their value at any time t is fully determined by the information available at time 0, satisfying the foreseeability condition inherent in the definition of predictability.

Left-continuous adapted processes provide another broad class of predictable processes. For instance, the integrated Brownian motion X_t = \int_0^t B_s \, ds, where B = (B_t)_{t \geq 0} is a standard Brownian motion with respect to its natural filtration, is adapted and has continuous paths almost surely, hence is left-continuous. This continuity ensures that X is measurable with respect to the predictable \sigma-algebra generated by left-continuous adapted processes.

In discrete time, predictable processes are those whose value at each step is measurable with respect to the preceding \sigma-algebra. A representative example is the indicator process X_n = 1_{A_n} for n \geq 1, where A_n \in \mathcal{F}_{n-1}; each X_n is then \mathcal{F}_{n-1}-measurable and thus predictable, its value known from the information up to time n-1.

Compensators of point processes illustrate predictable processes in the context of counting phenomena. For a counting process N = (N_t)_{t \geq 0}, the compensator A = (A_t)_{t \geq 0} is the unique increasing predictable process such that N - A is a martingale with respect to the filtration. This compensator coincides with the dual predictable projection of N, capturing the foreseeable component of the jumps of N.
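The compensator example can be checked numerically in the simplest case: for a homogeneous Poisson process with rate \lambda, the compensator is A_t = \lambda t, so the compensated process N_t - \lambda t has mean zero. The Monte Carlo sketch below is illustrative (rate, horizon, trial count, and seed are assumed choices, not from the source).

```python
import random

random.seed(2)

lam, t, trials = 2.0, 3.0, 200_000

def poisson_count(rate, horizon):
    """Count exponential inter-arrival times falling before the horizon."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(rate)
        if s > horizon:
            return n
        n += 1

# Sample mean of N_t - lam * t, which should vanish since A_t = lam * t
# is the predictable compensator of the Poisson process.
mean_compensated = sum(poisson_count(lam, t) - lam * t
                       for _ in range(trials)) / trials
assert abs(mean_compensated) < 0.05   # zero up to Monte Carlo error
```

Here \lambda t is deterministic and continuous, hence trivially predictable; the martingale property of N - A is what the near-zero sample mean reflects.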
Non-predictable counterexamples
In discrete time, a canonical example of an adapted process that fails to be predictable is the indicator X_n = 1_{\{Z_n > 0\}}, where (Z_n) is a sequence of random variables such that Z_n is \mathcal{F}_n-measurable but not \mathcal{F}_{n-1}-measurable for each n \geq 1. This process is adapted to the filtration (\mathcal{F}_n)_{n \geq 0}, since each X_n is \mathcal{F}_n-measurable and so reflects information available up to time n. It is not predictable, however, because predictability in discrete time requires X_n to be \mathcal{F}_{n-1}-measurable: here the indicator captures an event revealed precisely at time n, unannounced beforehand.

In continuous time, the indicator process X_t = 1_{\{N_t \geq 1\}}, where N = (N_t)_{t \geq 0} is a standard Poisson process with rate \lambda > 0 with respect to its natural filtration, is adapted but not predictable. The process X is adapted, as X_t depends only on the history of N up to time t, and is right-continuous with left limits (càdlàg). It is not predictable because the first jump time \tau = \inf\{t \geq 0 : N_t \geq 1\} is a totally inaccessible stopping time, meaning no sequence of stopping times can announce it with positive probability, so X violates the left-continuity and measurability structure of the predictable sigma-algebra.[11]

A classic illustration of an optional process that is not predictable likewise arises from totally inaccessible stopping times: take X_t = 1_{[\tau, \infty)}(t), where \tau is the first jump time of a Poisson process as above. This X is optional, being right-continuous and adapted, consistent with the optional sigma-algebra generated by such processes. It fails predictability because the jump at \tau cannot be foreseen from left limits or prior information, distinguishing the optional class from the stricter predictable one.[12]
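A small simulation can illustrate two facts used above: the indicator 1_{\{t \geq \tau\}} has left limit 0 at the jump while its value there is 1, and the exponentially distributed first jump time almost surely avoids any fixed grid of deterministic times. This is an illustrative sketch (rate, grid, and seed are assumed choices), not a proof of total inaccessibility.

```python
import random

random.seed(3)

lam, trials = 1.0, 100_000
grid = [k * 0.1 for k in range(1, 50)]   # fixed deterministic times

exact_hits = 0
for _ in range(trials):
    tau = random.expovariate(lam)        # first jump time of the Poisson process

    def X(t):
        return 1 if t >= tau else 0      # the indicator 1_{t >= tau}

    # Unit jump exactly at tau: just before the jump the indicator is 0.
    assert X(tau) == 1 and X(tau - 1e-9) == 0

    if any(abs(tau - g) < 1e-12 for g in grid):
        exact_hits += 1

# tau almost surely misses every fixed time, so no deterministic time
# (a trivially predictable choice) catches the jump.
assert exact_hits == 0
```

Deterministic times are only the simplest predictable stopping times, so the experiment illustrates rather than proves inaccessibility; the full statement requires that no announcing sequence of stopping times reaches \tau.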
Applications
Stochastic integration
In stochastic calculus, the Itô integral is defined for predictable integrands to guarantee the well-definedness of the stochastic integral \int H \, dM with respect to a semimartingale M.[13] The integral is constructed first for simple predictable processes, such as processes constant on stochastic intervals (S, T] with value \mathcal{F}_S-measurable, where S \leq T are stopping times, and then extended by limits to the broader class of predictable processes, preserving measurability and adaptedness with respect to the predictable \sigma-algebra.[14]

Predictability of the integrand H prevents "looking ahead": the value at each time t depends only on information available strictly before t, ruling out anticipative strategies that could exploit future increments of the integrator.[15] This aligns the stochastic integral with non-anticipative decision-making, preserving the martingale properties and independence assumptions inherent in processes like Brownian motion.[13]

A key application arises in the Black-Scholes model for option pricing, where trading strategies, representing the number of shares held in the replicating portfolio, must be predictable processes so that portfolios adjust based solely on past and present market information.[16] This ensures the self-financing condition and the absence of arbitrage, allowing the model's partial differential equation to deliver fair option prices under the risk-neutral measure.[17]
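The non-anticipation requirement can be seen in a Riemann-sum experiment, an illustrative sketch not drawn from the source (step count, trials, and seed are assumed choices): evaluating the integrand at the left endpoint, the predictable choice, makes the approximate integral \int_0^T B \, dB mean-zero, while peeking at the right endpoint shifts the mean by the accumulated quadratic variation, approximately T.

```python
import math
import random

random.seed(4)

# Compare Riemann sums for the integral of B against dB using the
# predictable left-endpoint integrand B_{t_i} versus the anticipating
# right-endpoint integrand B_{t_{i+1}}.

T, n, trials = 1.0, 200, 20_000
dt = T / n
sum_left = sum_right = 0.0
for _ in range(trials):
    B = [0.0]
    for _ in range(n):
        B.append(B[-1] + random.gauss(0.0, math.sqrt(dt)))
    dB = [B[i + 1] - B[i] for i in range(n)]
    sum_left += sum(B[i] * dB[i] for i in range(n))        # Ito (predictable)
    sum_right += sum(B[i + 1] * dB[i] for i in range(n))   # looks ahead

mean_left = sum_left / trials
mean_right = sum_right / trials
# Predictable choice: a martingale transform, so the mean is ~0.
# Anticipating choice: adds the quadratic variation, so the mean is ~T.
assert abs(mean_left) < 0.05
assert abs(mean_right - T) < 0.05
```

The gap between the two sums is exactly \sum (\Delta B)^2, whose expectation T is the quadratic variation a look-ahead strategy would pocket; this is the discrete shadow of the arbitrage that predictability excludes.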
Martingale representations
The Doob-Meyer decomposition theorem asserts that any càdlàg submartingale of class (D) admits a unique decomposition into a martingale plus an increasing predictable process, termed the compensator. This separation isolates the martingale component, which captures the unpredictable fluctuations, from the predictable increasing component, which accounts for the foreseeable trend. The predictability of the compensator is essential: measurability with respect to the predictable σ-algebra is what makes the decomposition unique and usable in stochastic analysis.

In the theory of point processes, predictable compensators arise naturally through the intensity process. For a point process N, the compensator A is the unique predictable increasing process such that M_t = N_t - A_t is a martingale, often expressed as A_t = \int_0^t \lambda_s \, ds, where \lambda denotes the predictable intensity. This construction guarantees that the compensated process M has zero-mean increments, reflecting the martingale property, while the predictability of \lambda ties the likelihood of jumps to past information. Such representations are fundamental for modeling counting processes in reliability and queueing theory.

Predictability also refines optional sampling. A predictable stopping time \tau is one announced by a sequence of stopping times \tau_n \uparrow \tau with \tau_n < \tau on \{\tau > 0\}; sampling a martingale at a bounded predictable stopping time preserves expectations, since Doob's optional sampling theorem can be applied along the announcing sequence and passed to the limit, without bias from unforeseen jumps.
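The discrete-time analogue of the Doob-Meyer decomposition can be verified exactly: for a simple random walk S_n, the submartingale S_n^2 decomposes as the martingale M_n = S_n^2 - n plus the deterministic, hence predictable, compensator A_n = n. The sketch below enumerates all histories up to length 6 (an illustrative choice) and checks the martingale property of M by direct conditional averaging.

```python
import itertools

N = 6
checked = 0
for n in range(N):
    # Every history of the first n steps determines S_n = sum of the steps.
    for prefix in itertools.product([-1, 1], repeat=n):
        s = sum(prefix)
        m_now = s * s - n                        # M_n = S_n^2 - A_n with A_n = n
        # Conditional expectation of M_{n+1} given the history: average
        # over the two equally likely next steps +-1.
        m_next = sum((s + e) ** 2 - (n + 1) for e in (-1, 1)) / 2
        assert m_next == m_now                   # martingale property holds
        checked += 1

# Algebraically: ((s+1)^2 + (s-1)^2)/2 - (n+1) = s^2 - n, as asserted.
```

The compensator A_n = n is known at time n-1 (indeed at time 0), which is the discrete form of the predictability required for uniqueness of the decomposition.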