
Stochastic calculus

Stochastic calculus is a branch of mathematics that extends classical calculus to handle stochastic processes, particularly through the development of integrals and differential equations involving random phenomena such as Brownian motion. It provides essential tools for modeling and analyzing systems influenced by uncertainty, with foundational concepts including the Itô integral and stochastic differential equations (SDEs).

The origins of stochastic calculus trace back to the late 19th and early 20th centuries, beginning with Thorvald Nicolai Thiele's 1880 modeling of Brownian motion for time series analysis. Louis Bachelier's 1900 thesis applied Brownian motion to stock price fluctuations, introducing the idea of independent, normally distributed increments in financial markets. Albert Einstein's 1905 physical interpretation of Brownian motion further solidified its theoretical basis, while Norbert Wiener's 1923 rigorous construction using measure theory formalized the Wiener process. Andrey Kolmogorov's 1931 work on Markov processes connected diffusions to partial differential equations, laying groundwork for later developments. The pivotal advancement came in 1944 with Kiyosi Itô's introduction of stochastic integration, followed by his 1951 formulation of Itô's lemma, which governs the differentials of functions of stochastic processes and is central to solving SDEs. By the 1960s and 1970s, contributions from Paul-André Meyer and others, including the Doob-Meyer decomposition in 1962 and the concept of semimartingales around 1970, broadened the theory beyond Markov processes, establishing stochastic calculus as a robust framework.

At its core, stochastic calculus revolves around stochastic processes, which are collections of random variables evolving over time, defined on a probability space (\Omega, \mathcal{F}, P) with a filtration \{\mathcal{F}_t\} representing accumulating information. The Wiener process, or Brownian motion, serves as the canonical example: a continuous-time process with independent, normally distributed increments W_t - W_s \sim \mathcal{N}(0, t-s) for t > s, exhibiting quadratic variation [W, W]_t = t. The Itô integral, \int_0^t h_s \, dW_s, extends integration to non-deterministic integrands h adapted to the filtration, defined as an L^2 limit for simple processes and possessing properties like zero mean (\mathbb{E}[\int h \, dW] = 0) and the Itô isometry (\mathbb{E}[(\int h \, dW)^2] = \int \mathbb{E}[h^2] \, ds). This integral underpins stochastic differential equations of the form dX_t = b(t, X_t) \, dt + \sigma(t, X_t) \, dW_t, where b is the drift and \sigma the diffusion coefficient, solved using Itô's lemma, a chain rule analogue that accounts for the quadratic variation of the Wiener process: for f(t, X_t), df = \frac{\partial f}{\partial t} dt + \frac{\partial f}{\partial x} dX + \frac{1}{2} \frac{\partial^2 f}{\partial x^2} (dX)^2, with (dX)^2 = \sigma^2 dt.

Applications of stochastic calculus span multiple fields, most notably mathematical finance, where it derives the Black-Scholes equation for option pricing under the geometric Brownian motion model dS_t = \mu S_t \, dt + \sigma S_t \, dW_t, enabling risk-neutral valuation and hedging strategies. In physics and engineering, it models particle diffusion, noise in signal processing, and optimal control problems, such as minimizing costs in SDEs via the Hamilton-Jacobi-Bellman equation.
Filtering theory, including the Kalman filter extension to nonlinear cases, uses stochastic calculus for state estimation in noisy environments, as in tracking or navigation systems. Additionally, it connects to partial differential equations through Feynman-Kac representations, linking SDEs to solutions of parabolic PDEs like the heat equation. These tools have transformed quantitative modeling, emphasizing martingales—processes with constant conditional expectation—for fair pricing and no-arbitrage principles.

Introduction

Overview

Stochastic calculus is the branch of mathematics that extends the methods of calculus to stochastic processes, particularly emphasizing integration and differentiation in environments characterized by randomness. It provides tools for analyzing systems where outcomes are probabilistic rather than deterministic, enabling the rigorous treatment of uncertainty in continuous time. The core components of stochastic calculus include stochastic integrals, which define integration with respect to random processes; stochastic differential equations (SDEs), which model dynamics driven by noise; and Itô's lemma, a fundamental theorem analogous to the chain rule but adapted for stochastic settings. These elements allow for the manipulation of expressions involving randomness, such as computing expectations or solving equations under uncertainty. In contrast to deterministic calculus, which assumes smooth and differentiable paths, stochastic calculus addresses the irregularities of random paths, such as those with infinite variation but finite quadratic variation, exemplified by Brownian motion. This distinction necessitates new definitions and rules to handle the non-differentiability inherent in noise-driven evolutions. Stochastic calculus is essential for modeling phenomena with intrinsic randomness, including financial markets where asset prices fluctuate unpredictably and physical processes like particle diffusion. A representative example is the basic SDE dX_t = \mu \, dt + \sigma \, dW_t, where \mu \, dt captures the deterministic drift, \sigma \, dW_t the random volatility term, and W_t the Wiener process representing the noise source.
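As a concrete illustration of such a model, the following sketch simulates a path of the basic SDE dX_t = \mu \, dt + \sigma \, dW_t by summing independent Gaussian increments; the parameter values, time horizon, and step count are illustrative choices, not part of the definition.

```python
import numpy as np

# Minimal simulation of the basic SDE dX_t = mu dt + sigma dW_t
# (Brownian motion with drift); mu, sigma, T, n are illustrative choices.
rng = np.random.default_rng(0)
mu, sigma, T, n = 0.1, 0.3, 1.0, 1000
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)   # increments W_{t+dt} - W_t ~ N(0, dt)
X = np.concatenate(([0.0], np.cumsum(mu * dt + sigma * dW)))  # path with X_0 = 0

# For constant coefficients the discretization is exact in distribution:
# X_T is normal with mean mu*T and variance sigma^2*T.
print(X[-1])
```

Because the coefficients are constant, the scheme above introduces no discretization bias; for state-dependent coefficients the same recursion becomes the Euler-Maruyama approximation discussed later.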

Historical development

The foundations of stochastic calculus emerged in the early 20th century, building on probabilistic models of random phenomena. In 1900, Louis Bachelier presented his doctoral thesis Théorie de la spéculation, which modeled stock price fluctuations as an arithmetic Brownian motion, introducing the concept of a continuous-time random walk to financial mathematics for the first time. This work laid an early groundwork for applying stochastic processes to economic systems, though it initially received limited attention. Five years later, Albert Einstein's seminal paper on the Brownian motion of suspended particles provided a physical interpretation through the lens of molecular diffusion, rigorously deriving the mean squared displacement proportional to time and connecting microscopic chaos to macroscopic randomness. The 1920s and 1930s saw mathematical formalization that enabled a rigorous framework for stochastic analysis. Norbert Wiener constructed the Wiener process in 1923, defining Brownian motion as a continuous but nowhere differentiable path in a probabilistic space, which became the canonical model for random fluctuations. This was complemented by Andrey Kolmogorov's 1933 axiomatization of probability theory, which provided the measure-theoretic foundations necessary for handling infinite-dimensional path spaces and ensuring the consistency of stochastic integrals. During World War II, practical needs in electronics and signal processing, such as modeling thermal noise in circuits and radar interference, accelerated research into stochastic processes, influencing the development of tools for noisy dynamical systems. A pivotal breakthrough occurred in 1944 when Kiyosi Itô invented the Itô stochastic integral, motivated by his efforts to solve stochastic differential equations describing physical systems perturbed by Brownian noise, such as turbulence and electronic fluctuations. Itô expanded on this in his 1951 memoir On Stochastic Differential Equations, establishing the calculus for non-anticipating integrands with respect to Brownian motion. In the 1950s, Joseph Doob's Stochastic Processes formalized martingale theory, offering a probabilistic structure essential for convergence and optional sampling in stochastic settings. The 1960s introduced Ruslan Stratonovich's integral, developed around 1961 for applications in physics where ordinary chain rules apply, facilitating modeling in quantum mechanics and control theory. From the 1970s onward, stochastic calculus exploded in applications, particularly in finance and beyond. The 1973 Black-Scholes model for option pricing relied on Itô calculus to derive a partial differential equation for asset prices under geometric Brownian motion, revolutionizing quantitative finance. Later extensions addressed limitations with rougher paths; in the 1990s, Terry Lyons developed rough path theory to generalize stochastic integrals to signals with finite p-variation where p > 2, enabling solutions to equations driven by paths beyond semimartingales. These advancements continue to underpin modern probability and its interdisciplinary uses.

Prerequisites

Stochastic processes

A stochastic process is formally defined as a family of random variables \{X_t : t \in T\}, where T is an index set typically representing time, and each X_t is defined on a common probability space (\Omega, \mathcal{F}, P). This collection describes the evolution of a random phenomenon over time, with realizations known as sample paths or trajectories. In the context of stochastic calculus, continuous-time processes, where T = [0, \infty) or a similar interval, are of primary interest, as they model phenomena like asset prices or physical systems with smooth temporal progression. Stochastic processes are classified based on key properties, such as the Markov property, which states that the future state depends only on the current state and not on the history, leading to Markov processes. Another important class is Lévy processes, characterized by stationary and independent increments, starting at zero almost surely, and having right-continuous paths with left limits (càdlàg). Stationarity refers to the invariance of the process's statistical properties over time shifts; strict stationarity requires the joint distribution of any finite collection of variables to remain unchanged under time translation, while weak (or second-order) stationarity assumes constant mean and autocovariance depending only on the time lag, provided second moments exist. Independent increments mean that the differences X_t - X_s for disjoint intervals (s, t] are independent random variables, a property central to Lévy processes. Examples illustrate these concepts: the Poisson process, a counting process with independent increments and jumps at random times, models events like arrivals in a queue, where the number of events in an interval follows a Poisson distribution with mean proportional to the interval length. Gaussian processes, where every finite-dimensional distribution is multivariate normal, include many examples with continuous paths (under mild regularity of the covariance function), serving as a bridge to processes without jumps. Path properties are crucial for analysis; continuity implies no jumps, while càdlàg paths (right-continuous with left limits) accommodate jumps common in financial modeling, ensuring well-defined limits for stochastic integrals. A fundamental result connecting discrete to continuous processes is Donsker's invariance principle, a functional central limit theorem showing that suitably scaled and centered random walks converge in distribution to Brownian motion, justifying the use of continuous paths for large-scale random walks. Brownian motion stands out as a key example of a continuous-time stochastic process with these properties, underpinning much of stochastic calculus.
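To make the Poisson process example concrete, the sketch below simulates its counting paths from independent exponential interarrival times and checks that the count over [0, T] has mean and variance close to \lambda T; the rate, horizon, and sample size are illustrative.

```python
import numpy as np

# Sketch: simulate a Poisson process of rate lam on [0, T] via i.i.d.
# exponential interarrival times; lam, T, and n_paths are illustrative.
rng = np.random.default_rng(1)
lam, T, n_paths = 2.0, 5.0, 10000

counts = np.empty(n_paths)
for i in range(n_paths):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)   # waiting time to the next arrival
        if t > T:
            break
        n += 1
    counts[i] = n

# N_T should be Poisson(lam*T): mean and variance both close to lam*T = 10.
print(counts.mean(), counts.var())
```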

Brownian motion

Brownian motion, also known as the Wiener process, originates from the empirical observation of erratic particle movement in fluids, first systematically documented by the Scottish botanist Robert Brown in 1827 while examining pollen grains under a microscope. This phenomenon was later mathematically formalized by Norbert Wiener in 1923, who provided the first rigorous construction of the process as a continuous-time stochastic model. The standard Brownian motion W = (W_t)_{t \geq 0} is defined on a probability space as a stochastic process starting at W_0 = 0 almost surely, with continuous sample paths with probability 1, independent increments, and normally distributed increments such that for 0 \leq s < t, W_t - W_s \sim \mathcal{N}(0, t - s). The covariance function of the process is given by \mathbb{E}[W_s W_t] = \min(s, t) for s, t \geq 0, which encapsulates its Gaussian nature and the stationary variance of increments. Key path properties distinguish Brownian motion: almost every sample path is continuous but nowhere differentiable, reflecting its infinite first variation despite finite quadratic variation [W]_t = t. Additionally, the process exhibits scaling invariance, where for any c > 0, the rescaled process satisfies W_{ct} \stackrel{d}{=} \sqrt{c} \, W_t in distribution. One standard construction of Brownian motion proceeds as the scaling limit of symmetric random walks on the integers: with spatial steps of size 1/\sqrt{N} and time steps of size 1/N, the position after n steps approximates W_t for t = n/N as N tends to infinity. Path regularity, including continuity, follows from the Kolmogorov-Chentsov theorem applied to the moments of increments, ensuring the existence of a continuous modification. Brownian motion is unique in law: any two processes satisfying the defining properties share the same finite-dimensional distributions and hence the same law on path space. The canonical version lives on the space of continuous functions C[0, \infty), while càdlàg versions may be considered on the Skorokhod space D[0, \infty) for broader applications. Brownian motion serves as the foundational driving noise for stochastic integrals and underlies diffusion processes in probabilistic modeling.
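The contrast between infinite first variation and finite quadratic variation can be checked numerically. The sketch below samples Brownian increments on successively finer grids of [0, 1]: the sum of squared increments stabilizes near [W]_1 = 1, while the sum of absolute increments grows without bound (the grid sizes are illustrative).

```python
import numpy as np

# Sketch: for a Brownian path on [0, 1], the sum of squared increments
# approximates the quadratic variation [W]_1 = 1 as the mesh shrinks, while
# the sum of absolute increments (first variation) diverges like sqrt(2n/pi).
rng = np.random.default_rng(2)
for n in (10**3, 10**4, 10**5):
    dW = rng.normal(0.0, np.sqrt(1.0 / n), size=n)   # increments on a grid of mesh 1/n
    print(n, np.sum(dW**2), np.sum(np.abs(dW)))
```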

Martingales and filtrations

In stochastic processes, a filtration \{\mathcal{F}_t\}_{t \geq 0} is defined as an increasing family of \sigma-algebras on a probability space (\Omega, \mathcal{F}, P), where \mathcal{F}_s \subseteq \mathcal{F}_t for all 0 \leq s \leq t, representing the accumulation of information available up to time t. This structure models the progressive revelation of events in a random system, with \mathcal{F}_0 containing the initial information and \mathcal{F}_t incorporating all observable events by time t. A stochastic process X = (X_t)_{t \geq 0} is said to be adapted to the filtration if, for each t, the random variable X_t is \mathcal{F}_t-measurable, meaning its value at time t depends only on the information up to that time. Central to the theory are martingales, which formalize the notion of a "fair game" in probabilistic terms. A stochastic process M = (M_t)_{t \geq 0} is a martingale with respect to a filtration \{\mathcal{F}_t\}_{t \geq 0} if it is adapted to the filtration, \mathbb{E}[|M_t|] < \infty for all t, and satisfies the conditional expectation property \mathbb{E}[M_t \mid \mathcal{F}_s] = M_s almost surely for all 0 \leq s \leq t. This property implies that the expected future value of the process, given the current information, equals its current value, capturing predictability in an average sense. The optional sampling theorem extends this by stating that, under suitable conditions such as bounded stopping times \tau, the stopped process M_{\tau \wedge t} remains a martingale, allowing evaluation at random times without altering the fairness property. Martingales possess several key properties that underpin their utility in analysis. Doob's maximal inequalities provide bounds on the supremum of the process: for a nonnegative submartingale X and p > 1, \mathbb{E}[\sup_{0 \leq s \leq t} X_s^p]^{1/p} \leq \frac{p}{p-1} \mathbb{E}[X_t^p]^{1/p}, controlling the likelihood of large deviations. The martingale convergence theorem states that if (M_t) is a martingale bounded in L^1 (i.e., \sup_t \mathbb{E}[|M_t|] < \infty), then M_t converges almost surely to some M_\infty \in L^1 as t \to \infty. For convergence in L^1, uniform integrability is required: a family \{|M_t| : t \geq 0\} is uniformly integrable if \sup_t \mathbb{E}[|M_t| \mathbf{1}_{\{|M_t| > K\}}] \to 0 as K \to \infty, ensuring the limit preserves the L^1 norm. Submartingales and supermartingales generalize martingales through inequalities in the conditional expectation. A process X is a submartingale if \mathbb{E}[X_t \mid \mathcal{F}_s] \geq X_s a.s. for s \leq t, modeling processes with a nonnegative drift, while a supermartingale satisfies \mathbb{E}[X_t \mid \mathcal{F}_s] \leq X_s a.s., indicating a nonpositive drift. These definitions extend the convergence and inequality results from martingales; for instance, a submartingale bounded in L^1 converges almost surely, and uniform integrability again upgrades this to convergence in L^1. Classic examples illustrate these concepts. Standard Brownian motion (B_t)_{t \geq 0}, adapted to its natural filtration, is a martingale because \mathbb{E}[B_t \mid \mathcal{F}_s] = B_s for s \leq t, reflecting its zero-drift property. Similarly, the compensated Poisson process M_t = N_t - \lambda t, where N is a Poisson process with rate \lambda and natural filtration, forms a martingale since the compensator \lambda t subtracts the expected increments, yielding \mathbb{E}[M_t \mid \mathcal{F}_s] = M_s.
In stochastic calculus, filtrations and martingales provide the foundational framework for constructing integrals and ensuring their well-definedness, as adapted integrands and martingale properties allow the resulting processes to maintain predictability and avoid pathological behaviors. This structure is essential for preserving key probabilistic features in more advanced developments.

Stochastic Integrals

Itô integral

The Itô integral provides a framework for integrating adapted stochastic processes with respect to Brownian motion, addressing the challenges posed by the irregular paths of the latter. For a standard Brownian motion W on a probability space (\Omega, \mathcal{F}, (\mathcal{F}_t), P) with the natural filtration (\mathcal{F}_t), consider progressively measurable processes \phi = \{\phi_t\}_{t \geq 0} that are square-integrable in the sense that \mathbb{E}\left[\int_0^T \phi_t^2 \, dt\right] < \infty for each T > 0. The Itô integral \int_0^t \phi_s \, dW_s is constructed as the \mathbb{L}^2(P)-limit of approximating sums using left-endpoint evaluations to ensure non-anticipating behavior. The construction begins with simple processes, which are linear combinations of indicator functions of the form \phi_t = \sum_{i=1}^n c_i(\omega) \mathbf{1}_{[t_{i-1}, t_i)}(t), where 0 = t_0 < t_1 < \cdots < t_n = t, the c_i are \mathcal{F}_{t_{i-1}}-measurable, and \phi satisfies the square-integrability condition. For such processes, the Itô integral over [0, t] is defined as the stochastic sum \int_0^t \phi_s \, dW_s = \sum_{i=1}^n c_i (W_{t_i} - W_{t_{i-1}}), evaluated at the left endpoint of each subinterval to preserve the adaptedness and avoid anticipation of future Brownian increments. For indicator functions specifically, such as \phi_s = \mathbf{1}_{(u, v]}(s) with 0 \leq u < v \leq t, the integral simplifies to the increment W_v - W_u almost surely. This left-endpoint choice distinguishes the Itô integral from classical Riemann-Stieltjes sums: because Brownian paths have infinite first variation but nonzero quadratic variation [W, W]_t = t, the limit of the approximating sums depends on where the integrand is evaluated within each subinterval, so the integral cannot be defined pathwise in the Riemann-Stieltjes sense. The definition extends to the full class of square-integrable progressively measurable processes by completing the space of simple processes in the norm \|\phi\|^2 = \mathbb{E}\left[\int_0^t \phi_s^2 \, ds\right]. For a Cauchy sequence \{\phi^{(n)}\} of simple processes converging to \phi in this norm, the corresponding Itô integrals \int_0^t \phi^{(n)}_s \, dW_s form a Cauchy sequence in \mathbb{L}^2(P), converging in \mathbb{L}^2(P) to a limit denoted \int_0^t \phi_s \, dW_s. This extension is possible due to the Itô isometry, \mathbb{E}\left[\left( \int_0^t \phi_s \, dW_s \right)^2 \right] = \mathbb{E}\left[ \int_0^t \phi_s^2 \, ds \right], which holds for simple processes by independence of Brownian increments and extends continuously to the closure, ensuring the space of Itô integrals is complete (a Hilbert space). The Itô integral has zero expectation, \mathbb{E}\left[\int_0^t \phi_s \, dW_s\right] = 0, as each approximating sum has mean zero. Key properties include the martingale property: if \mathbb{E}\left[\int_0^\infty \phi_s^2 \, ds\right] < \infty, then M_t = \int_0^t \phi_s \, dW_s is a square-integrable martingale with respect to (\mathcal{F}_t), as the increments are orthogonal to the past filtration. More generally, it is a local martingale. The map \phi \mapsto \int_0^t \phi_s \, dW_s is continuous from the space of integrands to \mathbb{L}^2(P), and the resulting process has continuous paths almost surely.
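A minimal numerical sketch of this construction approximates \int_0^1 W_s \, dW_s by left-endpoint sums, whose exact value is (W_1^2 - 1)/2, and averages over paths to illustrate the zero-mean property and the Itô isometry \mathbb{E}[(\int_0^1 W \, dW)^2] = \int_0^1 s \, ds = 1/2; the grid size and number of paths are illustrative.

```python
import numpy as np

# Sketch: left-endpoint Riemann sums approximating the Ito integral
# \int_0^1 W_s dW_s, whose exact value is (W_1^2 - 1)/2.  Averaging over
# paths illustrates the zero-mean property and the Ito isometry.
rng = np.random.default_rng(3)
n, n_paths = 2000, 5000
dt = 1.0 / n

vals, errs = [], []
for _ in range(n_paths):
    dW = rng.normal(0.0, np.sqrt(dt), size=n)
    W = np.concatenate(([0.0], np.cumsum(dW)))
    ito = np.sum(W[:-1] * dW)          # left-endpoint evaluation of the integrand
    vals.append(ito)
    errs.append(ito - 0.5 * (W[-1]**2 - 1.0))

print(np.mean(vals))                   # close to 0 (zero-mean property)
print(np.mean(np.square(vals)))        # close to 0.5 (Ito isometry)
print(np.max(np.abs(errs)))            # small discretization error per path
```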
While the construction is specific to Brownian motion, the Itô integral extends to integration with respect to semimartingales via a decomposition into a continuous local martingale part (integrated against like Brownian motion) and a finite-variation part (via Stieltjes integration), though the latter is not derived here.

Stratonovich integral

The Stratonovich integral, introduced by Ruslan Stratonovich in the context of stochastic equations driven by random noise, provides a symmetric interpretation of stochastic integration that contrasts with the forward-looking Itô integral. It is particularly suited for applications where classical calculus rules are desirable, and is defined for progressively measurable integrand processes \phi with respect to a semimartingale integrator X, such as Brownian motion W. The integral \int_0^t \phi_s \circ dX_s is constructed as the limit in probability (or pathwise under suitable conditions) of Riemann-Stieltjes sums using evaluations at midpoints of partition intervals. Specifically, for a partition 0 = t_0 < t_1 < \cdots < t_n = t of [0, t] with mesh size approaching zero, the approximating sums are \sum_{i=0}^{n-1} \phi_{\frac{t_i + t_{i+1}}{2}} (X_{t_{i+1}} - X_{t_i}), where \phi is evaluated at the midpoint \frac{t_i + t_{i+1}}{2}. This midpoint evaluation yields the Stratonovich integral as the limit, first defined for simple step functions and then extended by density arguments to square-integrable or more general adapted processes via approximation in the sup norm or quadratic variation. Unlike the Itô integral, the Stratonovich integral does not generally produce an L^2-martingale but instead defines a semimartingale, allowing integration against processes of finite variation or with controlled quadratic variation. A key property of the Stratonovich integral is that it satisfies the ordinary chain rule of calculus, enabling straightforward application of transformation formulas without additional correction terms, much like in deterministic integration. For instance, the integral \int_0^t W_s \circ dW_s = \frac{1}{2} W_t^2, mirroring the classical result. This feature makes it valuable in contexts requiring intuitive calculus manipulations, such as deriving stochastic differential equations (SDEs) with physical interpretability. The Stratonovich integral arises naturally in physical and engineering models through the Wong–Zakai approximation theorem, which shows that when Brownian motion is approximated by smooth paths (e.g., piecewise linear or mollified versions), the corresponding ordinary Riemann–Stieltjes integrals converge to the Stratonovich integral as the approximation is refined. This convergence holds under mild conditions on the driving noise and integrand, justifying its use in stochastic averaging and systems where noise is modeled as a limit of correlated, smooth perturbations, common in mechanics and signal processing. In relation to the Itô integral, the Stratonovich integral can be expressed as \int_0^t \phi_s \circ dW_s = \int_0^t \phi_s \, dW_s + \frac{1}{2} \langle \phi, W \rangle_t, where \langle \phi, W \rangle_t denotes the quadratic covariation process between \phi and W. For continuous semimartingales, this integral is uniquely defined pathwise, independent of the choice of approximating sequence, provided the quadratic variation remains finite.
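The midpoint construction can be checked numerically against the identity \int_0^t W_s \circ dW_s = \frac{1}{2} W_t^2. In the sketch below the path is sampled on a grid of half the partition mesh so that the value of W at each midpoint is available, and the difference from the left-endpoint (Itô) sum comes out close to \frac{1}{2}[W]_1 = 1/2; the grid size is illustrative.

```python
import numpy as np

# Sketch: midpoint-evaluated sums approximating the Stratonovich integral
# \int_0^1 W_s o dW_s = W_1^2 / 2.  The path is sampled on a grid of mesh dt/2
# so W at each midpoint (t_i + t_{i+1})/2 is available; the difference from
# the left-endpoint (Ito) sum is close to [W]_1 / 2 = 1/2.
rng = np.random.default_rng(4)
n = 2000
dt = 1.0 / n
W = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt / 2), size=2 * n))))

left  = W[0:2*n:2]            # W at the partition points t_i
mid   = W[1:2*n:2]            # W at the midpoints (t_i + t_{i+1}) / 2
right = W[2:2*n+2:2]          # W at t_{i+1}
strat = np.sum(mid * (right - left))
ito   = np.sum(left * (right - left))

print(strat, 0.5 * W[-1]**2)  # Stratonovich sum vs exact W_1^2 / 2
print(strat - ito)            # correction term, approximately 1/2
```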

Properties and relations

Both the Itô integral and the Stratonovich integral share several fundamental properties as stochastic integrals with respect to Brownian motion. They are both semimartingales, ensuring that they can serve as integrators in further stochastic integrations, and their approximating sums converge in probability as the partition mesh refines, yielding integral processes with continuous sample paths. Additionally, both integrals are defined for progressively measurable integrand processes, which allows adaptation to the underlying filtration generated by the Brownian motion. A key property specific to the Itô integral is its quadratic variation. For an Itô integral process I_t = \int_0^t \phi_s \, dW_s, where \phi is a progressively measurable integrand satisfying \mathbb{E}\left[\int_0^t \phi_s^2 \, ds\right] < \infty and W is a standard Brownian motion, the quadratic variation is given by [I]_t = \int_0^t \phi_s^2 \, ds. This follows from the definition of quadratic variation for martingales and the Itô isometry, which equates the expected squared increment of the integral to the integral of the squared integrand. The Itô and Stratonovich integrals are related through a precise conversion formula derived from their differing Riemann sum approximations. The Stratonovich integral \int_0^t \phi_s \circ dW_s is the limit of midpoint Riemann sums, while the Itô integral \int_0^t \phi_s \, dW_s uses left-endpoint sums. The relation is \int_0^t \phi_s \circ dW_s = \int_0^t \phi_s \, dW_s + \frac{1}{2} [\phi, W]_t, where [\phi, W]_t is the quadratic covariation between \phi and W. This covariation captures the second-order interaction due to the non-zero quadratic variation of Brownian motion, [W]_t = t. For \phi an Itô process d\phi_t = \mu_t \, dt + \sigma_t \, dW_t, the covariation is [\phi, W]_t = \int_0^t \sigma_s \, ds. In specific cases, such as when the diffusion coefficient \sigma_t = \phi_t (e.g., in geometric Brownian motion), the correction term \frac{1}{2} [\phi, W]_t becomes \frac{1}{2} \int_0^t \phi_s \, ds. The formula follows from the definitions and properties of quadratic covariation, with the correction arising as the mesh of the partition approaches zero. The choice between Itô and Stratonovich interpretations often depends on the application domain. In mathematical finance, the Itô integral is preferred because it preserves the martingale property essential for arbitrage-free pricing and risk-neutral valuation, aligning with the non-anticipating nature of market information. In contrast, the Stratonovich integral is favored in physics and engineering for its geometric interpretation, where it satisfies the ordinary chain rule, facilitating modeling of physical systems like diffusion processes with multiplicative noise. These integrals generalize to semimartingales beyond Brownian motion. The Stratonovich integral extends using the continuous part of the predictable quadratic covariation \langle X, Y \rangle^c_t, defined as \int_0^t \phi_s \circ dX_s = \int_0^t \phi_s \, dX_s + \frac{1}{2} \langle \phi, X \rangle^c_t, where the conversion mirrors the Brownian case but uses only the continuous martingale parts of the processes. This framework, introduced by Meyer, unifies the theory for processes with jumps while preserving chain rule properties for the continuous part.
A notable relation is provided by the Wong-Zakai theorem, which states that approximating Brownian motion by smooth paths (e.g., piecewise linear interpolations) and computing ordinary Riemann-Stieltjes integrals yields convergence to the Stratonovich integral in the limit of refinement. This justifies the Stratonovich interpretation as a natural extension of deterministic calculus to noisy paths. Finally, the Itô and Stratonovich integrals coincide when the integrand \phi is deterministic, as the covariation [\phi, W]_t = 0, eliminating the correction term. In stochastic differential equations, this equivalence holds only in such cases; otherwise, the interpretations differ by a diffusion-induced drift adjustment of \frac{1}{2} \sigma \sigma' in the Itô form, affecting long-term behavior like stationary distributions.
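The quadratic-variation identity [I]_t = \int_0^t \phi_s^2 \, ds can also be verified by simulation. The sketch below takes \phi_s = W_s, builds the increments of I_t = \int_0^t W_s \, dW_s on a fine grid, and compares the sum of their squares with a Riemann sum for \int_0^1 W_s^2 \, ds along the same path; the grid size is an illustrative choice.

```python
import numpy as np

# Sketch: check [I]_1 = \int_0^1 phi_s^2 ds for I_t = \int_0^t phi_s dW_s
# with phi_s = W_s.  The discrete increments of I are phi_{t_i} * dW_i, so
# the sum of their squares should approximate \int_0^1 W_s^2 ds pathwise.
rng = np.random.default_rng(5)
n = 200000
dt = 1.0 / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate(([0.0], np.cumsum(dW)))

dI = W[:-1] * dW                       # increments of I_t = \int_0^t W dW
qv = np.sum(dI**2)                     # empirical quadratic variation of I
target = np.sum(W[:-1]**2) * dt        # left Riemann sum for \int_0^1 W_s^2 ds
print(qv, target)                      # the two agree closely for fine grids
```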

Stochastic Differential Equations

Definition and solutions

Stochastic differential equations (SDEs) provide a framework for modeling systems influenced by random noise, extending ordinary differential equations by incorporating stochastic integrals. In the Itô form, an SDE is expressed as dX_t = b(t, X_t) \, dt + \sigma(t, X_t) \, dW_t, where X_t is the state process, b: \mathbb{R}_+ \times \mathbb{R}^d \to \mathbb{R}^d is the drift coefficient, \sigma: \mathbb{R}_+ \times \mathbb{R}^d \to \mathbb{R}^{d \times m} is the diffusion coefficient, and W_t is an m-dimensional Brownian motion. This differential notation corresponds to the integral equation X_t = X_0 + \int_0^t b(s, X_s) \, ds + \int_0^t \sigma(s, X_s) \, dW_s, where the second integral is an Itô stochastic integral. In contrast, the Stratonovich form uses symmetric integrals, denoted by \circ dW_t, which arise naturally in some physical derivations but require conversion to Itô form for standard stochastic calculus tools. Solutions to SDEs are classified as strong or weak. A strong solution is a process X = (X_t)_{t \geq 0} adapted to the filtration generated by the driving Brownian motion W, satisfying the integral equation pathwise almost surely on a given probability space. A weak solution, however, exists on some probability space with a Brownian motion \tilde{W} (possibly different from W) such that X and \tilde{W} satisfy the integral equation, with the law of X matching that required by the equation; every strong solution is weak, but not conversely. For linear SDEs of the form dX_t = (a(t) X_t + c(t)) \, dt + (B(t) X_t + D(t)) \, dW_t, explicit solutions can be obtained using an integrating factor analogous to the deterministic case, yielding X_t = \Phi(t) \left( X_0 + \int_0^t \Phi(s)^{-1} \left( (c(s) - B(s) D(s)) \, ds + D(s) \, dW_s \right) \right), where \Phi solves the associated homogeneous equation d\Phi_t = a(t) \Phi_t \, dt + B(t) \Phi_t \, dW_t and the term -B(s) D(s) \, ds is the Itô correction arising from the covariation between \Phi^{-1} and the noise. A prominent example is geometric Brownian motion, dS_t = \mu S_t \, dt + \sigma S_t \, dW_t, whose solution is S_t = S_0 \exp\left( \left( \mu - \frac{\sigma^2}{2} \right) t + \sigma W_t \right), widely used in modeling asset prices. Existence and uniqueness of solutions rely on conditions on the coefficients. Under global Lipschitz continuity (|b(t,x) - b(t,y)| + |\sigma(t,x) - \sigma(t,y)| \leq K |x - y| for some K > 0 and all t, x, y), along with linear growth bounds, Picard iteration establishes the existence of a unique strong solution via successive approximations converging in appropriate norms. These Lipschitz conditions ensure pathwise uniqueness, implying strong existence from weak existence via Yamada-Watanabe theorems. Solutions to SDEs with time-homogeneous coefficients b(t,x) = b(x) and \sigma(t,x) = \sigma(x) possess the Markov property: the future distribution of X_t given the past depends only on the current state X_s for s < t. For numerical approximation, the Euler-Maruyama scheme discretizes the SDE on a grid 0 = t_0 < t_1 < \cdots < t_N = T as Y_{n+1} = Y_n + b(t_n, Y_n) \Delta t_n + \sigma(t_n, Y_n) \Delta W_n, where \Delta t_n = t_{n+1} - t_n and \Delta W_n = W_{t_{n+1}} - W_{t_n}, converging strongly to the true solution under Lipschitz and growth conditions.
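The Euler-Maruyama scheme and the explicit geometric Brownian motion solution can be compared directly on a shared noise path, as in the sketch below; the drift, volatility, horizon, and step count are illustrative values.

```python
import numpy as np

# Sketch: Euler-Maruyama discretization of geometric Brownian motion
# dS = mu*S dt + sigma*S dW, compared against the exact solution
# S_T = S_0 * exp((mu - sigma^2/2) T + sigma W_T) built from the same noise.
rng = np.random.default_rng(6)
mu, sigma, S0, T, n = 0.05, 0.2, 1.0, 1.0, 1000
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)

S = S0
for k in range(n):
    S = S + mu * S * dt + sigma * S * dW[k]     # Euler-Maruyama step

W_T = dW.sum()
S_exact = S0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * W_T)
print(S, S_exact)    # close for small dt (strong error shrinks like sqrt(dt))
```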

Itô's lemma

Itô's lemma provides the stochastic chain rule for functions of Itô processes, accounting for the quadratic variation inherent in stochastic differentials, unlike the classical chain rule for deterministic functions. Consider a twice continuously differentiable function f(t, x) and an Itô process X_t satisfying the stochastic differential equation (SDE) dX_t = \mu(t, X_t) \, dt + \sigma(t, X_t) \, dW_t, where W_t is a standard Brownian motion. Then, the differential of Y_t = f(t, X_t) is given by dY_t = \left( \frac{\partial f}{\partial t}(t, X_t) + \mu(t, X_t) \frac{\partial f}{\partial x}(t, X_t) + \frac{1}{2} \sigma^2(t, X_t) \frac{\partial^2 f}{\partial x^2}(t, X_t) \right) dt + \sigma(t, X_t) \frac{\partial f}{\partial x}(t, X_t) \, dW_t. This formula arises from a second-order Taylor expansion of f, truncated at higher-order terms that vanish in the limit. Specifically, df(t, X_t) = f_t \, dt + f_x \, dX_t + \frac{1}{2} f_{xx} (dX_t)^2 + o(dt). Substituting dX_t = \mu \, dt + \sigma \, dW_t yields (dX_t)^2 = \sigma^2 (dW_t)^2 + 2\mu\sigma \, dt \, dW_t + \mu^2 (dt)^2. The multiplication rules for stochastic differentials simplify this to (dX_t)^2 = \sigma^2 \, dt, since (dW_t)^2 = dt, dt \, dW_t = 0, and (dt)^2 = 0, where the rule (dW_t)^2 = dt reflects the quadratic variation of Brownian motion being \langle W \rangle_t = t. A formal proof proceeds by Taylor expansion along a partition of [0, t] and passage to the limit: for simple cases the result holds by direct computation, and the general case follows by approximation and convergence in probability of the Riemann-type sums to Itô integrals and quadratic-variation terms. In the multidimensional setting, let X_t = (X_t^1, \dots, X_t^d) be a vector Itô process with dX_t^i = \mu^i(t, X_t) \, dt + \sum_{k=1}^m \sigma^{i k}(t, X_t) \, dW_t^k, where W_t^k are independent Brownian motions. For f(t, x) with continuous second partial derivatives, Itô's lemma states dY_t = \left( f_t + \sum_{i=1}^d \mu^i f_{x_i} + \frac{1}{2} \sum_{i,j=1}^d \sum_{k=1}^m \sigma^{i k} \sigma^{j k} f_{x_i x_j} \right) dt + \sum_{i=1}^d \sum_{k=1}^m \sigma^{i k} f_{x_i} \, dW_t^k, where the second-order term incorporates the covariation matrix of the diffusion components. For a general semimartingale X, possibly discontinuous, the full Itô formula for f \in C^{1,2} is f(t, X_t) - f(0, X_0) = \int_0^t f_s(s, X_{s-}) \, ds + \int_0^t f_x(s, X_{s-}) \, dX_s + \frac{1}{2} \int_0^t f_{xx}(s, X_{s-}) \, d[X]^c_s + \sum_{0 < s \leq t} \left[ f(s, X_s) - f(s, X_{s-}) - f_x(s, X_{s-}) \Delta X_s \right], with the continuous parts using the covariation [X^c, Y^c] for vector processes, and the jump term capturing discontinuities via \Delta X_s = X_s - X_{s-}. This extends the diffusion case to include finite variation and jump components. In contrast, the Stratonovich integral, defined as \int H \circ dX = \int H \, dX + \frac{1}{2} \langle H, X \rangle, yields the classical chain rule: df(t, X_t) = f_t \, dt + f_x \circ dX_t. The Itô-Stratonovich conversion adjusts for the quadratic covariation term. Itô's lemma is essential for deriving solutions to SDEs and applications such as derivative pricing in mathematical finance.
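As a numerical illustration of the lemma, take f(t, x) = \exp(\sigma x - \sigma^2 t / 2) and X_t = W_t: the dt terms from \partial f / \partial t and \frac{1}{2} \partial^2 f / \partial x^2 cancel, so Itô's lemma gives dZ_t = \sigma Z_t \, dW_t for Z_t = f(t, W_t). The sketch below checks this pathwise against left-endpoint sums; \sigma and the grid size are illustrative.

```python
import numpy as np

# Sketch: pathwise check of Ito's lemma for f(t, x) = exp(sigma*x - sigma^2*t/2).
# Ito's lemma gives dZ = sigma * Z dW with Z_t = f(t, W_t), since the dt terms
# from f_t and (1/2) f_xx cancel; so Z_1 - Z_0 should match the left-endpoint
# sum approximating \int_0^1 sigma * Z_s dW_s along the same path.
rng = np.random.default_rng(7)
sigma, n = 0.8, 100000
dt = 1.0 / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)
t = np.arange(n + 1) * dt
W = np.concatenate(([0.0], np.cumsum(dW)))
Z = np.exp(sigma * W - 0.5 * sigma**2 * t)

ito_sum = np.sum(sigma * Z[:-1] * dW)   # left-endpoint sums for the Ito integral
print(Z[-1] - 1.0, ito_sum)             # the two agree up to discretization error
```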

Existence and uniqueness

The existence and uniqueness of solutions to stochastic differential equations (SDEs) of Itô type are established under conditions analogous to those in the deterministic Picard-Lindelöf theorem, adapted to account for the stochastic nature of the driving noise. Specifically, suppose the drift coefficient b(t, x) and diffusion coefficient \sigma(t, x) are locally Lipschitz continuous in x uniformly in t and satisfy the linear growth condition |b(t, x)| + |\sigma(t, x)| \leq K(1 + |x|) for some constant K > 0. Then, for a given initial condition X_0 independent of the Brownian motion W, there exists a unique strong solution to the SDE dX_t = b(t, X_t) dt + \sigma(t, X_t) dW_t on the interval [0, \tau), where \tau is the explosion time, which is almost surely positive. This result, originally due to Itô, relies on Picard iteration applied to the integral form of the SDE, leveraging the contraction properties in appropriate Banach spaces of stochastic processes. If the coefficients b and \sigma are globally Lipschitz continuous in x (i.e., |b(t, x) - b(t, y)| + |\sigma(t, x) - \sigma(t, y)| \leq K |x - y|) and satisfy the linear growth condition, the solution exists and is unique on the entire interval [0, \infty) with probability 1, and the explosion time \tau = \infty almost surely. These global conditions prevent finite-time explosions, ensuring the solution remains well-defined without reaching infinity in finite time. In contrast, violations of linear growth can lead to explosions; for instance, SDEs with superlinear growth in the coefficients may have solutions that explode in finite time with positive probability. For cases where strong existence fails, the Yamada-Watanabe theorem provides a bridge between weak and strong solutions: if weak existence holds (i.e., there exists a probability space with a Brownian motion and a process satisfying the SDE in distribution) and pathwise uniqueness holds (i.e., any two solutions driven by the same Brownian motion coincide almost surely), then there exists a unique strong solution. This result, which generalizes earlier work on martingale problems, underscores the role of pathwise uniqueness in upgrading weak solutions to strong ones. A classic counterexample illustrating the necessity of suitable conditions for strong uniqueness is Tanaka's SDE, dX_t = \operatorname{sign}(X_t) \, dW_t, \quad X_0 = 0, which admits weak solutions (any Brownian motion X solves it, with the driving noise recovered as W_t = \int_0^t \operatorname{sign}(X_s) \, dX_s) but no strong solution: the filtration generated by W coincides with that generated by |X| and so cannot determine the sign of X, and pathwise uniqueness also fails, since -X solves the equation driven by the same W. Weak existence can often be established even for irregular drift coefficients using the Girsanov theorem, which allows construction of a weak solution by starting with a driftless SDE and changing the probability measure via an exponential martingale to incorporate the desired drift, provided the Novikov condition or a similar integrability condition holds. Under the standard global Lipschitz and linear growth assumptions, numerical approximations like the Euler-Maruyama scheme converge strongly to the true solution with order 1/2, meaning the expected error satisfies \mathbb{E}[|X_T - X_T^n|] \leq C h^{1/2} for time step h, where X_T^n is the numerical approximation at final time T.
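The order-1/2 strong convergence claim can be probed empirically for geometric Brownian motion, where the exact solution is available on the same noise: halving the step size should roughly shrink the mean endpoint error by a factor of \sqrt{2}. The sketch below does this with illustrative parameters and a modest number of sample paths.

```python
import numpy as np

# Sketch: empirical strong convergence of the Euler-Maruyama scheme for
# geometric Brownian motion, whose exact solution is known on the same noise.
# Halving the step size h should roughly divide the mean absolute endpoint
# error by sqrt(2), consistent with strong order 1/2.
rng = np.random.default_rng(8)
mu, sigma, S0, T, n_paths = 0.05, 0.4, 1.0, 1.0, 10000

for n in (50, 100, 200, 400):
    h = T / n
    dW = rng.normal(0.0, np.sqrt(h), size=(n_paths, n))
    S = np.full(n_paths, S0)
    for k in range(n):                       # Euler-Maruyama steps, vectorized over paths
        S = S + mu * S * h + sigma * S * dW[:, k]
    exact = S0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * dW.sum(axis=1))
    print(n, np.mean(np.abs(S - exact)))     # mean error roughly proportional to sqrt(h)
```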
For Stratonovich SDEs, which arise naturally in physical applications due to their chain rule properties, existence and uniqueness theorems follow similar lines but are stated in terms of the Stratonovich coefficients. If the Stratonovich drift b and diffusion \sigma satisfy global Lipschitz continuity and linear growth (adjusted for the symmetric integral interpretation, often via conversion to an equivalent Itô SDE whose additional drift has i-th component \frac{1}{2} \sum_{j,k} \sigma_{j k} \, \partial_{x_j} \sigma_{i k}), a unique strong global solution exists on [0, \infty). These conditions ensure the Stratonovich integral is well-defined and the resulting process behaves analogously to the Itô case under the specified regularity.

Applications

Mathematical finance

In mathematical finance, stochastic calculus provides the foundational tools for modeling asset prices and derivatives under uncertainty. The Black-Scholes model assumes that the stock price S_t follows the stochastic differential equation (SDE) dS_t = \mu S_t \, dt + \sigma S_t \, dW_t, where \mu is the drift, \sigma is the volatility, and W_t is a standard Brownian motion. This model facilitates the pricing of European call options through the risk-neutral measure, obtained via Girsanov's theorem, which changes the probability measure to eliminate the drift's risk premium. The explicit solution to this SDE is geometric Brownian motion. Applying Itô's lemma to the option value V(S_t, t) yields the Black-Scholes partial differential equation (PDE) \frac{\partial V}{\partial t} + \frac{1}{2} \sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + r S \frac{\partial V}{\partial S} - r V = 0, where r is the risk-free rate, under the risk-neutral dynamics. Solving this PDE gives the 1973 Black-Scholes formula for a European call option: V = S N(d_1) - K e^{-rT} N(d_2), with d_1 = \frac{\ln(S/K) + (r + \sigma^2/2)T}{\sigma \sqrt{T}} and d_2 = d_1 - \sigma \sqrt{T}, where N(\cdot) is the cumulative distribution function of the standard normal, K is the strike price, and T is the time to maturity. Hedging in this framework involves constructing a delta-hedging portfolio, where the hedge ratio is \Delta = \partial V / \partial S, and the portfolio remains self-financing through stochastic integrals that replicate the option payoff without arbitrage. For exotic options, such as barrier options that activate or deactivate upon the underlying price hitting a barrier, pricing often relies on the reflection principle for Brownian motion, which gives the joint law of the process and its running maximum, to handle the boundary conditions in the risk-neutral framework. Extensions of the Black-Scholes model incorporate more realistic dynamics; for instance, Merton's 1976 jump-diffusion model adds Poisson jumps to the SDE to capture sudden price discontinuities. Similarly, the Heston model introduces stochastic volatility via an additional SDE for the variance process, allowing correlation between asset returns and volatility shocks. Risk management in these models employs Value-at-Risk (VaR), which quantifies potential losses at a given confidence level, computed via Monte Carlo simulations of the underlying SDEs to generate future price paths and estimate tail distributions.
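The closed-form price can be cross-checked against a risk-neutral Monte Carlo estimate, in which S_T is sampled from the explicit geometric Brownian motion solution with drift r. The sketch below does both for a single illustrative parameter set.

```python
import numpy as np
from math import log, sqrt, exp
from statistics import NormalDist

# Sketch: Black-Scholes call price versus a risk-neutral Monte Carlo estimate,
# where S_T is simulated as geometric Brownian motion with drift r under the
# risk-neutral measure.  All inputs are illustrative.
S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.25, 1.0

d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
d2 = d1 - sigma * sqrt(T)
N = NormalDist().cdf
bs_price = S0 * N(d1) - K * exp(-r * T) * N(d2)

rng = np.random.default_rng(9)
Z = rng.standard_normal(1_000_000)
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * sqrt(T) * Z)
mc_price = exp(-r * T) * np.mean(np.maximum(S_T - K, 0.0))

print(bs_price, mc_price)   # the two estimates should agree closely
```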

Physics and engineering

Stochastic calculus provides essential tools for modeling systems in physics and engineering where random fluctuations, such as thermal noise, play a significant role. In physics, it enables the description of particle dynamics under irregular forces, while in engineering, it supports the design of robust systems for estimation and control amid uncertainty. These applications often involve stochastic differential equations (SDEs) to capture both deterministic evolution and diffusive noise. A foundational example is the Langevin equation, which models the motion of a Brownian particle subject to friction and random kicks from the surrounding medium. For a particle of mass m, velocity v, friction coefficient \gamma, Boltzmann constant k, temperature T, and Wiener process W, the equation is given by m \, dv = -\gamma v \, dt + \sqrt{2 \gamma k T} \, dW, where the noise term represents Gaussian white noise with variance tied to temperature. Because the noise here is additive, the Itô and Stratonovich interpretations coincide; for physical models with multiplicative noise, the Stratonovich interpretation is often preferred because it preserves the chain rule of ordinary calculus and arises as the continuous limit of smooth, correlated perturbations, ensuring consistency with equilibrium thermodynamics. From such SDEs, the Fokker-Planck equation describes the evolution of the probability density p(x, t) of the system's state x. For an Itô SDE dx = \mu(x, t) \, dt + \sigma(x, t) \, dW, the corresponding Fokker-Planck equation is \frac{\partial p}{\partial t} = -\frac{\partial}{\partial x} \left( \mu p \right) + \frac{1}{2} \frac{\partial^2}{\partial x^2} \left( \sigma^2 p \right), derived by applying Itô's lemma to smooth test functions of the state and integrating by parts, so that the Fokker-Planck operator is the adjoint of the generator of the SDE. This equation quantifies how noise diffuses probability mass while drift terms shift it, providing insights into stationary distributions and relaxation times in noisy physical systems. The fluctuation-dissipation theorem connects the strength of these noise fluctuations to dissipative processes and temperature, ensuring that equilibrium is reached with the correct thermal distribution. In the Langevin framework, the noise amplitude \sqrt{2 \gamma k T} directly embodies this relation, where the diffusion coefficient scales with temperature to balance frictional damping, as derived from linear response theory. In quantum mechanics, stochastic unraveling techniques decompose the Schrödinger equation into ensembles of stochastic trajectories, facilitating simulations of open quantum systems interacting with noisy environments. These methods represent the density matrix evolution as an average over nonlinear stochastic Schrödinger equations driven by quantum noise processes, enabling efficient numerical treatment of decoherence and measurement effects. Engineering applications leverage stochastic calculus for state estimation in noisy systems, exemplified by the Kalman-Bucy filter. For a linear system with state X, observation Y, dynamics matrix A, observation matrix C, and gain K, the filter update is d\hat{X} = A \hat{X} \, dt + K \left( dY - C \hat{X} \, dt \right), where stochastic integrals handle the noisy measurements, minimizing estimation error variance in real-time applications like navigation and signal processing. In control theory, stochastic optimal control addresses decision-making under uncertainty, with the Hamilton-Jacobi-Bellman (HJB) equation providing the value function for optimality.
For Itô SDEs with control u, the HJB equation arises from the dynamic programming principle and Itô's lemma, balancing infinitesimal generators of the controlled process to minimize expected costs. This framework is crucial for designing controllers in stochastic environments, such as robotic systems with sensor noise. Representative examples include the stochastic logistic equation for population dynamics, dN = r N (1 - N/K) \, dt + \sigma N \, dW, which models growth limited by carrying capacity K amid environmental fluctuations, leading to quasi-stationary distributions around the deterministic equilibrium. In electrical engineering, thermal noise in circuits is captured by SDEs for resistor-inductor networks, where Johnson-Nyquist noise drives voltage fluctuations proportional to \sqrt{4 k T R \Delta f}, analyzed via stochastic integrals to predict signal integrity.
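As a simple numerical check of the Langevin model and the fluctuation-dissipation relation, the sketch below integrates the velocity equation with the Euler-Maruyama method (Itô and Stratonovich coincide here because the noise is additive) and compares the long-run velocity variance with the equipartition value kT/m; all parameter values are illustrative and dimensionless.

```python
import numpy as np

# Sketch: Euler-Maruyama integration of the Langevin equation
# m dv = -gamma * v dt + sqrt(2 * gamma * k * T) dW, checking that the
# long-run velocity variance approaches the equipartition value k*T/m.
# All parameter values are illustrative, in dimensionless units.
rng = np.random.default_rng(10)
m, gamma, kT = 1.0, 2.0, 0.5          # kT denotes the product k * T
dt, n_steps, n_burn = 1e-3, 500_000, 50_000

dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
noise_amp = np.sqrt(2.0 * gamma * kT)
v = 0.0
samples = np.empty(n_steps - n_burn)
for step in range(n_steps):
    v += (-gamma * v * dt + noise_amp * dW[step]) / m
    if step >= n_burn:                 # discard the initial transient
        samples[step - n_burn] = v

print(samples.var(), kT / m)           # sampled variance vs equipartition k*T/m
```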
