
Stochastic volatility

Stochastic volatility refers to a class of models in mathematical finance and financial econometrics that treat the volatility of asset returns as a random process, allowing it to vary randomly over time in response to unpredictable shocks, in contrast to the constant-volatility assumption of the Black-Scholes model. These models capture key empirical features of financial markets, such as volatility clustering—where periods of high volatility tend to follow one another—and the leverage effect, where negative returns are associated with increased future volatility. The origins of stochastic volatility modeling trace back to Clark (1973), who proposed the mixture of distributions hypothesis, suggesting that the observed return distribution arises from an underlying process (such as trading volume) subordinating a normal distribution. This idea was advanced by Taylor (1982), who introduced the first explicit discrete-time stochastic volatility framework, modeling log returns as the product of a stochastic volatility component and independent innovations. In continuous time, Hull and White (1987) developed foundational diffusion-based models for option pricing, deriving series approximations for options when volatility follows a mean-reverting diffusion uncorrelated or correlated with the asset price. A landmark development came with the Heston model in 1993, which specifies variance as a Cox-Ingersoll-Ross square-root diffusion, enabling closed-form solutions for option prices via Fourier transforms while incorporating correlation between asset returns and volatility changes to explain the volatility skew. Stochastic volatility models have since become integral to derivative pricing, hedging, and risk management, improving upon generalized autoregressive conditional heteroskedasticity (GARCH) models by treating volatility as a latent process estimated via filtering techniques such as particle filters. They particularly excel in replicating the implied volatility smile observed in option markets, especially following events like the 1987 stock market crash.

Introduction

Definition and Core Concepts

Stochastic volatility refers to a class of financial models in which the volatility of asset returns is treated as a random process that evolves stochastically over time, rather than being fixed or deterministic. In these models, volatility is latent and influenced by unobservable shocks, often modeled through its own stochastic differential equation separate from the asset price dynamics. A foundational representation of asset behavior under stochastic volatility is given by the stochastic differential equation dS_t = \mu S_t \, dt + \sqrt{v_t} \, S_t \, dW_t^S, where S_t denotes the asset price at time t, \mu is the expected rate of return (drift), v_t is the instantaneous stochastic variance, and W_t^S is a standard Brownian motion driving the price innovations. The primary motivation for stochastic volatility models stems from the empirical inadequacies of the Black-Scholes-Merton framework, which posits constant volatility and thereby cannot account for observed patterns like the smile and skew in option implied volatilities, periods of volatility clustering, or the leverage effect where negative returns coincide with increases in future volatility. For example, after the 1987 stock market crash, the constant-volatility assumption led to systematic underpricing of out-of-the-money equity index puts, as implied volatilities varied systematically with strike prices and maturity. By introducing randomness in volatility, these models provide a more flexible structure to replicate real-world asset return behaviors and improve derivative pricing accuracy. Key core concepts in stochastic volatility include fat tails in the unconditional distribution of returns, which emerge from mixing returns over a random variance process and result in leptokurtic (heavy-tailed) densities that align better with historical return data than the Gaussian assumption of simpler models. Volatility persistence captures the clustering phenomenon, where shocks to volatility exhibit positive autocorrelation, causing high-volatility episodes to persist longer than implied by independent innovations.
Mean-reversion in the volatility process is another essential feature, describing how variance levels fluctuate around a long-run mean, often modeled to prevent explosive or degenerate paths.
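The fat-tail mechanism described above is easy to demonstrate numerically. The sketch below (illustrative, uncalibrated parameters) simulates a toy discrete-time SV model in which log-variance follows an AR(1), then compares the sample excess kurtosis of the resulting returns against i.i.d. Gaussian noise; mixing Gaussian innovations over a random variance produces a leptokurtic unconditional distribution.

```python
import numpy as np

# Toy discrete-time SV model (made-up parameters, not calibrated):
# log-variance h_t follows an AR(1); returns are Gaussian given h_t.
rng = np.random.default_rng(42)
n, phi, sigma_eta = 200_000, 0.95, 0.15

h = np.zeros(n)
for t in range(1, n):
    h[t] = phi * h[t - 1] + sigma_eta * rng.standard_normal()
r = np.exp(h / 2) * rng.standard_normal(n)   # returns with stochastic variance

def excess_kurtosis(x):
    x = x - x.mean()
    return (x**4).mean() / (x**2).mean() ** 2 - 3.0

print(excess_kurtosis(r))                        # clearly positive: fat tails
print(excess_kurtosis(rng.standard_normal(n)))   # near zero for i.i.d. Gaussian
```

Even though every innovation here is Gaussian, the random variance fattens the unconditional tails, exactly the mixing effect described in the text.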

Historical Development

The roots of stochastic volatility modeling trace back to early empirical observations of financial time series behavior. In the 1960s, Benoit Mandelbrot highlighted the non-normal distribution of asset returns and the presence of volatility clustering, where large price changes tend to follow large changes and small changes follow small ones, challenging the constant-volatility assumption in early financial models. His 1967 analysis with Howard M. Taylor of stock price differences further reinforced these findings by documenting non-Gaussian distributions and persistent volatility patterns in returns. These insights laid the groundwork for recognizing time-varying volatility as a key feature of markets, though formal modeling emerged later. The 1970s and 1980s marked the emergence of stochastic volatility in options pricing, spurred by the limitations of the Black-Scholes model, which assumed constant volatility but faced empirical discrepancies in implied volatilities across strikes and maturities. Fischer Black's 1976 work on commodity options implicitly highlighted variations in volatility, prompting extensions to account for stochasticity. By 1987, John Hull and Alan White introduced a continuous-time framework for pricing options on assets with stochastic volatility, incorporating correlation between asset returns and volatility changes to capture leverage effects. Early approaches such as Louis Scott's 1987 model, treating volatility as a mean-reverting stochastic process, further advanced the field by providing theoretical foundations for option valuation under random volatility. The 1990s brought breakthroughs in tractable stochastic volatility models. Stein and Stein (1991) derived an analytic approximation for option prices using an Ornstein-Uhlenbeck process for volatility, enabling better fitting to observed smile patterns.
Steven Heston's 1993 model achieved a semi-closed-form solution via Fourier transforms for affine variance dynamics, particularly the square-root process, which became a cornerstone for pricing and hedging due to its mean-reverting properties and correlation features. Meanwhile, extensions of GARCH models to continuous time, as in Daniel Nelson's 1990 diffusion approximations, bridged discrete-time volatility forecasting with continuous-time processes, enhancing empirical applicability. In the 2000s, focus shifted to capturing smile dynamics and practical implementations. The SABR model, introduced by Patrick Hagan et al. in 2002, modeled forward rates and stochastic volatility with a beta parameter governing backbone effects, offering an asymptotic implied-volatility formula that excelled in fitting caps and swaptions. The 2010s and beyond introduced rough volatility paradigms, with Jim Gatheral, Thibault Jaisson, and Mathieu Rosenbaum's 2018 work demonstrating that log-volatility exhibits anti-persistent behavior akin to fractional Brownian motion with Hurst exponent below 0.5, improving fits to high-frequency data and option surfaces. Post-2020 developments have integrated machine learning for efficient calibration of these models, such as deep neural networks that map parameters to prices, reducing computational burdens in rough and affine frameworks.

Theoretical Foundations

Stochastic Differential Equations in Volatility Modeling

Stochastic volatility models describe the joint dynamics of an asset price S_t and its instantaneous variance v_t through a system of coupled stochastic differential equations (SDEs). The general framework posits that the asset price follows a geometric Brownian motion with volatility driven by the square root of the variance process, while the variance itself evolves as a mean-reverting diffusion. A canonical representation of this system is given by \begin{align} dS_t &= \mu S_t \, dt + \sqrt{v_t} S_t \, dW_t^S, \\ dv_t &= \kappa (\theta - v_t) \, dt + \xi \sqrt{v_t} \, dW_t^v, \end{align} where \mu is the drift rate of the asset, \kappa > 0 is the speed of mean reversion, \theta > 0 is the long-term mean variance, \xi > 0 is the volatility of volatility (vol-of-vol), and W_t^S and W_t^v are standard Brownian motions with correlation \rho, i.e., d\langle W^S, W^v \rangle_t = \rho \, dt. This formulation, which allows for stochastic fluctuations in volatility, was popularized in the context of option pricing by Heston (1993) and builds on earlier work introducing correlated stochastic volatility (Hull and White, 1987). The drift term \kappa (\theta - v_t) in the variance equation enforces mean reversion, pulling the variance toward its long-term level \theta at rate \kappa, a common assumption to capture the empirical tendency of volatility to revert to a stable level over time. The diffusion term \xi \sqrt{v_t} introduces heteroskedasticity in the variance dynamics, reflecting the vol-of-vol \xi that governs the magnitude of volatility shocks. The correlation \rho between the Brownian motions captures the leverage effect, where negative \rho implies that volatility tends to rise when asset prices fall, consistent with observed market behavior in equities. For the variance process to remain strictly positive and to ensure existence and uniqueness of solutions, certain parameter restrictions are required. Specifically, the Feller condition 2\kappa\theta \geq \xi^2 guarantees that the process does not reach zero with positive probability, keeping the square-root diffusion strictly positive.
This condition originates from the analysis of square-root diffusions and is crucial for the well-posedness of the model (Feller, 1951; Cox, Ingersoll, and Ross, 1985). To simulate paths from these SDEs for pricing or risk management, numerical schemes are employed. The Euler-Maruyama scheme provides a first-order approximation, discretizing the variance process as v_{t + \Delta t} \approx v_t + \kappa (\theta - v_t) \Delta t + \xi \sqrt{v_t} \sqrt{\Delta t} \, Z, where Z \sim \mathcal{N}(0,1) is a standard normal random variable, with a similar scheme applied to the asset price incorporating the correlation via a bivariate normal draw. Because the discretized variance can become negative even when the Feller condition holds, implementations typically replace \sqrt{v_t} with \sqrt{\max(v_t, 0)} (the full-truncation scheme). This scheme converges weakly to the true solution under standard Lipschitz and growth conditions on the coefficients, making it suitable for Monte Carlo simulations in stochastic volatility settings (Kloeden and Platen, 1992).
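The discretization just described can be written as a short simulation routine. The following is a minimal sketch (function name and defaults are illustrative) of full-truncation Euler-Maruyama for the coupled Heston system, using a log-Euler step for the price, which is exact conditional on the variance path:

```python
import numpy as np

def simulate_heston(S0, v0, mu, kappa, theta, xi, rho, T, n_steps, n_paths, seed=0):
    """Euler-Maruyama (full truncation) simulation of the Heston SDE system."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, float(S0))
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        # bivariate normal draw: corr(z1, z2) = rho
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                 # full truncation: sqrt(max(v, 0))
        S *= np.exp((mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)  # log-Euler price step
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return S, np.maximum(v, 0.0)
```

With a negative rho, terminal variance and terminal log-price from this routine are negatively correlated across paths, reproducing the leverage effect discussed above.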

Moments and Properties of Volatility Processes

Stochastic volatility processes, such as those driven by mean-reverting diffusions, achieve stationarity through a long-run variance parameter θ, representing the equilibrium level to which volatility reverts over time. Mean-reversion, governed by a positive speed parameter κ, ensures ergodicity, meaning the process converges in distribution to an invariant gamma measure regardless of initial conditions, provided the Feller condition holds to maintain non-negativity. This ergodic property facilitates long-term predictability and stability in volatility dynamics, distinguishing stochastic volatility models from non-stationary alternatives. The moments of these processes provide key insights into their statistical behavior. For the variance process v_t in the Heston model, the unconditional mean is E[v_t] = θ, reflecting the long-run average level. The unconditional variance is given by Var(v_t) = ξ² θ / (2κ), where ξ denotes the volatility of variance; this expression highlights how slower mean-reversion (smaller κ) amplifies variance, while higher ξ increases fluctuations around the mean. Higher-order moments, including skewness, emerge from the correlation parameter ρ between asset returns and volatility innovations; a negative ρ induces negative skewness in the asset return distribution, contributing to asymmetric risk profiles in returns. Stochastic volatility inherently generates leptokurtosis in asset return distributions, exceeding the kurtosis of 3 in the Black-Scholes-Merton framework, which assumes constant volatility. This leptokurtosis arises from the time-varying nature of volatility, producing heavier tails that better capture extreme events observed in financial data, such as market crashes or booms. The fat-tailed return distributions result from the compounding effect of persistent volatility shocks on log-returns, leading to a non-Gaussian overall profile even when instantaneous shocks are normal.
The leverage effect, a hallmark of equity markets, is modeled through a negative correlation ρ between price shocks and volatility innovations, creating an inverse relationship: negative returns tend to coincide with increases in future volatility, amplifying downside risk. This effect, empirically identified in stock data, stems from financial leverage—declining firm value raises the debt-to-equity ratio, heightening perceived risk—and is explicitly incorporated in stochastic volatility frameworks to replicate observed return-volatility asymmetries. In simulations of stochastic volatility paths, low values of the mean-reversion speed κ induce high persistence, manifesting as volatility clustering where high-volatility periods follow one another, and low-volatility regimes persist similarly. This path dependence underscores the memory in volatility processes, enabling realistic replication of empirical stylized facts like prolonged turbulence after shocks, without relying on deterministic rules.
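The stated stationary moments, E[v] = θ and Var(v) = ξ²θ/(2κ), can be checked numerically by exploiting ergodicity: time averages along one long simulated path should converge to the invariant-measure moments. A minimal sketch, with illustrative parameters satisfying the Feller condition:

```python
import numpy as np

# Ergodic check of the CIR variance moments via time-averaging along one
# long full-truncation Euler path (illustrative parameters).
kappa, theta, xi = 2.0, 0.04, 0.3     # Feller: 2*2*0.04 = 0.16 > 0.3**2 = 0.09
dt, n_steps = 0.01, 500_000           # T = 5000 "years" of simulated time
rng = np.random.default_rng(7)
v = np.empty(n_steps)
v[0] = theta
for t in range(1, n_steps):
    vp = max(v[t - 1], 0.0)
    v[t] = vp + kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * rng.standard_normal()

print(v.mean())   # ≈ theta = 0.04
print(v.var())    # ≈ xi**2 * theta / (2 * kappa) = 0.0009
```

The agreement improves as the horizon grows and the step size shrinks, consistent with the ergodicity claim in the text.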

Continuous-Time Models

Heston Model

The Heston model, proposed by Steven L. Heston in 1993, represents a foundational affine stochastic volatility framework for pricing derivative securities, particularly European options, by incorporating mean-reverting stochastic variance driven by a square-root diffusion. Under the risk-neutral measure, the model specifies the dynamics of the underlying asset price S_t and its instantaneous variance v_t as follows: \begin{align} \frac{dS_t}{S_t} &= r \, dt + \sqrt{v_t} \, dW_t^S, \\ dv_t &= \kappa (\theta - v_t) \, dt + \xi \sqrt{v_t} \, dW_t^v, \end{align} where r is the risk-free rate, \kappa > 0 is the speed of mean reversion, \theta > 0 is the long-term variance level, \xi > 0 is the volatility of variance (vol-of-vol), and W_t^S and W_t^v are Brownian motions with correlation \rho, i.e., \mathbb{E}[dW_t^S dW_t^v] = \rho \, dt. The variance process v_t follows a Cox-Ingersoll-Ross (CIR) diffusion, ensuring non-negativity under the Feller condition 2\kappa\theta \geq \xi^2. The affine structure of the model—arising from the linear dependence of the drift and diffusion terms on v_t—enables the conditional characteristic function of the log-asset price to take the exponential-affine form \phi(u; v_0, \tau) = \exp\left( C(\tau, u) + D(\tau, u) v_0 + i u (r\tau + \ln S_0) \right), where \tau is the time to maturity. The functions C(\tau, u) and D(\tau, u) are derived by solving a system of ordinary differential equations (ODEs), specifically a Riccati equation for D and an auxiliary equation for C, which admit closed-form solutions involving complex logarithms. This explicit form facilitates efficient computation via inversion techniques, such as Gil-Pelaez inversion or direct numerical integration methods. Option pricing in the Heston model leverages this characteristic function to compute the price of a European call option with strike K as C(K) = S_0 \Pi_1 - K e^{-r\tau} \Pi_2, where \Pi_1 and \Pi_2 are probabilities obtained by inverting the characteristic function under the share (stock-numeraire) measure and the risk-neutral measure, respectively, via contour integrals in the complex plane.
These semi-closed-form expressions provide a significant advantage over simulation-based methods, allowing rapid pricing while capturing key empirical features of option markets, such as the volatility smile and skew, through the leverage effect induced by negative \rho and the curvature from \xi. However, a notable limitation is that the square-root diffusion in v_t can reach the zero boundary when the Feller condition fails (2\kappa\theta < \xi^2), which can introduce discontinuities in the characteristic function and numerical challenges in pricing, although this is mitigated in practice when calibrated parameters satisfy the condition.
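The pricing formula C(K) = S_0 \Pi_1 - K e^{-r\tau} \Pi_2 can be implemented directly. The sketch below (function names illustrative) uses the "little Heston trap" parameterization of the characteristic function, a common reformulation that avoids branch-cut problems in the complex logarithm, and evaluates the Gil-Pelaez-type integrals numerically:

```python
import numpy as np
from math import exp, log, pi
from scipy.integrate import quad

def heston_call(S0, K, tau, r, kappa, theta, xi, rho, v0):
    """European call under Heston (1993) via numerical inversion of the
    characteristic function ('little Heston trap' parameterization)."""
    def prob(j):
        u = 0.5 if j == 1 else -0.5          # Pi_1 uses the share measure shift
        b = kappa - rho * xi if j == 1 else kappa
        a = kappa * theta
        def integrand(phi):
            i = 1j
            beta_ = b - rho * xi * phi * i
            d = np.sqrt(beta_**2 - xi**2 * (2 * u * phi * i - phi**2))
            g = (beta_ - d) / (beta_ + d)
            e = np.exp(-d * tau)
            C = r * phi * i * tau + a / xi**2 * (
                (beta_ - d) * tau - 2 * np.log((1 - g * e) / (1 - g)))
            D = (beta_ - d) / xi**2 * (1 - e) / (1 - g * e)
            f = np.exp(C + D * v0 + i * phi * log(S0))
            return (np.exp(-i * phi * log(K)) * f / (i * phi)).real
        integral, _ = quad(integrand, 1e-8, 200.0, limit=500)
        return 0.5 + integral / pi
    return S0 * prob(1) - K * exp(-r * tau) * prob(2)
```

A useful sanity check: with v0 = theta and a very small vol-of-vol, the variance is effectively deterministic, so the price should collapse to the Black-Scholes value with sigma = sqrt(theta).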

Constant Elasticity of Variance (CEV) Model

The constant elasticity of variance (CEV) model extends the Black-Scholes-Merton framework by specifying the instantaneous volatility as a power function of the underlying asset price, thereby introducing a local volatility component that depends deterministically on the price level. Under the risk-neutral measure, the asset price S_t follows the stochastic differential equation (SDE) dS_t = r S_t \, dt + \sigma S_t^\beta \, dW_t, where r is the risk-free rate, \sigma > 0 is the volatility parameter, \beta \in \mathbb{R} is the elasticity parameter, and W_t is a standard Brownian motion. This formulation allows the model's volatility to vary with the asset price, blending elements of local volatility modeling while remaining a one-factor diffusion process. The elasticity parameter \beta governs the relationship between volatility and price, directly influencing the skewness of the implied volatility surface. When \beta = 1, the CEV model reduces to the geometric Brownian motion of the Black-Scholes-Merton model with constant relative volatility. For \beta = 0.5, it corresponds to a square-root diffusion, akin to the Cox-Ingersoll-Ross model but applied to the asset price. Values of \beta < 1 induce an inverse relationship between volatility and price, whereby volatility increases as the asset price decreases; this captures the leverage effect in equity markets, where falling prices amplify volatility due to increased financial leverage. Lower \beta values produce steeper negative skewness in option prices, enabling the model to fit observed market dynamics more effectively than constant-volatility assumptions. Option pricing in the CEV model generally lacks closed-form solutions except for specific values of \beta (such as 0, 0.5, and 2), where prices can be expressed using the non-central chi-squared distribution.
For general \beta, pricing relies on numerical methods, including partial differential equation (PDE) solvers or series expansions based on transition density functions derived from modified Bessel functions. Cox provided foundational expressions for these transition densities, facilitating computational approaches like finite difference methods or Fourier transforms for efficient valuation. In applications to equity options, the CEV model excels at reproducing the negative skew in implied volatility smiles observed in equity markets, particularly for short maturities, by leveraging the price-dependent volatility to generate higher option premiums for out-of-the-money puts when \beta < 1. This makes it suitable for pricing vanilla options and exotic derivatives in environments with pronounced downside risk. Empirical calibrations often yield \beta estimates around 0.5 to 0.8 for major equity indices, improving fit over the Black-Scholes model without requiring additional stochastic factors. Extensions of the CEV model incorporate stochastic volatility to create hybrid frameworks, multiplying the local CEV volatility by a separate mean-reverting stochastic process to better capture both skew and smile dynamics across maturities. These hybrid models, such as those combining CEV with a fast-mean-reverting process for volatility, enable asymptotic approximations for option pricing and have been applied to variance swaps and credit derivatives.
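Since general \beta lacks closed forms, Monte Carlo pricing is a natural fallback. A minimal sketch (names and defaults illustrative): Euler discretization of the CEV SDE with paths absorbed at zero, since the CEV process can reach the origin for \beta < 1. The \beta = 1 case reduces to geometric Brownian motion and so provides a Black-Scholes sanity check, included here as a helper:

```python
import numpy as np
from math import erf, exp, log, sqrt

def bs_call(S0, K, tau, r, sigma):
    """Black-Scholes call, used to sanity-check the beta = 1 case."""
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))
    d2 = d1 - sigma * sqrt(tau)
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return S0 * N(d1) - K * exp(-r * tau) * N(d2)

def cev_call_mc(S0, K, tau, r, sigma, beta, n_paths=200_000, n_steps=100, seed=1):
    """Euler Monte Carlo for a European call under dS = r S dt + sigma S**beta dW."""
    rng = np.random.default_rng(seed)
    dt = tau / n_steps
    S = np.full(n_paths, float(S0))
    for _ in range(n_steps):
        z = rng.standard_normal(n_paths)
        S = S + r * S * dt + sigma * S**beta * np.sqrt(dt) * z
        S = np.maximum(S, 0.0)   # absorbing boundary at zero
    return exp(-r * tau) * np.maximum(S - K, 0.0).mean()
```

Note that for \beta < 1 the level parameter must be rescaled (roughly \sigma S_0^{1-\beta}) to keep at-the-money volatility comparable across \beta values, which is why calibrated \sigma values differ so much between specifications.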

SABR Volatility Model

The SABR model, an acronym for stochastic alpha beta rho, is a stochastic volatility framework specifically tailored to replicate the volatility smile in options on forwards, particularly in interest rate and foreign exchange derivatives. It describes the joint dynamics of a forward rate or price F_t and its instantaneous volatility \sigma_t via the stochastic differential equations dF_t = \sigma_t F_t^\beta \, dW_t^F, d\sigma_t = \nu \sigma_t \, dW_t^\sigma, where \beta \in [0,1] governs the power-law dependence on F_t, \nu > 0 represents the volatility of volatility, and the Wiener processes satisfy \langle dW_t^F, dW_t^\sigma \rangle = \rho \, dt with \rho \in [-1,1]. The model's parameters play distinct roles in shaping the implied volatility surface: \beta determines the "backbone" slope, interpolating between normal (\beta = 0) and log-normal (\beta = 1) forward dynamics; \nu controls the smile's curvature by amplifying volatility fluctuations; and \rho drives the skew, with negative values typically producing the downward-sloping smiles observed in rates markets. Initial values include the at-the-money volatility \sigma_0 and the forward f = F_0. These features enable the model to flexibly match empirical patterns across strikes and maturities.
For practical option pricing, the model relies on an asymptotic expansion of the implied Black-Scholes volatility \sigma_B(K) for strike K, derived under small time-to-maturity \tau or low volatility-of-volatility assumptions: \sigma_B(K) \approx \frac{\alpha}{(fK)^{(1-\beta)/2} \left[ 1 + \frac{(1-\beta)^2}{24} \log^2 \frac{f}{K} + \frac{(1-\beta)^4}{1920} \log^4 \frac{f}{K} \right]} \cdot \frac{z}{x(z)} \cdot \left( 1 + \left[ \frac{(1-\beta)^2 \alpha^2}{24 (fK)^{1-\beta}} + \frac{\rho \beta \nu \alpha}{4 (fK)^{(1-\beta)/2}} + \frac{2-3\rho^2}{24} \nu^2 \right] \tau \right), where \alpha = \sigma_0, z = \frac{\nu}{\alpha} (fK)^{(1-\beta)/2} \log \frac{f}{K}, and x(z) = \log \left( \frac{\sqrt{1 - 2 \rho z + z^2} + z - \rho}{1 - \rho} \right). This closed-form approximation, pioneered by Hagan et al., facilitates rapid calibration and pricing without solving the full PDE. Despite its tractability, the Hagan expansion exhibits limitations, including breakdowns at extreme strikes where it can yield negative implied probability densities or arbitrageable smiles, particularly for \beta < 1 and high \nu. To address such issues, especially in environments with negative rates requiring shifts in the forward process, extensions such as the shifted SABR and \lambda-SABR models modify the forward dynamics with a displacement parameter, enhancing stability and arbitrage-freeness. In practice, the SABR model serves as a benchmark for interpolating implied volatility in FX options and interest rate caps/floors, where it efficiently captures market-observed skews and curvatures for quoting and risk management.
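Hagan's expansion translates almost directly into code. A minimal scalar transcription (function name illustrative), including the at-the-money limit z/x(z) → 1 that must be handled separately when K ≈ f:

```python
from math import log, sqrt

def sabr_vol(f, K, tau, alpha, beta, rho, nu):
    """Hagan et al. (2002) implied-volatility approximation for SABR."""
    logfk = log(f / K)
    fk = (f * K) ** ((1.0 - beta) / 2.0)          # (f K)^{(1-beta)/2}
    a = (1.0 + (1.0 - beta) ** 2 / 24.0 * logfk**2
         + (1.0 - beta) ** 4 / 1920.0 * logfk**4)
    b = 1.0 + ((1.0 - beta) ** 2 / 24.0 * alpha**2 / fk**2
               + rho * beta * nu * alpha / (4.0 * fk)
               + (2.0 - 3.0 * rho**2) / 24.0 * nu**2) * tau
    z = nu / alpha * fk * logfk
    if abs(z) < 1e-10:                            # at-the-money limit: z/x(z) -> 1
        zx = 1.0
    else:
        x = log((sqrt(1.0 - 2.0 * rho * z + z**2) + z - rho) / (1.0 - rho))
        zx = z / x
    return alpha / (fk * a) * zx * b
```

For \beta = 1 and negligible \nu the formula collapses to the flat log-normal volatility \alpha, and a negative \rho tilts the smile downward in strike, matching the parameter roles described above.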

3/2 Model

The 3/2 model is a continuous-time stochastic volatility model designed to capture the dynamics of asset prices where the instantaneous variance follows a mean-reverting process with a power-law diffusion term that emphasizes explosive behavior at high volatility levels. The model's variance process v_t is governed by the stochastic differential equation (SDE) dv_t = \kappa v_t (\theta - v_t) \, dt + \xi v_t^{3/2} \, dW_t^v, where \kappa > 0 is the speed of mean reversion, \theta > 0 is the long-term mean variance, \xi > 0 is the volatility of variance, and W_t^v is a Brownian motion correlated with the asset price process's Brownian motion with correlation \rho. This SDE represents an inverse Cox-Ingersoll-Ross (CIR) process, as the reciprocal y_t = 1/v_t follows standard CIR dynamics, ensuring positivity of v_t under appropriate parameter restrictions. The model is closely related to the Heston model through this reciprocal transformation of the variance process, a duality that highlights its ability to generate increasing (upward-sloping) volatility skews, in contrast to the Heston model's typically decreasing skew. Specifically, the transform y_t = 1/v_t converts the 3/2 variance dynamics into a CIR process, allowing the model to inherit some analytical tractability while exhibiting more erratic volatility paths that align better with empirical observations of volatility explosions during market stress. A key feature of the \xi v_t^{3/2} term is its super-linear growth, which amplifies volatility fluctuations when v_t is large, leading to potential explosive behavior and upward-sloping skews for volatility derivatives. Although non-affine due to the v_t^{3/2} term, option pricing in the 3/2 model can be performed using Fourier-Laplace transform methods, which involve solving a non-standard Riccati equation for the characteristic function.
This approach enables semi-closed-form solutions for European options and variance swaps, making the model suitable for pricing VIX futures and options, where it accurately reproduces the observed upward-sloping skew in volatility-of-volatility without requiring jumps. The model was introduced by Lewis in his 2000 monograph on stochastic volatility option valuation, with significant refinements for derivative pricing provided by Drimus in 2012, who developed efficient transform-based methods for realized variance options.
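The inverse-CIR relationship can be verified directly. Applying Itô's lemma to y_t = 1/v_t under the 3/2 dynamics (a routine calculation, reproduced here for completeness):

```latex
% Itô's lemma with f(v) = 1/v, so f'(v) = -1/v^2 and f''(v) = 2/v^3:
dy_t = -\frac{1}{v_t^2}\, dv_t + \frac{1}{v_t^3}\, d\langle v \rangle_t
     = -\frac{1}{v_t^2}\left[\kappa v_t(\theta - v_t)\, dt + \xi v_t^{3/2}\, dW_t^v\right]
       + \frac{1}{v_t^3}\, \xi^2 v_t^{3}\, dt
     = \left[(\kappa + \xi^2) - \kappa\theta\, y_t\right] dt - \xi \sqrt{y_t}\, dW_t^v .
```

Hence y_t = 1/v_t is a CIR square-root process with mean-reversion speed \kappa\theta, long-run level (\kappa + \xi^2)/(\kappa\theta), and volatility-of-volatility \xi, which is what guarantees the positivity of v_t claimed above.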

Discrete-Time and Advanced Models

GARCH Models

Generalized Autoregressive Conditional Heteroskedasticity (GARCH) models represent a class of discrete-time stochastic volatility frameworks designed to capture time-varying variance in financial time series, particularly the phenomenon of volatility clustering observed in asset returns. Introduced by Tim Bollerslev in 1986 as an extension of Engle's ARCH model, GARCH incorporates both lagged squared residuals and lagged conditional variances into the variance equation, allowing for a more parsimonious representation of persistence in volatility. These models assume that returns follow a process where the conditional variance evolves recursively, making them suitable for forecasting volatility in econometric applications. The canonical GARCH(1,1) model specifies the conditional variance \sigma_t^2 as \sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2, where \varepsilon_t = \sigma_t z_t and z_t \sim N(0,1) is an i.i.d. innovation process, with parameters \omega > 0, \alpha \geq 0, and \beta \geq 0 ensuring non-negativity of variance. This formulation captures volatility clustering through the autoregressive structure, where high (low) past shocks or variances lead to elevated (suppressed) future variance. A key property is the persistence parameter \alpha + \beta, which is typically close to 1 in empirical applications, indicating strong persistence of volatility shocks without implying non-stationarity. The unconditional variance is then \omega / (1 - \alpha - \beta), provided \alpha + \beta < 1 for covariance stationarity. Extensions of the basic GARCH model address limitations such as excessive persistence or asymmetry in volatility responses. The Integrated GARCH (IGARCH) model, proposed by Engle and Bollerslev in 1986, imposes \beta = 1 - \alpha to model infinite variance persistence, where shocks have permanent effects on future volatility levels. This is particularly useful for series exhibiting unit-root-like behavior in variance.
To capture the leverage effect—where negative shocks increase volatility more than positive ones—the Exponential GARCH (EGARCH) model, developed by Nelson in 1991, parameterizes the log-variance as \ln(\sigma_t^2) = \omega + \beta \ln(\sigma_{t-1}^2) + \gamma \left| \frac{\varepsilon_{t-1}}{\sigma_{t-1}} \right| + \delta \frac{\varepsilon_{t-1}}{\sigma_{t-1}}, allowing asymmetric impacts via the sign term \delta < 0. In high-frequency settings, GARCH models can incorporate realized volatility measures derived from intraday data, and as the observation interval approaches zero, certain GARCH specifications converge to continuous-time diffusion limits, as shown by Nelson in 1990. Practically, GARCH variants underpin risk management tools like J.P. Morgan's RiskMetrics system, which employs an IGARCH(1,1)-style scheme with \alpha = 0.06 and \beta = 0.94 to compute value-at-risk (VaR) metrics based on exponentially weighted historical variances.
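The GARCH(1,1) recursion and its stated properties can be checked in a few lines. The sketch below (illustrative parameters) simulates the model and verifies that the sample variance approaches \omega / (1 - \alpha - \beta) and that squared returns are positively autocorrelated, i.e. volatility clusters:

```python
import numpy as np

# GARCH(1,1) simulation: eps_t = sigma_t * z_t,
# sigma_t^2 = omega + alpha * eps_{t-1}^2 + beta * sigma_{t-1}^2.
omega, alpha, beta = 1e-5, 0.10, 0.80      # persistence alpha + beta = 0.9
n = 300_000
rng = np.random.default_rng(3)
eps = np.empty(n)
sig2 = omega / (1 - alpha - beta)          # start at the unconditional variance
for t in range(n):
    eps[t] = np.sqrt(sig2) * rng.standard_normal()
    sig2 = omega + alpha * eps[t] ** 2 + beta * sig2

print(eps.var())                           # ≈ omega / (1 - alpha - beta) = 1e-4
e2 = eps**2
acf1 = np.corrcoef(e2[:-1], e2[1:])[0, 1]
print(acf1)                                # positive: volatility clustering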

Rough Volatility Models

Rough volatility models represent a class of stochastic volatility frameworks that capture the empirically observed roughness in asset price volatility, characterized by paths that exhibit less regularity than those of standard Brownian motion. These models employ fractional Brownian motion (fBM) or related Volterra processes to drive volatility, with a Hurst parameter H < 1/2 inducing anti-persistent, non-differentiable trajectories that align with high-frequency volatility data. Unlike smoother Markovian models such as the Heston model, rough volatility processes are non-Markovian, depending on the entire path history, which enables them to replicate the steep short-term skew and power-law decay in at-the-money (ATM) volatility skews observed in equity indices. The core framework models the log-volatility (or variance) as a fractional process. A canonical representation is the Volterra-type stochastic integral for the forward variance curve: v_t = \int_0^t (t - s)^{H - 1/2} \, dW_s, where W_s is a standard Brownian motion, and the kernel (t - s)^{H - 1/2} (up to normalization) generates the roughness for H \approx 0.1, as empirically estimated from realized volatility series of assets like the S&P 500 (SPX). This construction extends the classical Heston model to a "rough Heston" variant by replacing the Brownian increment with an fBM-driven term: dv_t = \kappa (\theta - v_t) \, dt + \xi \sqrt{v_t} \, dW_t^H, where W_t^H is fBM with Hurst index H < 1/2, \kappa is the mean-reversion speed, \theta the long-term variance, and \xi the volatility-of-volatility. The roughness (H < 1/2) induces non-Markovian dependence, while the model fits SPX implied volatility surfaces better than smooth alternatives, capturing the decay of short-dated ATM skews with a power-law exponent close to H - 1/2 \approx -0.4.
Key properties include the anti-persistent nature of paths for H < 1/2, leading to volatility bursts followed by rapid mean reversion, which mirrors microstructure effects in order book dynamics without invoking discrete jumps. Simulation of these paths relies on efficient schemes for discretizing the singular fractional kernel, enabling fast Monte Carlo generation despite the non-Markovian structure. For option pricing, no general closed-form solutions exist, necessitating numerical methods such as Monte Carlo simulation with pathwise approximations or neural-network surrogates, and, in the rough Heston case, numerical solution of the associated fractional Riccati equations for the characteristic function. These approaches achieve high accuracy for SPX calibration. In the 2020s, extensions have proliferated, including the rough SABR model, which incorporates fractional volatility into the stochastic-alpha-beta-rho (SABR) framework to better model interest rate and FX smiles across maturities via short-time asymptotics. Machine learning techniques, such as deep neural networks, have also advanced calibration by optimizing parameters directly against volatility surfaces, reducing computational costs for multi-asset rough Heston fits. By 2023, a comprehensive reference book on rough volatility was published, and research through 2025 has advanced numerical methods like weak error estimates and signature-based rough path calibrations.
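The roughness claim itself is easy to probe numerically. The sketch below (illustrative parameters) simulates exact fBM with H = 0.1 via a Cholesky factorization of its covariance, one of several standard exact methods, then recovers H from the increment variogram scaling E[(X_{t+k} - X_t)^2] ∝ k^{2H}:

```python
import numpy as np

# Exact fBM sample via the Cholesky factor of its covariance
# cov(s, t) = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H}), then a crude
# variogram regression to recover the Hurst exponent.
H, n = 0.1, 800
t = np.arange(1, n + 1) / n
cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
             - np.abs(t[:, None] - t[None, :]) ** (2 * H))
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))   # tiny jitter for safety
rng = np.random.default_rng(11)
X = L @ rng.standard_normal(n)

lags = np.array([1, 2, 4, 8, 16])
m = np.array([np.mean((X[k:] - X[:-k]) ** 2) for k in lags])
slope = np.polyfit(np.log(lags), np.log(m), 1)[0]
hurst_est = slope / 2
print(hurst_est)   # close to the true H = 0.1
```

The same variogram regression applied to realized-volatility series is essentially how the H ≈ 0.1 estimates cited above are obtained; for standard Brownian motion the estimate would instead come out near 0.5.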

Estimation and Calibration

Parameter Estimation Techniques

Parameter estimation in stochastic volatility (SV) models is complicated by the latent nature of the volatility process, which is unobserved and must be inferred from asset price data, typically daily returns. Common approaches include maximum likelihood estimation (MLE), Bayesian methods, filtering techniques, and non-parametric proxies, each addressing the integration over the hidden state in different ways. These methods aim to estimate parameters such as the mean reversion speed \kappa, long-run variance \theta, volatility of volatility \sigma, and correlation \rho in models like the Heston framework. Maximum likelihood estimation for continuous-time SV diffusions often relies on discretizing the process, such as via Euler-Maruyama approximation, to compute the transition densities p(\Delta S_i | S_{i-1}, \theta) between observed log-returns \Delta S_i. The log-likelihood is then approximated as \sum_i \log p(\Delta S_i | \theta), maximized numerically over the parameter vector \theta. This approach handles the intractability of exact densities by simulating paths or using series expansions, though it can suffer from discretization bias in small samples. For instance, in the Heston model, higher-order approximations reduce bias while maintaining computational feasibility. A key advantage is its asymptotic efficiency, outperforming method-of-moments estimators in finite samples. Bayesian methods treat parameters and latent volatilities as random variables, sampling from the posterior distribution p(\theta, \{v_t\} | \{S_t\}) using Markov chain Monte Carlo (MCMC) techniques like the Metropolis-Hastings algorithm. In the basic SV model, where \log v_t = \phi \log v_{t-1} + \eta_t, MCMC draws from conditional posteriors via data augmentation, integrating out the latent \{v_t\} sequence.
For non-linear models like Heston, extensions incorporate particle filters within MCMC to approximate the likelihood, enabling joint estimation of parameters and states while providing credible intervals. These methods excel in handling parameter uncertainty but require careful tuning for convergence, especially with persistent volatility. Seminal work demonstrates superior performance over quasi-MLE in forecasting volatility. Filtering approaches sequentially estimate the latent volatility given observations. For linearized or Gaussian SV models, the Kalman filter provides efficient quasi-maximum likelihood estimates by treating volatility as a state variable in a linear dynamic system, though it assumes linearity and can be inefficient under strong non-Gaussianity. Sequential Monte Carlo (SMC), or particle filtering, extends this to general non-linear SV models like the Heston model by propagating particles to approximate the filtering density p(v_t | \{S_{1:t}\}), often combined with MCMC for parameter inference. SMC handles path dependence and jumps effectively, with resampling to combat degeneracy, but increases computational demands. As a discrete-time analog, quasi-maximum likelihood estimation in GARCH models shares similar principles but assumes conditional heteroskedasticity driven by observables rather than a fully latent volatility process. Non-parametric methods bypass model assumptions by using high-frequency data to construct realized variance as a proxy for integrated volatility \int v_t dt, summed from squared intraday returns. This estimator converges to the true quadratic variation under suitable sampling, serving as an input for two-step parametric estimation or direct volatility forecasting in SV contexts. It avoids reliance on distributional forms but requires microstructure noise corrections for accuracy.
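A bootstrap particle filter for the basic SV model mentioned above is short enough to write out in full. The sketch below (parameters assumed known and illustrative; in practice the filter sits inside an MCMC or likelihood-maximization loop) propagates particles through the log-AR(1) state equation, weights them by the Gaussian observation density with variance exp(h), and resamples each step:

```python
import numpy as np

# Basic SV model:  h_t = phi * h_{t-1} + sigma_eta * eta_t,
#                  r_t = exp(h_t / 2) * z_t.
rng = np.random.default_rng(5)
phi, sigma_eta, T, n_part = 0.95, 0.3, 2000, 2000

# Simulate synthetic "observed" returns from known states.
h_true = np.zeros(T)
for t in range(1, T):
    h_true[t] = phi * h_true[t - 1] + sigma_eta * rng.standard_normal()
r = np.exp(h_true / 2) * rng.standard_normal(T)

# Bootstrap particle filter approximating p(h_t | r_{1:t}).
h = rng.standard_normal(n_part) * sigma_eta / np.sqrt(1 - phi**2)  # stationary init
h_filt = np.empty(T)
loglik = 0.0
for t in range(T):
    h = phi * h + sigma_eta * rng.standard_normal(n_part)          # propagate
    logw = -0.5 * (np.log(2 * np.pi) + h + r[t] ** 2 * np.exp(-h)) # obs density
    w = np.exp(logw - logw.max())
    loglik += logw.max() + np.log(w.mean())                        # likelihood increment
    w /= w.sum()
    h_filt[t] = w @ h                                              # filtered mean
    h = h[rng.choice(n_part, n_part, p=w)]                         # resample

print(np.corrcoef(h_filt, h_true)[0, 1])   # filtered log-variance tracks the truth
```

The accumulated loglik is the simulated-likelihood estimate that particle-MCMC schemes plug into a Metropolis-Hastings acceptance ratio.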
Estimation faces inherent challenges due to the unobserved v_t, leading to incomplete information and potential non-identifiability; for example, in mean-reverting processes, different combinations of the mean reversion speed \kappa and the long-run variance \theta can produce similar return dynamics, complicating their separation. Latent variables amplify simulation errors, and near-unit-root persistence slows MCMC mixing. These issues necessitate robust convergence diagnostics and auxiliary data such as option prices for identifiability; historical returns alone demand careful regularization.
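The filtering ideas above can be made concrete for the basic discrete-time log-variance SV model. The following is a minimal bootstrap particle filter, a sketch assuming Gaussian AR(1) log-variance dynamics, known parameters, and multinomial resampling at every step; a production implementation would add adaptive resampling and embed this inside a parameter-inference loop:

```python
import numpy as np

def particle_filter(returns, mu, phi, sigma_eta, n_particles=2000, seed=0):
    """Bootstrap particle filter for the basic log-variance SV model:
        h_t = mu + phi*(h_{t-1} - mu) + eta_t,  eta_t ~ N(0, sigma_eta^2)
        y_t = exp(h_t / 2) * eps_t,             eps_t ~ N(0, 1)
    Returns a log-likelihood estimate and the filtered means E[h_t | y_{1:t}]."""
    rng = np.random.default_rng(seed)
    # initialize particles from the stationary distribution of h
    h = mu + rng.normal(0.0, sigma_eta / np.sqrt(1.0 - phi**2), n_particles)
    loglik, filtered = 0.0, []
    for y in returns:
        # propagate particles through the AR(1) transition
        h = mu + phi * (h - mu) + rng.normal(0.0, sigma_eta, n_particles)
        # weight by the measurement density N(y; 0, exp(h))
        logw = -0.5 * (np.log(2 * np.pi) + h + y**2 * np.exp(-h))
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())      # running log-likelihood estimate
        w /= w.sum()
        filtered.append(float(np.dot(w, h)))
        h = rng.choice(h, size=n_particles, p=w)   # multinomial resampling
    return loglik, np.array(filtered)

# Usage: filter data simulated from the same model with known parameters.
rng = np.random.default_rng(1)
mu, phi, sig_eta, T = -9.0, 0.95, 0.3, 200
h_true = np.empty(T)
h_true[0] = mu
for t in range(1, T):
    h_true[t] = mu + phi * (h_true[t-1] - mu) + rng.normal(0.0, sig_eta)
y = np.exp(h_true / 2) * rng.normal(size=T)
loglik, h_filt = particle_filter(y, mu, phi, sig_eta)
```

The filtered means track the true latent log-variance path, and the log-likelihood estimate is what particle-MCMC schemes plug into a Metropolis-Hastings acceptance ratio for parameter inference.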

Model Calibration to Market Data

Model calibration to market data in stochastic volatility frameworks primarily involves adjusting model parameters to match observed implied volatilities derived from option prices across various strikes K and maturities T. This process ensures the model replicates the volatility smile or skew observed in derivative markets, enabling accurate pricing and hedging. The standard approach employs a least-squares optimization criterion, minimizing the sum of squared differences between market-implied volatilities \sigma_{\text{imp}}(K,T) and those generated by the model \sigma_{\text{model}}(K,T;\theta), where \theta denotes the parameter vector. This objective function is formulated as \min_{\theta} \sum_{K,T} \left( \sigma_{\text{imp}}(K,T) - \sigma_{\text{model}}(K,T;\theta) \right)^2, often weighted by bid-ask spreads or maturities to prioritize liquid instruments. In the Heston model, calibration requires computing model-implied volatilities via the semi-closed-form Fourier pricing formula, which involves numerical integration of the characteristic function to obtain the probabilities \Pi_1 and \Pi_2. Evaluating these integrals requires careful treatment of the branch of the complex logarithm in the integrand: naive implementations exhibit discontinuities (the so-called "little Heston trap"), which are avoided by using an equivalent formulation of the characteristic function that keeps the integrand continuous, ensuring stability and convergence of the quadrature. Due to the non-convex nature of the optimization landscape, global search algorithms such as differential evolution are employed to avoid local minima and achieve robust parameter fits. For the SABR model, calibration directly targets the volatility smile using Hagan's asymptotic approximation formula, which provides an explicit expression for implied volatility as a function of strike and the parameters \alpha, \beta, \rho, and \nu. This allows efficient least-squares fitting of these parameters to market data, often slice-by-slice across maturities, capturing the smile's skew and curvature without extensive numerical integration.
The method's simplicity facilitates rapid calibration in interest rate and FX markets. Joint calibration across multiple assets or curves extends the univariate framework by simultaneously optimizing parameters for correlated instruments, incorporating cross-asset dependencies through copula structures or multi-factor dynamics. To enhance stability and prevent overfitting, regularization techniques such as L2 penalties on parameter deviations or smoothness constraints on the implied surface are applied, balancing fit quality with model parsimony. Practical implementation leverages open-source libraries such as QuantLib in Python, which provides optimized routines for Heston and SABR calibrations, including built-in optimizers and Fourier integrators. For rough volatility models, post-2020 advancements in GPU acceleration have enabled faster Monte Carlo-based calibrations by parallelizing path simulations and likelihood evaluations, reducing computation times from hours to minutes for high-dimensional fits.
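The slice-by-slice SABR fit described above can be sketched with Hagan's lognormal implied-volatility approximation and an off-the-shelf least-squares optimizer. The formula below follows the standard published expansion; the "market" vols in the usage example are synthetic (generated by the formula itself, so the fit is a round-trip check), and the starting values and bounds are arbitrary choices, not market conventions:

```python
import numpy as np
from scipy.optimize import least_squares

def sabr_implied_vol(K, f, T, alpha, beta, rho, nu):
    """Hagan et al. (2002) lognormal implied-vol approximation (sketch)."""
    K = np.asarray(K, dtype=float)
    logfk = np.log(f / K)
    fk_mid = (f * K) ** ((1 - beta) / 2)            # (f*K)^((1-beta)/2)
    z = (nu / alpha) * fk_mid * logfk
    xz = np.log((np.sqrt(1 - 2 * rho * z + z**2) + z - rho) / (1 - rho))
    # z/x(z) -> 1 as z -> 0 (the at-the-money limit)
    zx = np.where(np.abs(z) < 1e-8, 1.0, z / np.where(xz == 0, 1.0, xz))
    denom = fk_mid * (1 + (1 - beta)**2 / 24 * logfk**2
                        + (1 - beta)**4 / 1920 * logfk**4)
    correction = 1 + ((1 - beta)**2 / 24 * alpha**2 / fk_mid**2
                      + rho * beta * nu * alpha / (4 * fk_mid)
                      + (2 - 3 * rho**2) / 24 * nu**2) * T
    return alpha / denom * zx * correction

def calibrate_sabr(strikes, market_vols, f, T, beta=0.5):
    """Least-squares fit of (alpha, rho, nu) to a single maturity slice,
    with beta fixed by convention, as is common in practice."""
    def residuals(p):
        alpha, rho, nu = p
        return sabr_implied_vol(strikes, f, T, alpha, beta, rho, nu) - market_vols
    res = least_squares(residuals, x0=[0.2, 0.0, 0.5],
                        bounds=([1e-6, -0.999, 1e-6], [5.0, 0.999, 5.0]))
    return res.x

# Round-trip check on synthetic "market" vols generated by the formula itself.
f, T = 100.0, 1.0
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
mkt = sabr_implied_vol(strikes, f, T, alpha=0.3, beta=0.5, rho=-0.3, nu=0.6)
alpha_hat, rho_hat, nu_hat = calibrate_sabr(strikes, mkt, f, T, beta=0.5)
```

Because the objective is a cheap analytic expression, the fit runs in milliseconds per slice, which is why SABR calibration is popular on interest rate and FX desks.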

Applications

Option Pricing and Derivatives

Stochastic volatility models improve upon the constant volatility assumption of the Black-Scholes framework by incorporating time-varying volatility, enabling more accurate pricing of options and derivatives under risk-neutral measures. In these models, the underlying asset price S_t and volatility v_t follow coupled stochastic differential equations (SDEs), such as dS_t = r S_t dt + \sqrt{v_t} S_t dW_t^S and dv_t = \kappa(\theta - v_t) dt + \sigma \sqrt{v_t} dW_t^v, where r is the risk-free rate, \kappa is the mean reversion speed, \theta is the long-term variance, \sigma is the volatility of volatility, and W_t^S, W_t^v are Brownian motions with correlation \rho. This setup captures the empirical observation that volatility is stochastic and mean-reverting, leading to better alignment with observed option prices across strikes and maturities. For vanilla European options, pricing relies on risk-neutral valuation via the Feynman-Kac theorem, which links the option value to the solution of a partial differential equation (PDE) derived from the model's SDEs. In the Heston model, a specific stochastic volatility framework, the European call price admits a semi-closed-form solution using Fourier inversion, expressing the characteristic function of the log-asset price under the risk-neutral measure. For more general stochastic volatility models, such as the SABR or 3/2 models, closed-form solutions are typically unavailable, necessitating numerical methods like Monte Carlo simulation or PDE solvers to compute expectations of the discounted payoff. These approaches ensure the price equals the risk-neutral expectation C = e^{-r\tau} \mathbb{E}^\mathbb{Q} [\max(S_T - K, 0)], where \tau is time to maturity and K is the strike. The Greeks in stochastic volatility models reflect sensitivities to both asset price and volatility dynamics. 
Vega, measuring sensitivity to changes in the current volatility level v_0, arises directly from the dependence of the diffusion term on v_t, often computed via finite differences or adjoint methods in numerical schemes. The correlation parameter \rho influences the delta (sensitivity to S_t) by coupling asset and volatility shocks, typically resulting in negative \rho values that induce negative skewness in returns and higher prices for out-of-the-money puts. Rho, the sensitivity to the interest rate r, is modulated by the stochastic nature of volatility, differing from its Black-Scholes counterpart due to the integrated variance's randomness. Exotic options under stochastic volatility exhibit payoffs that interact with volatility paths, complicating pricing. For barrier options, knock-in or knock-out conditions depend on volatility-driven excursions, as the asset's diffusion coefficient \sqrt{v_t} S_t amplifies barrier hits during high-volatility periods; Monte Carlo with variance reduction or PDE methods are commonly employed. Variance swaps, which pay the difference between realized quadratic variation \int_0^\tau v_t dt and a fixed strike, have fair strikes given by \mathbb{E}^\mathbb{Q} [\int_0^\tau v_t dt] = \theta \tau + (v_0 - \theta) \frac{1 - e^{-\kappa \tau}}{\kappa}, accounting for mean reversion from the initial variance, often computed via moment-matching or Fourier methods in affine models. These instruments directly hedge volatility exposure, with stochastic models providing more precise valuations than constant-volatility approximations. Stochastic volatility naturally generates an endogenous implied volatility smile or skew without invoking jumps, as the leverage effect (negative \rho) and volatility clustering produce asymmetric risk-neutral distributions. Short-term skews arise from near-term volatility shocks, while long-term smiles reflect mean reversion to \theta, aligning model-implied surfaces with market data on equity and FX options. 
This dynamic smile evolution contrasts with the flat volatility surface in Black-Scholes, which is recovered as the constant-volatility limit when \sigma \to 0. Numerical methods are essential for solving the pricing PDE in stochastic volatility models, given by \frac{\partial C}{\partial t} + r S \frac{\partial C}{\partial S} + \kappa(\theta - v) \frac{\partial C}{\partial v} + \frac{1}{2} v S^2 \frac{\partial^2 C}{\partial S^2} + \frac{1}{2} \sigma^2 v \frac{\partial^2 C}{\partial v^2} + \rho \sigma v S \frac{\partial^2 C}{\partial S \partial v} = r C, with terminal condition C(T, S, v) = \max(S - K, 0). Finite difference schemes, such as Crank-Nicolson time-stepping or fully implicit methods chosen for stability, discretize this two-dimensional PDE on an (S, v) grid; when the Feller condition 2\kappa\theta > \sigma^2 holds, v_t remains strictly positive, simplifying the boundary treatment at v = 0. For high-dimensional exotics, least-squares Monte Carlo or Fourier-cosine (COS) expansions accelerate convergence, enabling practical implementation in trading systems.
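When a PDE or transform solution is unavailable or inconvenient, Monte Carlo is the workhorse. The sketch below prices a European call under Heston with a full-truncation Euler scheme and evaluates the closed-form variance-swap fair strike quoted earlier; all parameter values are illustrative, not calibrated to any market:

```python
import numpy as np

def variance_swap_strike(v0, kappa, theta, tau):
    """Fair strike of a continuous variance swap under Heston:
    E[int_0^tau v_t dt] = theta*tau + (v0 - theta)*(1 - exp(-kappa*tau))/kappa."""
    return theta * tau + (v0 - theta) * (1 - np.exp(-kappa * tau)) / kappa

def heston_call_mc(S0, K, r, tau, v0, kappa, theta, sigma, rho,
                   n_paths=50_000, n_steps=200, seed=42):
    """European call under Heston via the full-truncation Euler scheme,
    which uses max(v, 0) in the drift and diffusion so the variance
    process can go slightly negative without breaking the square root."""
    rng = np.random.default_rng(seed)
    dt = tau / n_steps
    S = np.full(n_paths, float(S0))
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                     # full truncation
        S *= np.exp((r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + sigma * np.sqrt(vp * dt) * z2
    return float(np.exp(-r * tau) * np.mean(np.maximum(S - K, 0.0)))

# Illustrative parameters (not market-calibrated).
p = dict(v0=0.04, kappa=2.0, theta=0.04, sigma=0.3, rho=-0.7)
fair_var = variance_swap_strike(p["v0"], p["kappa"], p["theta"], tau=1.0)
call = heston_call_mc(S0=100.0, K=100.0, r=0.02, tau=1.0, **p)
```

In practice, the Monte Carlo price would be checked against the semi-closed-form Fourier price before the scheme is trusted for exotics; discretization bias shrinks as n_steps grows.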

Risk Management and Portfolio Optimization

In stochastic volatility models, hedging strategies must account for the joint dynamics of asset prices and volatility to mitigate risks from correlated shocks. Dynamic delta-vega hedging involves adjusting positions in the underlying asset and options to neutralize both price (delta) and volatility (vega) exposures, which is particularly effective in models like Heston where the correlation parameter ρ between asset returns and volatility innovations influences performance. For instance, when ρ is negative (reflecting the leverage effect, where falling prices coincide with rising volatility), delta-vega strategies reduce hedging error variance by approximately 40% compared to pure delta hedging, as they compensate for the volatility-induced bias. Minimum variance hedge ratios incorporate ρ to minimize hedging error variance by including cross-terms between asset returns and volatility changes, outperforming Black-Scholes deltas especially under high |ρ|, where the latter underhedges for ρ < 0 and overhedges for ρ > 0. Value-at-Risk (VaR) and expected shortfall (ES) computations under stochastic volatility leverage the model's ability to capture time-varying risk. Historical simulation methods generate SV paths by rescaling past returns with forecasted volatilities from the model, producing more accurate tail estimates than constant volatility assumptions; one equity-market study, for example, reports a 5-day VaR near 9.78%. Parametric approaches use moment-matching, where standardized residuals from the SV process are scaled by conditional volatility forecasts to derive distributional moments, enabling efficient VaR/ES calculation without full simulation. These techniques outperform GARCH-based estimates in capturing heteroskedasticity, particularly for international equities, where SV models generate lower but more stable risk measures across horizons. Portfolio optimization in stochastic volatility frameworks extends mean-variance analysis to incorporate uncertain risk premia.
Investors maximize expected portfolio return μ_p minus a penalty λ times the square root of the expected integrated variance, √E[∫ v_t dt], which accounts for the stochastic nature of volatility in allocation decisions. This objective leads to dynamic strategies that adjust weights toward assets with favorable volatility-return correlations, improving risk-adjusted performance compared to static models; numerical solutions under CIR-type variance processes show reduced allocation to risky assets during high-volatility regimes. Time-consistent implementations via backward stochastic differential equations ensure the policy remains optimal over multi-period horizons, balancing myopic demands with hedging against future volatility shocks. Stress testing with stochastic volatility models simulates extreme scenarios like volatility spikes, often triggered by low mean reversion κ or high volatility-of-volatility σ, to assess portfolio resilience. During the 2020 COVID-19 crisis, outlier-augmented SV models captured persistent volatility surges by combining transitory shocks with structural changes, improving forecast accuracy for U.S. macroeconomic variables and reducing bias in tail risk estimates. These applications revealed heightened default risks for over-leveraged firms under volatility spikes exceeding 50%, informing regulatory stress tests. For multi-asset portfolios, stochastic volatility integrates with copulas to model dependence across volatilities, capturing relationships beyond linear correlations. Copula-based approaches, such as time-varying symmetric Joe-Clayton (SJC) models combined with multifractal filtering, quantify asymmetric tail dependence; for example, U.S. market shocks post-2008 propagated to Chinese equities with upper tail dependence parameters rising to 0.25, highlighting spillover risks during crises. This framework enhances risk aggregation by linking marginal SV distributions via copulas, enabling better diversification under joint volatility extremes.
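The simulation-based VaR and ES computations described above can be sketched by evolving Heston-type dynamics under the physical measure over the risk horizon. The parameters below are illustrative; a production system would calibrate them to historical returns and apply the model to the full portfolio rather than a single asset:

```python
import numpy as np

def sv_var_es(S0, mu, v0, kappa, theta, sigma, rho, horizon_days,
              alpha=0.99, n_paths=100_000, seed=7):
    """Monte Carlo VaR and expected shortfall of the fractional loss over
    a horizon, simulating Heston-type dynamics under the physical measure
    with a full-truncation Euler scheme (minimal sketch)."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / 252
    S = np.full(n_paths, float(S0))
    v = np.full(n_paths, float(v0))
    for _ in range(horizon_days):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)
        S *= np.exp((mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + sigma * np.sqrt(vp * dt) * z2
    loss = -(S / S0 - 1.0)                  # loss as a fraction of value
    var = float(np.quantile(loss, alpha))   # e.g. 99% VaR
    es = float(loss[loss >= var].mean())    # average loss beyond VaR
    return var, es

# Illustrative single-asset example: 5-day 99% VaR and ES.
var99, es99 = sv_var_es(S0=100.0, mu=0.05, v0=0.04, kappa=2.0,
                        theta=0.04, sigma=0.3, rho=-0.7, horizon_days=5)
```

The negative ρ fattens the left tail relative to a constant-volatility simulation with the same average variance, which is the mechanism behind the more conservative tail estimates quoted above.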

Empirical Evidence and Limitations

Stochastic volatility (SV) models have been empirically validated for their ability to capture key stylized facts in financial time series, such as the implied volatility smile observed in option prices. For instance, analyses of S&P 500 index options demonstrate that SV frameworks, including extensions like the Heston model, successfully reproduce the skew and smile patterns by allowing volatility to vary stochastically, outperforming constant volatility assumptions in fitting market data from the 1990s onward. Similarly, these models align with the volatility smile in international stock indices, where jumps in returns enhance the fit to higher-order moments of return distributions. SV models also effectively explain volatility clustering, the phenomenon whereby large changes in asset returns tend to be followed by further large changes. GARCH variants, as discrete-time approximations of SV processes, provide strong empirical fits to daily return data, capturing persistence in volatility shocks across equity markets such as the S&P 500 over multi-year periods. For example, GARCH(1,1) specifications have been shown to predict future volatility accurately using daily S&P 500 returns from 2000 to 2011, confirming the model's robustness in capturing clustered volatility regimes. Rough volatility models, characterized by a Hurst exponent H < 0.5, offer superior empirical performance in replicating the "explosion," or rapid short-term increase, of forward volatility curves, a feature prominent in high-frequency data. These models, driven by fractional Brownian motion-like paths, match the observed roughness of log-volatility trajectories for major equity indices, with estimated H values around 0.1-0.2 providing better calibration to option surfaces than classical models. Empirical studies on SPX and VIX options further confirm that rough SV frameworks outperform smooth alternatives in capturing the relationship between Hurst exponents and volatility levels, enhancing fits to forward-starting option prices.
Seminal empirical work, such as Eraker's 2004 Bayesian analysis using MCMC methods on S&P 500 returns and options data, supports the integration of jumps into SV models, revealing significant evidence for simultaneous jumps in prices and volatility needed to reconcile spot and option markets. More recent investigations, including those by Abi Jaber and collaborators, highlight the empirical advantages of Markovian approximations to rough volatility, multifactor structures that mimic rough paths while improving tractability for SPX option pricing. Despite these strengths, SV models face notable limitations. Calibration to market data often leads to overfitting, particularly in high-dimensional settings where numerous parameters improve in-sample fit without improving out-of-sample predictions. Pure SV frameworks frequently underperform by ignoring jumps in returns and volatility, necessitating hybrid extensions to adequately capture fat-tailed distributions in equity returns. Additionally, rough volatility models incur high computational costs due to the challenges of simulating non-Markovian paths, limiting their practical implementation in real-time applications. Early SV models overlooked the rough path properties now central to modern frameworks, while recent (post-2020) integrations with machine learning for non-parametric estimation have enhanced flexibility beyond parametric assumptions, including deep-learning approaches for option pricing and calibration. Looking ahead, quantum computing offers promise for accelerating simulations in SV option pricing, reducing the cost of path-dependent evaluations.
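The volatility clustering that both GARCH and SV models capture is easy to reproduce in simulation. The sketch below generates a GARCH(1,1) path, a standard discrete-time benchmark for SV models, and checks the diagnostic used in the empirical literature: raw returns are serially nearly uncorrelated, while squared returns show positive autocorrelation (parameter values are illustrative):

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n, seed=3):
    """Simulate a GARCH(1,1) return series:
        sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2,
        r_t = sigma_t * z_t,  z_t ~ N(0, 1).
    Requires alpha + beta < 1 for covariance stationarity."""
    rng = np.random.default_rng(seed)
    var = omega / (1 - alpha - beta)        # start at unconditional variance
    r = np.empty(n)
    for t in range(n):
        r[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * r[t]**2 + beta * var
    return r

def acf1(x):
    """Lag-1 sample autocorrelation."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

# Typical daily-equity-like parameters (illustrative): high persistence
# alpha + beta = 0.98 produces visible volatility clustering.
r = simulate_garch11(omega=1e-6, alpha=0.08, beta=0.90, n=5000)
rho_r, rho_r2 = acf1(r), acf1(r**2)
```

This pair of statistics (near-zero autocorrelation of returns, positive autocorrelation of squared returns) is exactly the stylized fact that both GARCH fits and latent SV estimates are judged against in the studies cited above.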
