
Dynamic stochastic general equilibrium

Dynamic stochastic general equilibrium (DSGE) models constitute a class of quantitative macroeconomic frameworks that derive aggregate economic dynamics from the optimizing behavior of representative households and firms, subject to stochastic shocks and general equilibrium constraints, and typically solved via numerical approximation of agents' intertemporal maximization problems. These models emphasize microeconomic foundations to ensure internal consistency and policy invariance, addressing the Lucas critique by embedding structural parameters derived from first-principles utility and production functions rather than ad hoc reduced forms. Originating in the real business cycle (RBC) paradigm pioneered by Kydland and Prescott in the early 1980s, DSGE models initially attributed fluctuations primarily to real shocks propagating through flexible-price economies, achieving notable success in replicating key regularities such as the comovement of output and hours worked under technology disturbances. Subsequent extensions, particularly New Keynesian variants incorporating nominal rigidities like sticky prices and wages, integrated monetary policy rules (e.g., Taylor rules) and demand shocks, enabling analysis of inflation-output tradeoffs and stabilization policies; these have become standard tools at central banks for forecasting, shock decomposition, and counterfactual policy evaluation. Empirical estimation often employs Bayesian methods to discipline parameters with postwar U.S. and euro area data, revealing shocks like productivity and monetary disturbances as primary drivers of variances in GDP and inflation. While DSGE models' microfounded structure facilitates causal inference on policy effects—evident in their role simulating historical episodes and policy impacts—they face substantive criticisms for underperforming in capturing financial frictions and systemic crises, as evidenced by their limited foresight into the 2008–2009 downturn and struggles to match asset price or heterogeneity-driven dynamics without ad hoc augmentations. 
Proponents argue that iterative refinements, including financial accelerator mechanisms and occasionally binding constraints, enhance empirical fit, yet detractors note that the models can leave a large share—by some accounts up to 80%—of the variance in select macro aggregates unexplained, underscoring tensions between theoretical coherence and data fidelity in macroeconomic modeling.

Definition and Core Concepts

Fundamental Principles

Dynamic stochastic general equilibrium (DSGE) models are constructed from explicit microeconomic foundations, where representative households maximize intertemporal utility subject to budget constraints and firms maximize profits under technological constraints, ensuring that aggregate behavior emerges from optimizing decisions of individual agents. This approach contrasts with reduced-form models by deriving macroeconomic dynamics directly from primitive assumptions about preferences, technology, and market structures. Rational expectations form a core assumption, positing that agents form forecasts of future variables using all available information, including the model's own structure, which eliminates systematic forecast errors and ensures consistency between agents' subjective expectations and the model's statistical expectations. The dynamic aspect arises from forward-looking optimization, where agents solve infinite-horizon problems, leading to Euler equations that link current consumption or investment to expected future marginal utilities, capturing phenomena like habit persistence or adjustment costs. Stochastic elements introduce exogenous shocks, typically modeled as innovations to total factor productivity (e.g., AR(1) processes with persistence ρ ≈ 0.95 in real business cycle calibrations), preferences, or policy variables, which propagate through the economy via general equilibrium interactions. General equilibrium requires that all markets clear simultaneously, with prices adjusting to equate supply and demand, often assuming flexible prices in baseline real business cycle models or nominal rigidities in extensions like New Keynesian frameworks with Calvo pricing, where a fraction θ ≈ 0.75 of firms keep prices fixed each period and only the remaining 1 − θ update. These principles, originating in the real business cycle paradigm of Kydland and Prescott (1982), emphasize that fluctuations result primarily from real shocks rather than monetary factors, with model solutions computed via methods like log-linearization around the steady state to analyze deviations driven by shocks. 
Empirical validation involves matching simulated moments, such as the volatility of output (standard deviation ≈ 1.6% quarterly) and correlations with hours worked, to U.S. data from 1954–2000, though extensions incorporate frictions to better fit evidence on inflation persistence. The framework's insistence on equilibrium discipline ensures internal consistency but has been critiqued for assuming complete markets and representative agents, potentially overlooking heterogeneity evident in micro data.
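The AR(1) productivity process cited above can be simulated directly. The sketch below uses illustrative values (ρ = 0.95 and a quarterly innovation standard deviation of 0.007, common in RBC-style exercises but not taken from any specific study) and checks the simulated series against the analytical unconditional moments of an AR(1).

```python
import numpy as np

# Illustrative simulation of a log-TFP shock process:
#   ln z_t = rho * ln z_{t-1} + eps_t,  eps_t ~ N(0, sigma^2)
# Parameter values are illustrative, not estimates from any study.
rng = np.random.default_rng(0)
rho, sigma = 0.95, 0.007      # persistence and innovation std (quarterly)
T = 10_000
ln_z = np.zeros(T)
for t in range(1, T):
    ln_z[t] = rho * ln_z[t - 1] + sigma * rng.standard_normal()

# Analytical unconditional std of an AR(1): sigma / sqrt(1 - rho^2)
implied_std = sigma / np.sqrt(1 - rho**2)
sample_std = ln_z[1000:].std()    # drop burn-in draws

# First-order autocorrelation should be close to rho
autocorr = np.corrcoef(ln_z[1000:-1], ln_z[1001:])[0, 1]
```

The high persistence (ρ near one) is what lets a small innovation standard deviation generate sizeable, long-lived swings in the level of productivity.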

Terminology and Distinctions

The DSGE acronym breaks down into its constituent elements, each delineating a foundational aspect of the modeling approach. "Dynamic" signifies that agents—households maximizing utility over time and firms optimizing profits intertemporally—make forward-looking decisions, with economic variables evolving across discrete or continuous periods rather than in a single static snapshot. "Stochastic" denotes the incorporation of probabilistic shocks, typically modeled as exogenous random processes (e.g., autoregressive innovations to technology or preferences), which perturb the economy from its steady-state path and generate fluctuations. "General equilibrium" requires that all markets clear simultaneously, with relative prices adjusting to equate supply and demand across goods, labor, and capital markets, derived from decentralized optimizing behavior rather than imposed aggregates. DSGE models are differentiated from deterministic frameworks by their reliance on stochastic processes to capture uncertainty and variance in outcomes, enabling simulation of impulse responses to shocks via methods like perturbation or value function iteration around a steady state. In contrast to partial equilibrium models, which abstract from spillovers by holding other markets fixed (e.g., ceteris paribus analysis in microeconomics), DSGE enforces economy-wide consistency, tracing general equilibrium effects such as how a technology shock propagates through labor supply, investment, and consumption. Rational expectations further distinguish DSGE from adaptive or backward-looking alternatives, positing that agents form unbiased forecasts using all available information, so that expectations and equilibrium outcomes are mutually consistent in policy responses. A key subclassification within DSGE is between real business cycle (RBC) models and New Keynesian extensions. RBC models, the progenitors of DSGE, assume flexible prices and wages with complete market clearing, attributing business cycles primarily to real shocks like technology innovations that alter the natural rate of output. 
New Keynesian (NK) DSGE models retain the dynamic-stochastic-equilibrium structure but introduce nominal rigidities—such as Calvo-style price stickiness, where only a fraction of firms adjust prices each period—to rationalize monetary non-neutrality and demand-driven fluctuations, distinguishing them from RBC's emphasis on supply-side real shocks alone. This friction allows NK variants to model deviations from the natural (flexible-price) equilibrium, where output gaps arise because prices adjust sluggishly rather than clearing markets instantaneously.
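The Calvo mechanism has a simple worked implication worth making explicit: if a firm keeps its price each period with probability θ, price spells are geometrically distributed and the expected spell length is 1/(1 − θ). A small illustrative computation, using θ = 0.75 as in common quarterly calibrations:

```python
# Under Calvo pricing a firm keeps its price each quarter with probability
# theta, so price-spell durations are geometric with mean 1 / (1 - theta).
# theta = 0.75 is a common quarterly calibration (illustrative here).
theta = 0.75
expected_duration = 1 / (1 - theta)     # 4.0 quarters on average

# Probability a price set today is still unchanged after k quarters: theta**k
prob_unchanged_after_4q = theta ** 4    # roughly 0.316
```

So a stickiness parameter of 0.75 implies prices are reset about once a year on average, which is how the parameter is usually mapped to micro evidence on price-change frequency.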

Historical Development

Precursors in Neoclassical and Real Business Cycle Theory

Neoclassical economics laid the groundwork for DSGE models through its emphasis on general equilibrium, where markets clear simultaneously via price interactions, as formalized by Léon Walras in 1874. This static framework evolved into dynamic models incorporating intertemporal optimization, notably in the Ramsey model of 1928, which derived optimal savings and consumption paths under perfect foresight. Further developments, such as the Cass-Koopmans extension in the 1960s, integrated exogenous technological progress into representative-agent optimization, establishing the neoclassical growth model as a cornerstone for analyzing long-run economic dynamics. These models assumed rational agents maximizing utility subject to budget constraints, presaging the microfoundations central to DSGE, though they initially lacked stochastic elements and focused on deterministic steady states. The rational expectations revolution in the 1970s, building on John Muth's 1961 hypothesis, challenged macroeconometric models by insisting agents form expectations consistent with the model's structure. Robert Lucas's 1976 critique highlighted how traditional reduced-form equations failed to account for agents' behavioral responses to policy shifts, rendering them unreliable for counterfactual analysis; he advocated microfounded models where parameters remain invariant to policy changes. Lucas and Edward Prescott's work on investment under uncertainty introduced stochastic productivity shocks into dynamic optimization, demonstrating how equilibrium asset prices emerge from forward-looking agents. These insights shifted macroeconomics toward equilibrium models with explicit stochastic microfoundations, directly influencing the stochastic dynamic frameworks of later theories. Real business cycle (RBC) theory emerged as the immediate precursor to DSGE in the late 1970s and early 1980s, applying stochastic neoclassical growth models to business fluctuations. 
Finn Kydland and Edward Prescott's 1982 paper, "Time to Build and Aggregate Fluctuations," calibrated a multi-period model where technology shocks—modeled as persistent random disturbances to productivity—propagate through general equilibrium to generate cycles in output, employment, and investment matching U.S. postwar data. Unlike Keynesian approaches attributing cycles to demand deficiencies, RBC posited real supply-side shocks as primary drivers, with flexible prices ensuring market clearing and agents optimizing intertemporally under rational expectations. Early RBC contributions, including Long and Plosser's 1983 multivariate shock propagation analysis, emphasized sectoral interdependencies amplifying aggregate volatility. These models operationalized dynamic stochastic general equilibrium by solving for policy functions via numerical methods like value function iteration, establishing calibration over estimation and paving the way for DSGE extensions that retained RBC's core while adding frictions.

Emergence and Formalization of DSGE

The emergence of dynamic stochastic general equilibrium (DSGE) models occurred in the early 1980s, building directly on real business cycle (RBC) theory, which emphasized technology shocks as drivers of economic fluctuations through microfounded intertemporal optimization by rational agents. Finn Kydland and Edward Prescott's 1982 paper introduced the foundational RBC framework, modeling the economy as a sequence of competitive equilibria subject to stochastic disturbances, calibrated to match U.S. data such as output volatility and persistence. This approach formalized dynamic general equilibrium with stochastic elements, departing from earlier Keynesian models by insisting on explicit optimization under rational expectations to avoid the Lucas critique. The term "DSGE" itself first appeared in Robert King and Charles Plosser's 1984 paper on RBC models, encapsulating the integration of dynamic optimization, stochastic processes, and general equilibrium clearing across markets over time. Early DSGE implementations, such as Gary Hansen's 1985 indivisible labor variant, refined RBC by incorporating realistic labor supply frictions while maintaining tractable solutions via log-linear approximations around steady states. These models quantified impulse responses to shocks, demonstrating how real shocks could account for 70-90% of postwar U.S. output variance in calibrated exercises. Formalization accelerated in the 1990s through the New Neoclassical Synthesis (NNS), which merged RBC microfoundations with New Keynesian nominal rigidities to address empirical shortcomings like monetary non-neutrality. Key innovations included Julio Rotemberg and Michael Woodford's 1997 quadratic adjustment costs for prices and Guillermo Calvo's 1983 staggered pricing mechanism, enabling tractable aggregation in infinite-horizon settings. Marvin Goodfriend and Robert King's 1997 overview codified the NNS as a benchmark, featuring Euler equations for households, New Keynesian Phillips curves, and Taylor rules for monetary policy, solved via perturbation methods. 
This era saw DSGE models gain prominence in central banks, with Bayesian estimation techniques—building on Christopher Sims's likelihood-based macroeconometrics of the 1990s—allowing full-system likelihood evaluation and parameter discipline from data moments. By the early 2000s, models like those by Lawrence Christiano, Martin Eichenbaum, and Charles Evans incorporated habit formation and variable capital utilization, achieving better fits to inflation-output dynamics without ad hoc elements.

Key Figures and Milestones

Finn E. Kydland and Edward C. Prescott laid the quantitative foundation for DSGE models through their 1982 paper "Time to Build and Aggregate Fluctuations," which developed a real business cycle framework where technology shocks propagate through multi-period production lags to generate observed fluctuations in a dynamic general equilibrium setting. Their approach emphasized calibration over traditional econometric estimation to evaluate model fit against empirical data, marking a shift toward microfounded, equilibrium-based simulations of aggregate dynamics. Robert E. Lucas Jr. contributed foundational theoretical elements in the 1970s, including rational expectations equilibria in his 1972 "Expectations and the Neutrality of Money" model, where agents in dispersed "islands" update beliefs based on noisy signals, yielding non-neutral monetary shocks with persistent real effects. Lucas's 1976 critique further underscored the need for DSGE-style models by arguing that policy-invariant structural parameters require explicit optimization under rational expectations, rendering reduced-form relationships unreliable for counterfactual analysis. Kydland and Prescott received the 2004 Nobel Prize in Economic Sciences for advancing dynamic macroeconomic analysis, including time-inconsistency problems from their 1977 work and the RBC paradigm's empirical discipline. In the 1990s, the framework evolved via the New Neoclassical Synthesis, integrating New Keynesian nominal frictions into DSGE structures, as synthesized by Goodfriend and King in 1997, facilitating hybrid models blending real and monetary propagation mechanisms.

Theoretical Foundations

Microfoundations and Rational Expectations

Microfoundations in dynamic stochastic general equilibrium (DSGE) models derive aggregate economic dynamics from the optimizing behavior of individual agents, including households that maximize intertemporal utility subject to budget constraints and firms that maximize profits given production technologies and market conditions. These models emphasize agents' forward-looking decisions, incorporating constraints such as borrowing limits and information sets, to generate equilibrium paths that clear markets over time. The representative agent paradigm is commonly employed to simplify aggregation, positing a single agent whose choices replicate economy-wide outcomes under identical preferences and endowments, though this has been critiqued for overlooking heterogeneity in empirical distributions. Rational expectations form a cornerstone of these microfoundations, positing that economic agents form forecasts of future variables using all available information, including the economy's probabilistic structure, such that expectations are unbiased and equivalent to mathematical projections under the model's laws of motion. First proposed by John F. Muth in 1961 as a hypothesis about firm expectations in competitive markets, it implies no systematic errors in predictions, contrasting with adaptive schemes reliant on past errors. Robert E. Lucas Jr. extended this to macroeconomics in the 1970s, integrating it with micro-optimizing agents to critique Keynesian models for ignoring expectation-driven behavioral shifts. In DSGE frameworks, rational expectations ensure consistency between agents' beliefs and equilibrium outcomes, solved via methods like perturbation around steady states or value function iteration, where agents' policy functions incorporate model-consistent forecasts of shocks and endogenous variables. This assumption underpins the Lucas critique's application: policy rules alter agents' decision rules, rendering parameter estimates from historical data unreliable for counterfactuals unless expectations adjust endogenously. 
Empirical implementations, such as New Keynesian DSGE variants, embed rational expectations in Euler equations for consumption and Taylor rules for monetary policy, yielding log-linearized systems solvable for responses to shocks like technology or monetary disturbances. While enabling tractable general equilibrium analysis, the assumption presumes agents' full knowledge of the model, an idealization tested against survey data revealing deviations, such as underreaction to news.
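Model-consistent expectations can be made concrete in the Brock-Mirman special case (log utility, Cobb-Douglas production, full depreciation), where the rational-expectations policy functions are known in closed form: consumption c = (1 − αβ)zk^α and next-period capital k′ = αβzk^α. The sketch below, with illustrative parameters, verifies numerically that these policies satisfy the stochastic Euler equation 1/c_t = βE_t[(1/c_{t+1})·αz_{t+1}k_{t+1}^{α−1}] for arbitrary shock draws.

```python
import numpy as np

# Brock-Mirman benchmark: log utility, y = z * k**alpha, full depreciation.
# Known closed-form rational-expectations policies (illustrative parameters):
alpha, beta = 0.33, 0.99
rng = np.random.default_rng(1)

def c_policy(z, k):
    # consumption policy: c = (1 - alpha*beta) * z * k**alpha
    return (1 - alpha * beta) * z * k**alpha

def k_policy(z, k):
    # capital policy: k' = alpha*beta * z * k**alpha
    return alpha * beta * z * k**alpha

z, k = 1.2, 0.5                      # an arbitrary current state
kp = k_policy(z, k)
z_next = np.exp(0.01 * rng.standard_normal(100_000))  # lognormal shock draws

# Euler equation: 1/c_t = beta * E[ (1/c_{t+1}) * alpha * z' * kp**(alpha-1) ]
rhs = beta * np.mean(alpha * z_next * kp**(alpha - 1) / c_policy(z_next, kp))
lhs = 1 / c_policy(z, k)
euler_error = abs(lhs - rhs) / lhs
```

Because the future shock cancels inside the expectation in this special case, the Euler residual is zero to machine precision, illustrating what "model-consistent" forecasts mean in practice.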

Stochastic Elements and Shocks

In DSGE models, stochastic elements manifest primarily through exogenous shocks that introduce randomness into the economy's evolution, serving as the primary drivers of fluctuations around the deterministic steady state. These shocks represent unpredictable disturbances to underlying economic processes, such as shifts in technology or policy rules, to which optimizing agents respond under rational expectations. Unlike deterministic models, the framework allows computation of probability distributions over future states, enabling analysis of uncertainty's effects on decisions like consumption and investment. Shocks are conventionally modeled as stationary processes to ensure long-run stability, most often as first-order autoregressive (AR(1)) specifications in logarithms: \ln z_t = \rho \ln z_{t-1} + \epsilon_t, where 0 < \rho < 1 governs persistence, \epsilon_t \sim N(0, \sigma^2) is a white-noise innovation, and z_t scales the affected variable multiplicatively. This setup captures both transitory and persistent impacts while maintaining tractability for solution methods like log-linearization and the Blanchard-Kahn conditions. Some models employ ARMA(1,1) processes for shocks requiring greater flexibility to match empirical autocorrelations, such as wage or price mark-up disturbances: \ln \epsilon_{w,t} = (1 - \rho_w) \ln \bar{\epsilon}_w + \rho_w \ln \epsilon_{w,t-1} - \theta_w \eta_{w,t-1} + \eta_{w,t}. A canonical set of shocks in medium-scale DSGE models includes seven orthogonal structural disturbances: total factor productivity shocks (affecting neutral technology via AR(1)), risk premium shocks (altering intertemporal substitution via AR(1)), investment-specific technology shocks (boosting capital efficiency via AR(1)), wage mark-up shocks (distorting labor margins via ARMA(1,1)), price mark-up shocks (impacting goods pricing via ARMA(1,1)), exogenous government spending shocks (influencing aggregate demand via AR(1) with productivity linkages), and monetary policy shocks (deviations from Taylor rules via AR(1)). 
Orthogonality assumes uncorrelated innovations, facilitating variance decompositions in which, for instance, risk shocks can account for approximately 60% of U.S. output variance in models with financial frictions. Early real business cycle variants emphasized technology shocks as the dominant force, positing that efficient responses to productivity innovations explain most aggregate variability. Modern New Keynesian extensions incorporate nominal and financial frictions alongside demand-side shocks to better replicate data features like inflation persistence and countercyclical markups, though debates persist on shock identification and the relative roles of supply versus demand disturbances. Empirical estimation, often Bayesian, calibrates shock parameters to match second moments of observables, revealing that monetary policy shocks induce hump-shaped output responses while productivity shocks generate prolonged expansions.
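The difference between the AR(1) and ARMA(1,1) specifications above can be seen directly by simulation. The sketch below uses illustrative parameters (ρ = 0.9, MA coefficient θ = 0.6, not estimates from any study) and compares first-order autocorrelations: the MA term pulls the ARMA process's autocorrelation well below the AR(1)'s, which is precisely the extra flexibility mark-up shocks exploit.

```python
import numpy as np

# Illustrative comparison of the two shock specifications:
#   AR(1):      x_t = rho * x_{t-1} + eta_t
#   ARMA(1,1):  x_t = rho * x_{t-1} + eta_t - theta * eta_{t-1}
rng = np.random.default_rng(2)
rho, theta, T = 0.9, 0.6, 100_000
eta = rng.standard_normal(T)

ar1 = np.zeros(T)
arma = np.zeros(T)
for t in range(1, T):
    ar1[t] = rho * ar1[t - 1] + eta[t]
    arma[t] = rho * arma[t - 1] + eta[t] - theta * eta[t - 1]

def acf1(x):
    # first-order sample autocorrelation
    x = x - x.mean()
    return (x[:-1] * x[1:]).mean() / (x * x).mean()

ar1_acf, arma_acf = acf1(ar1), acf1(arma)
```

For these parameters the AR(1) autocorrelation is close to 0.9 while the ARMA(1,1)'s is near 0.5, so the ARMA form can match shocks that are persistent in level but display weaker short-run serial correlation in the data.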

Dynamic General Equilibrium Framework

The dynamic general equilibrium framework in DSGE models describes an economy's evolution over time as a sequence of allocations and prices that satisfy agents' intertemporal optimization conditions while ensuring all markets clear in every period, accounting for stochastic disturbances. This extends static general equilibrium theory—where supply equals demand simultaneously across markets at a single point—by incorporating forward-looking behavior, where current decisions influence future states through capital accumulation, habit formation, or other state variables. In frictionless versions, equilibrium paths are Pareto optimal in expectation, derived from decentralized decisions under rational expectations without requiring a social planner; with distortions such as monopolistic competition, equilibria remain well-defined but need not be efficient. Central to this framework are the first-order conditions from household and firm optimization, which yield Euler equations linking consumption or output growth to interest rates and expected future variables, alongside transversality conditions ensuring finite present-value debts. For instance, a representative household maximizes expected lifetime utility \mathbb{E}_0 \sum_{t=0}^\infty \beta^t u(c_t, 1 - n_t), subject to budget constraints involving stochastic income or productivity shocks, leading to intratemporal labor supply conditions equating marginal rates of substitution to real wages. Firms, often modeled with monopolistic competition, set prices or quantities to maximize profits under production functions like y_t = a_t k_t^\alpha n_t^{1-\alpha}, where a_t follows a stochastic process. Market clearing requires aggregate demand to equal supply for goods, labor, and capital each period, closing the model. These elements ensure the framework captures causal linkages, such as how productivity shocks propagate through investment decisions to affect long-run growth paths. 
In equilibrium, the system's nonlinear dynamics are typically analyzed via log-linear approximations around a deterministic steady state, where variables grow at constant rates absent shocks, facilitating computation of impulse responses and welfare comparisons. This approximation preserves the general equilibrium consistency, as deviations from steady state reflect shock-driven fluctuations, with policy interventions evaluated against counterfactuals that maintain optimality and clearing conditions. Empirical implementations, such as those used by central banks, embed nominal rigidities (e.g., Calvo pricing) while preserving the underlying dynamic equilibrium structure, though debates persist on whether such frictions distort causal inference from microfoundations. The framework's rigor stems from its requirement that all endogenous variables—output, inflation, interest rates—emerge jointly from primitive shocks and parameters, avoiding ad hoc aggregates.
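The deterministic steady state around which such models are log-linearized can be computed directly in simple cases. A minimal sketch for the textbook RBC model with labor normalized to one and illustrative quarterly parameters: from the steady-state Euler equation 1 = β(αk^{α−1} + 1 − δ), the capital stock solves αk^{α−1} = 1/β − 1 + δ.

```python
# Deterministic steady state of a textbook RBC model with y = k**alpha
# (labor normalized to 1). Parameter values are illustrative quarterly
# choices, not a calibration from any particular study.
alpha, beta, delta = 0.33, 0.99, 0.025

r_req = 1 / beta - 1 + delta                  # required net return on capital
k_ss = (alpha / r_req) ** (1 / (1 - alpha))   # from alpha*k**(alpha-1) = r_req
y_ss = k_ss ** alpha                          # steady-state output
i_ss = delta * k_ss                           # investment covers depreciation
c_ss = y_ss - i_ss                            # goods market clearing
```

With these values the implied quarterly capital-output ratio is α divided by the required return, on the order of nine to ten, which is the kind of long-run target typically used to discipline α, β, and δ in calibration.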

Model Components and Methods

Household and Firm Optimization

In dynamic stochastic general equilibrium (DSGE) models, households are typically represented by a continuum of identical agents who maximize expected lifetime utility over consumption, leisure, and possibly other variables such as housing or financial assets. The utility function often takes a separable form, such as U = E_0 \sum_{t=0}^\infty \beta^t \left[ \frac{C_t^{1-\sigma}}{1-\sigma} - \frac{N_t^{1+\phi}}{1+\phi} \right], where C_t denotes consumption, N_t labor supply, \beta < 1 the discount factor, \sigma > 0 the inverse intertemporal elasticity of substitution, and \phi > 0 the inverse Frisch elasticity of labor supply; this setup derives from constant relative risk aversion (CRRA) preferences and ensures tractable intertemporal substitution under uncertainty. Households face a budget constraint incorporating labor income, profits from firms, returns from capital holdings, transfers or taxes, and borrowing/saving via bonds, with capital accumulation governed by K_{t+1} = (1-\delta) K_t + I_t, where I_t is investment and \delta the depreciation rate; optimization yields Euler equations linking marginal utilities across periods, such as u_c(C_t) = \beta E_t \left[ u_c(C_{t+1}) (1 + r_{t+1} - \delta) \right], reflecting rational expectations of future returns r_{t+1}. Firms in DSGE frameworks optimize expected profits subject to production technologies incorporating stochastic shocks, often distinguishing between perfectly competitive final goods producers and monopolistically competitive intermediate goods sectors in New Keynesian variants. Representative firms maximize \max E_t \sum_{s=0}^\infty \beta^s \left[ P_{t+s} Y_{t+s} - MC_{t+s} Y_{t+s} \right] (adjusted for price stickiness via Calvo or Rotemberg mechanisms), where Y_t = A_t K_t^\alpha N_t^{1-\alpha} follows a Cobb-Douglas technology with shock A_t following \log A_t = \rho_a \log A_{t-1} + \epsilon_t, \alpha the capital share, and MC_t derived from factor prices; cost minimization implies factor demands w_t = MC_t (1-\alpha) Y_t / N_t for wages and r_t^k = MC_t \alpha Y_t / K_t for capital rentals. 
In real business cycle (RBC) foundations, firms operate under perfect competition with flexible prices, equating prices to marginal costs instantaneously, whereas New Keynesian extensions introduce nominal rigidities where a fraction of firms cannot adjust prices each period, leading to dynamic markup adjustments \mu_t = P_t / MC_t > 1. These microfoundations ensure general equilibrium consistency by deriving aggregate dynamics from decentralized decisions under rational expectations, though empirical calibration often reveals tensions with observed heterogeneity in agent behavior.
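The factor-demand conditions above have a checkable implication: under constant returns to scale, payments to labor and capital at those demand schedules exactly exhaust real marginal cost times output. A small illustrative check, with all numbers hypothetical:

```python
# Illustrative check of the Cobb-Douglas factor demands:
#   w  = MC * (1 - alpha) * Y / N   (labor)
#   rk = MC * alpha * Y / K         (capital)
# Constant returns imply w*N + rk*K = MC*Y exactly. Numbers are hypothetical.
alpha, A, K, N, MC = 0.33, 1.0, 10.0, 0.9, 0.8

Y = A * K**alpha * N**(1 - alpha)
w = MC * (1 - alpha) * Y / N        # wage from the labor demand condition
r_k = MC * alpha * Y / K            # rental rate from capital demand
total_payments = w * N + r_k * K    # should equal MC * Y
markup = 1 / MC                     # gross markup P/MC with P normalized to 1
```

With MC below one, the gap between revenue and factor payments is monopoly profit, which is how the markup μ_t = P_t/MC_t shows up in the firm's flow accounting.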

Market Clearing and Equilibrium Conditions

In dynamic stochastic general equilibrium (DSGE) models, the equilibrium is defined as a set of endogenous variables and prices that satisfy the first-order optimality conditions derived from households' and firms' maximization problems, the evolution of exogenous stochastic processes, and market-clearing conditions ensuring that supply equals demand across all markets in each period. These conditions collectively characterize the model's equilibrium paths, often approximated via log-linearization around a steady state to facilitate computation and analysis. Market-clearing conditions form the backbone of the general equilibrium framework, imposing feasibility constraints that aggregate individual decisions into economy-wide consistency. In a standard closed-economy representative-agent DSGE model, the goods market clears when total output equals total absorption: Y_t = C_t + I_t + G_t, where Y_t denotes aggregate output, C_t private consumption, I_t investment, and G_t government spending. The labor market clears analogously, equating firms' aggregate labor demand—derived from marginal products—with households' labor supply, often N_t^s = N_t^d = L_t, where N represents hours and L labor input. Capital market clearing, where relevant, balances households' capital holdings with firms' demands, such as K_t = \sum_f K_{f,t}, ensuring no excess supply of productive assets. In open-economy or multi-sector extensions, additional clearing conditions apply, such as for internationally traded or sector-specific goods, where net exports adjust to balance trade: NX_t = Y_t - C_t - I_t - G_t. Financial asset markets clear through zero net supply of bonds or claims, consistent with households' budget constraints and no-Ponzi conditions, preventing arbitrage opportunities under rational expectations. These conditions hold period-by-period, even amid shocks, reflecting the Walrasian clearing implicit in the models' competitive structure. 
New Keynesian variants of DSGE models retain quantity market clearing but incorporate nominal rigidities, such as sticky prices or wages, which prevent full price adjustment and thus equilibrium at flexible-price levels; quantities still equate supply and demand, but output gaps emerge due to distortionary markups or adjustment costs. Estimation and solution methods, like Bayesian approaches in Smets-Wouters frameworks, enforce these conditions to match empirical moments, ensuring the model's implied equilibria align with observed data under parameter uncertainty. Deviations from clearing, if modeled via frictions, are explicitly parameterized rather than assumed away, maintaining microfounded consistency.
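Market clearing can be illustrated in a stripped-down static version of such a model: labor demand comes from the firm's marginal product condition, labor supply from a household MRS condition (log utility in consumption, so χN^φ·C = w), and goods market clearing ties consumption to output net of replacement investment. The sketch below, with illustrative parameters and a hand-rolled bisection, finds the employment level at which the two wage schedules coincide.

```python
# Stripped-down static clearing exercise (illustrative parameters, not a
# calibration): find N such that the firm's wage offer equals the
# household's required wage, with C pinned down by goods market clearing.
alpha, delta, phi, chi = 0.33, 0.025, 1.0, 8.0
A, K = 1.0, 10.0

def excess_wage(N):
    Y = A * K**alpha * N**(1 - alpha)
    C = Y - delta * K                                    # goods clearing (no G)
    w_demand = (1 - alpha) * A * K**alpha * N**(-alpha)  # firm FOC
    w_supply = chi * N**phi * C                          # household MRS
    return w_demand - w_supply

# Simple bisection on [0.05, 2.0], where the excess wage changes sign
lo, hi = 0.05, 2.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if excess_wage(lo) * excess_wage(mid) <= 0:
        hi = mid
    else:
        lo = mid
N_star = 0.5 * (lo + hi)

Y_star = A * K**alpha * N_star**(1 - alpha)
C_star = Y_star - delta * K
goods_residual = Y_star - C_star - delta * K   # zero by construction
```

At the solution both markets clear simultaneously: the goods residual is zero by construction, and the labor market residual vanishes at N_star, which is the static analogue of the period-by-period clearing the dynamic model imposes.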

Solution and Estimation Techniques

Dynamic stochastic general equilibrium (DSGE) models are typically solved by approximating the nonlinear system of Euler equations and market-clearing conditions around the model's steady state. The most widely used approach is perturbation methods, which employ Taylor series expansions to derive local approximations of agents' policy functions. First-order perturbations, equivalent to log-linearization, linearize the model and yield a linear system solvable via methods such as the Blanchard-Kahn conditions or generalized eigenvalue decompositions, facilitating analytical solutions for unconditional moments and impulse responses. Higher-order perturbations, such as second- or third-order expansions, incorporate nonlinear effects like risk premia or asymmetry in responses, improving accuracy for welfare analysis or stochastic simulations, though they increase computational demands. These methods, formalized in works like Aruoba, Fernández-Villaverde, and Rubio-Ramírez (2006), rely on the implicit function theorem to compute derivatives and are implemented in software like Dynare or comparable toolboxes. For models with significant nonlinearities or occasionally binding constraints, global solution techniques—such as projection methods (e.g., Chebyshev polynomials) or endogenous grid methods—provide more accurate but computationally intensive approximations by solving over the entire state space. Estimation of DSGE model parameters involves matching model-implied moments or likelihoods to macroeconomic data, such as output growth, inflation, and interest rates. Classical methods use maximum likelihood estimation (MLE), where the likelihood is evaluated via the Kalman filter on the state-space representation obtained from the log-linearized model, allowing inference on structural parameters like intertemporal elasticities or shock volatilities. 
Bayesian estimation has become predominant since the mid-2000s, incorporating prior distributions (e.g., informative priors from microeconomic evidence or diffuse priors for deep parameters) and computing posteriors via Markov chain Monte Carlo (MCMC) algorithms, as in the Smets-Wouters framework applied to U.S. and euro area data. This approach addresses parameter identification issues and quantifies uncertainty, though it requires careful prior elicitation to avoid priors unduly dominating the data. Limited-information methods, such as the generalized method of moments (GMM) or indirect inference, estimate subsets of parameters by matching impulse responses or second moments from vector autoregressions (VARs) to model-generated counterparts, offering robustness to misspecification but less efficiency than full-information approaches. Calibration remains a complementary technique for parameters poorly identified by the data, setting values based on long-run targets or micro studies, while estimation focuses on behavioral and shock parameters. Empirical applications, like those by Christiano, Eichenbaum, and Evans (2005), demonstrate how these techniques enable model comparison via marginal likelihoods or posterior odds in Bayesian setups.
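The logic of these linear rational-expectations solvers is easiest to see in the scalar case, where the method of undetermined coefficients delivers the solution in one line. A minimal sketch with illustrative coefficients: for x_t = a·E_t[x_{t+1}] + b·z_t with z_t an AR(1) of persistence ρ, guessing x_t = ψz_t gives ψ = aψρ + b, hence ψ = b/(1 − aρ).

```python
# Method of undetermined coefficients for the scalar RE model
#   x_t = a * E_t[x_{t+1}] + b * z_t,   z_t = rho * z_{t-1} + eps_t.
# Guess x_t = psi * z_t; then E_t[x_{t+1}] = psi * rho * z_t, so
# psi = a*psi*rho + b  =>  psi = b / (1 - a*rho).
# |a| < 1 here plays the role of the Blanchard-Kahn stability requirement
# in the scalar case. Coefficients are illustrative.
a, b, rho = 0.8, 1.0, 0.9

psi = b / (1 - a * rho)

# Verify the equilibrium condition holds for any z_t:
residual = abs(a * psi * rho + b - psi)
```

Multivariate solvers (Blanchard-Kahn, generalized Schur/QZ decompositions) generalize exactly this fixed-point calculation to systems with many forward-looking and predetermined variables.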

Applications in Policy and Analysis

Monetary Policy Simulation

Dynamic stochastic general equilibrium (DSGE) models simulate monetary policy by embedding central bank reaction functions, typically Taylor rules of the form i_t = r^* + \pi^* + \alpha (\pi_t - \pi^*) + \beta (y_t - y^*), where i_t is the nominal interest rate, r^* the equilibrium real rate, \pi^* the inflation target, y_t - y^* the output gap, and parameters \alpha > 1, \beta \geq 0 ensure stability under the Taylor principle. These rules link policy rates to deviations in inflation and output, allowing the model to generate impulse response functions to monetary shocks or policy shifts. Simulations solve the model via log-linearization around the steady state, tracing variable paths over horizons of quarters to years. Central banks utilize DSGE simulations to evaluate policy transmission and trade-offs. For example, the Federal Reserve's FRB/US model, a semi-structural framework used alongside its DSGE models, assesses monetary tightening's effects on inflation and GDP amid supply shocks, as in analyses of post-2008 quantitative easing transitions. Simulations reveal that forward guidance in low-rate environments amplifies stimulative effects but risks attenuation if credibility wanes, with optimal policy balancing inflation stabilization against output volatility. The European Central Bank's NAWM model similarly simulates euro-area responses to policy rate hikes, projecting inflation declines of 1-2 percentage points within two years under calibrated rules. Comparative simulations test alternative rules, such as nominal GDP targeting versus Taylor rules, finding the former reduces volatility in inflation and output gaps by 10-20% in calibrated New Keynesian DSGE setups with financial frictions. Bayesian estimation refines parameters using data from 1980 onward, enabling counterfactuals like the 2022 U.S. rate hikes' simulated dampening of demand-driven inflation by 0.5-1% annually. These exercises inform central bank communication, as DSGE-derived projections underpin statements on policy paths.
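The mechanics can be illustrated on the textbook three-equation New Keynesian model (IS curve, Phillips curve, Taylor rule) with an AR(1) monetary policy shock, solved by undetermined coefficients. This is a hedged sketch with a Galí-style illustrative calibration, not any central bank's model: guessing x_t = ψ_x·v_t and π_t = ψ_π·v_t reduces the model to a 2×2 linear system.

```python
import numpy as np

# Textbook three-equation NK model with monetary shock v_t (AR(1), rho_v):
#   IS:       x_t = E_t[x_{t+1}] - (1/sigma)*(i_t - E_t[pi_{t+1}])
#   Phillips: pi_t = beta*E_t[pi_{t+1}] + kappa*x_t
#   Taylor:   i_t = phi_pi*pi_t + phi_x*x_t + v_t
# Illustrative calibration (textbook-style, not estimated):
beta, sigma, kappa = 0.99, 1.0, 0.1275
phi_pi, phi_x, rho_v = 1.5, 0.125, 0.5

# Substituting x_t = psi_x*v_t, pi_t = psi_pi*v_t yields the linear system:
#   [sigma*(1-rho_v) + phi_x   phi_pi - rho_v ] [psi_x ]   [-1]
#   [        -kappa            1 - beta*rho_v ] [psi_pi] = [ 0]
A = np.array([[sigma * (1 - rho_v) + phi_x, phi_pi - rho_v],
              [-kappa,                      1 - beta * rho_v]])
psi_x, psi_pi = np.linalg.solve(A, np.array([-1.0, 0.0]))

# Impulse responses to a unit contractionary shock decay at rate rho_v
horizons = np.arange(12)
irf_x = psi_x * rho_v**horizons
irf_pi = psi_pi * rho_v**horizons
```

A contractionary shock (positive v_t) lowers both the output gap and inflation on impact, and both responses die out geometrically with the shock's persistence, which is the qualitative pattern the text describes for monetary tightening.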

Fiscal Policy and Business Cycle Decomposition

In dynamic stochastic general equilibrium (DSGE) models, is represented through exogenous shocks to and taxes, which influence via the resource constraint Y_t = C_t + I_t + G_t and distort labor supply through tax wedges on wages and capital income. The government intertemporal ensures debt sustainability, typically enforced via future tax adjustments, leading to under complete markets and , where financed spending crowds out private consumption without net stimulus. Distortionary taxation, however, generates deadweight losses, reducing multipliers below one in real variants, though New Keynesian extensions with sticky prices and wages can yield multipliers up to 1.5 for temporary spending increases during liquidity traps. Business cycle decomposition in DSGE frameworks relies on Bayesian estimation to identify structural shocks and quantify their contributions to macroeconomic fluctuations via variance decompositions and historical decompositions. Variance decompositions partition the variance of variables like output into shares attributable to fiscal shocks versus , monetary, or markup shocks; for instance, in a medium-scale model of the U.S. economy calibrated to 1966–2004 data, exogenous shocks explain over 50% of output variance within one year, declining to under 20% at longer horizons as productivity shocks dominate. Fiscal shocks to transfers or -financed spending contribute minimally to variance (<5% at frequencies) but up to 43% to debt ratio fluctuations, highlighting their role in fiscal dynamics over output stabilization. Historical decompositions extend this by tracing cumulative shock impacts over specific episodes, such as attributing portions of U.S. recessions to negative fiscal shocks amplifying demand contractions. 
In extensions incorporating fiscal backing—where future surpluses only partially offset debt issuance—estimated backing parameters around 0.83 imply fiscal shocks propagate through inflation and output with amplified persistence, explaining episodes like the 1970s inflation partly as fiscal-led. These tools inform policy design by isolating causal channels, though model identification relies on priors and exclusion restrictions, with fiscal shocks often secondary to supply or demand factors in postwar U.S. cycles.
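
The horizon-dependence of variance decompositions noted above can be shown with a minimal sketch. The two-component setup and all parameter values are assumptions for illustration: output is the sum of a transitory "fiscal" component and a persistent "technology" component, and the fiscal shock's share of the forecast-error variance shrinks at longer horizons as the persistent shock accumulates.

```python
# Illustrative forecast-error variance decomposition (FEVD) for output
# driven by two independent AR(1) components. Persistence and shock
# standard deviations below are made-up illustrative values.

def fevd_fiscal_share(h, rho_f=0.5, rho_a=0.95, sig_f=1.0, sig_a=0.5):
    """Share of the h-step-ahead forecast-error variance of output
    attributable to the (transitory) fiscal shock."""
    var_f = sum(rho_f ** (2 * k) for k in range(h)) * sig_f ** 2
    var_a = sum(rho_a ** (2 * k) for k in range(h)) * sig_a ** 2
    return var_f / (var_f + var_a)

for h in (1, 4, 20):
    print(f"h={h}: fiscal share = {fevd_fiscal_share(h):.2f}")
```

The share starts high on impact and declines with the horizon, mirroring the pattern reported for demand-type shocks versus productivity shocks in the estimated models above.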

Central Bank and International Uses

Central banks worldwide integrate dynamic stochastic general equilibrium (DSGE) models into their core frameworks for policy design, forecasting, and scenario analysis, leveraging the models' ability to incorporate microeconomic foundations and stochastic shocks for coherent policy simulations. A survey indicates that approximately 80% of DSGE policy models are developed in-house by central banks, reflecting their central role in institutional modeling efforts. These models enable policymakers to quantify the transmission of interest rate changes, evaluate unconventional tools like quantitative easing, and decompose fluctuations into supply, demand, and monetary shocks. In the United States, the Federal Reserve System maintains several DSGE models tailored for forecasting and policy evaluation. The Federal Reserve Bank of New York's DSGE model, operational since the mid-2000s, supports quarterly economic projections by simulating responses to identified shocks such as productivity or financial disturbances. The Chicago Fed's DSGE model, updated to version 2 in 2023, aids in analyzing inflation dynamics and output gaps under alternative policy rules. Another Federal Reserve model, detailed in a 2024 technical report, incorporates regime-switching mechanisms to assess non-linear effects in post-pandemic environments. These tools complement larger semi-structural models like FRB/US but provide rigorous microfounded alternatives for counterfactual exercises. The European Central Bank (ECB) employs DSGE models as part of a multi-model suite for euro area projections and strategy reviews. The New Area Wide Model (NAWM), refined iteratively since its inception in the mid-2000s, uses Bayesian estimation to forecast GDP, inflation, and interest rates, incorporating euro-area-specific features like labor market rigidities and fiscal interactions. In assessments such as the 2023 review of transmission mechanisms, NAWM II alongside semi-structural models evaluates the impact of rate hikes on lending and activity, highlighting channels like bank funding costs.
Other ECB-affiliated models extend to open-economy settings for spillover analysis across member states. Beyond national institutions, international organizations like the International Monetary Fund (IMF) deploy DSGE models for cross-border policy advice and global coordination. The IMF's Global Integrated Monetary and Fiscal (GIMF) model, a multi-country framework calibrated since the early 2000s, simulates fiscal-monetary interactions and trade spillovers, informing World Economic Outlook scenarios as of 2022 updates. Specialized variants, such as those incorporating foreign exchange interventions in small open economies, support country-specific recommendations, including for emerging markets facing capital flow volatility. The IMF also trains policymakers on DSGE applications for integrated policy analysis, emphasizing their utility in quantifying policy trade-offs amid external shocks. Additional central banks, such as the Bank of Canada (ToTEM model) and the Bank of England (BEQM), use comparable DSGE setups for domestic cycle decomposition and international linkage studies.

Empirical Performance

Forecasting and Predictive Power

Dynamic stochastic general equilibrium (DSGE) models are routinely used by central banks for macroeconomic forecasting, with empirical assessments indicating that their unconditional predictive accuracy for variables such as GDP growth, inflation, and interest rates is broadly comparable to that of vector autoregression (VAR) and autoregressive benchmarks over short to medium horizons. For instance, root mean squared error (RMSE) metrics from studies spanning 1984–2007 show DSGE RMSEs for one-quarter-ahead GDP growth ranging from 0.45% to 0.66%, inflation from 0.18% to 0.32%, and interest rates from 0.11% to 0.21%, often matching or slightly exceeding autoregressive (AR(2)) performance at horizons up to four quarters. Conditional forecasts, incorporating external nowcasts like survey data, further enhance short-term precision, reducing RMSEs for output growth from approximately 0.58% to 0.43% at one quarter ahead in Smets-Wouters-style models. Despite these relative strengths, DSGE forecasts exhibit limited absolute predictive power, with low explanatory R² values (often near zero at horizons beyond one quarter ahead) akin to other macroeconomic models during the Great Moderation period (1992–2004), performing only marginally better than naive or constant benchmarks. Standard pre-2008 DSGE frameworks notably failed to anticipate the global financial crisis, as they largely omitted financial frictions, banking sector vulnerabilities, and leverage dynamics, leading to overly optimistic projections of stable growth and low volatility. Post-2008 evaluations reveal mixed improvements from model augmentations; for example, New York Federal Reserve DSGE variants incorporating financial frictions (SWFF) produced output growth forecasts during the 2011–2016 recovery with RMSEs competitive with Blue Chip consensus surveys (0.2%–1.0% across one- to eight-quarter horizons) and outperformed median projections at longer horizons, while inflation forecasts aligned closely with Survey of Professional Forecasters results.
Nonetheless, these models struggled with regime shifts like the zero-lower-bound period, where judgmental adjustments in projections adapted more effectively than purely model-driven outputs. Overall, while DSGE models provide structural coherence for policy analysis, their empirical edge diminishes for rare tail events, underscoring persistent challenges in capturing nonlinearities and exogenous shocks beyond baseline calibrations.
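
The RMSE comparisons cited in this section reduce to a simple computation, sketched here against a constant benchmark; the forecasts and outcomes below are made-up numbers purely to show the mechanics, not real data.

```python
# Root mean squared error (RMSE) comparison of a hypothetical model
# forecast against a constant benchmark. All numbers are illustrative.

import math

def rmse(forecasts, outcomes):
    """RMSE between paired forecasts and realized outcomes."""
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecasts, outcomes))
                     / len(outcomes))

outcomes       = [0.6, 0.4, 0.7, 0.5, 0.3]   # realized GDP growth, % (made up)
model_forecast = [0.5, 0.5, 0.6, 0.6, 0.4]   # hypothetical DSGE forecasts
naive_forecast = [0.5] * 5                   # constant benchmark

print("model RMSE:", round(rmse(model_forecast, outcomes), 3))
print("naive RMSE:", round(rmse(naive_forecast, outcomes), 3))
```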

Business Cycle Accounting

Business cycle accounting (BCA) is a quantitative framework introduced by V. V. Chari, Patrick J. Kehoe, and Ellen R. McGrattan in 2002 and formalized in their 2007 paper, designed to decompose observed macroeconomic fluctuations into contributions from potential shocks and frictions without committing to a fully specified structural model. The method starts with a real business cycle (RBC) model featuring standard neoclassical production, household optimization, and market clearing, then introduces four "wedges" as deviations from the model's equilibrium equations: an efficiency wedge capturing productivity shocks, a labor wedge reflecting distortions to labor supply or demand (such as taxes or bargaining frictions), an investment (intertemporal) wedge distorting the Euler equation for consumption-savings decisions, and a government consumption wedge absorbing residual movements in the resource constraint. These wedges are backed out from U.S. data by solving the model to match observed aggregates like output, hours, consumption, investment, and government spending, typically over postwar samples (e.g., 1954–2004) or historical episodes like the Great Depression (1929–1939). In applications to postwar U.S. data, BCA attributes about 50–70% of output variance to the efficiency wedge, with the remainder linked to labor and intertemporal wedges, suggesting that real shocks dominate but non-technology factors amplify cycles. Empirically, BCA has been extended to international data and specific crises, revealing cross-country variations in wedge contributions; for instance, in emerging economies, labor wedges often explain more of the variance than in the U.S., pointing to institutional differences in labor markets. For the 2008–2009 recession, updated BCA decompositions indicate that labor and intertemporal wedges—potentially proxying financial frictions or uncertainty shocks—accounted for up to 40% of the output drop, beyond standard productivity measures, supporting the incorporation of such mechanisms into DSGE models.
Proponents argue BCA's value lies in its discipline: it identifies which frictions must be modeled to match data, aiding model selection; for example, reproducing labor wedge movements often requires sticky wages or search frictions rather than pure RBC assumptions. However, the method's performance hinges on auxiliary assumptions like linear detrending or steady-state calibration, where deviations (e.g., using Hodrick-Prescott filters) can shift wedge contributions dramatically, sometimes inverting conclusions about shock dominance. Critics, notably Lawrence J. Christiano and Joshua Davis, highlight two core flaws undermining BCA's reliability for DSGE validation: first, the intertemporal wedge's shocks propagate through channels (like altering interest rates or investment demand) that mimic nominal rigidities or financial constraints, leading to misattribution; second, the method's sensitivity to measurement choices—such as depreciation rates or data filtering—can overturn findings, with small tweaks reallocating 20–50% of explained variance from one wedge to another. Chari et al. counter that proper implementation, including robustness checks across specifications, confirms productivity's centrality, and reinterpretations of wedges as aggregate shocks align with RBC parsimony over added frictions. Overall, while BCA has illuminated the limits of pure technology-driven DSGE models—showing wedges explain 80–90% of fluctuations when flexibly estimated—it has not resolved debates on causal primacy, as wedge identification remains non-unique without deeper structural restrictions, prompting complementary approaches in modern macro analysis.
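
As a concrete illustration of how a wedge is backed out from data, the sketch below computes a labor wedge under assumed functional forms (log utility in consumption and leisure, Cobb-Douglas production); the parameter values and data points are hypothetical, not from the BCA papers.

```python
# Minimal labor-wedge calculation in the BCA spirit: the wedge is the gap
# between the household's marginal rate of substitution (MRS) and the
# marginal product of labor (MPL). Functional forms and parameters are
# assumptions: U = log(c) + psi*log(1-h), Y = K^theta * H^(1-theta).

def labor_wedge(c, h, y, theta=0.35, psi=2.0):
    """Return tau such that (1 - tau) * MPL = MRS.
    c: consumption, h: hours (fraction of time endowment), y: output."""
    mrs = psi * c / (1.0 - h)      # -U_h / U_c for log-log preferences
    mpl = (1.0 - theta) * y / h    # Cobb-Douglas marginal product of labor
    return 1.0 - mrs / mpl

# Illustrative numbers: hours fall more than consumption and output,
# so the implied wedge rises, as in recession decompositions.
print(round(labor_wedge(c=0.7, h=0.30, y=1.00), 3))
print(round(labor_wedge(c=0.7, h=0.25, y=0.95), 3))
```

In actual BCA the wedges are recovered jointly from the model's full set of equilibrium conditions; this static condition only shows the labor margin.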

Econometric Validation and Tests

DSGE models undergo econometric validation through a combination of estimation techniques and diagnostic tests that assess parameter identifiability, model fit, and specification adequacy. Bayesian estimation dominates due to its ability to incorporate prior information and handle uncertainty, deriving the posterior distribution \pi(\theta | y_T) via Bayes' theorem and sampling it with Markov chain Monte Carlo (MCMC) methods like the Metropolis-Hastings algorithm, often using random walk proposals to achieve acceptance rates of 0.23–0.30 after burn-in periods of around 10,000 draws. Priors are selected based on economic theory, such as beta distributions for Calvo price stickiness parameters (e.g., \zeta_p \sim \mathrm{Beta}(0.6, 0.2)) or inverse gamma distributions for shock standard deviations. The likelihood, evaluated using the Kalman filter for linear Gaussian approximations or particle filters for nonlinear cases, relies on state-space representations to filter observables like output growth and inflation from U.S. quarterly data spanning 1959–2007. Validation emphasizes posterior predictive checks, which generate discrepancy statistics—such as cross-correlations or variance decompositions—from data simulated under the posterior and compare their distributions to those from observed data, often yielding p-values to flag improbable features. For instance, in the Smets-Wouters New Keynesian DSGE model estimated on euro area data, an observed cross-correlation of around 0.5 between key observables falls in the tails of the predictive distribution (p < 1%), indicating misspecification. These checks account for parameter and sampling uncertainty, outperforming point-estimate comparisons by pinpointing policy-irrelevant versus relevant failures, such as over-reliance on demand shocks during recessions where model-implied variances exceed observed levels by 20–60%. Classical approaches complement Bayesian methods, including maximum likelihood (ML) estimation via Kalman filtering, augmented with measurement errors to incorporate multiple observables and avoid stochastic singularity issues, tested for residual autocorrelation and root mean squared errors.
Moment-based techniques like the generalized method of moments (GMM) and simulated method of moments (SMM) minimize distances between empirical moments (e.g., variances, covariances from bivariate data) and model-implied ones, validated through overidentification tests such as Hansen's J-statistic, which follows a chi-squared distribution under correct specification. Indirect inference extends this by matching auxiliary parameters, like those from a first-order vector autoregression (VAR), between data and simulations, also employing chi-squared overidentification tests; efficiency improves with longer simulated series (e.g., 20 times the sample length). Impulse response matching estimators align model responses to structural VAR shocks, providing identification-robust validation for propagation mechanisms. Model comparison and specification testing further scrutinize DSGE frameworks, using Bayes factors from marginal likelihoods (computed via modified harmonic mean or importance sampling) to weigh evidence, as in cases where a restricted model yields a log marginal likelihood of -38.69 versus -39.49 for an unrestricted one, weakly favoring the former. Likelihood ratio tests assess nested models, while targeted specification tests decompose failures into equilibrium restrictions, stochastic processes, or solution errors using bootstrap or simulation-based p-values. Empirically, these tests often highlight persistent challenges, such as inadequate replication of asset price moments or business cycle asymmetries, underscoring the need for auxiliary assumptions like measurement errors despite theoretical microfoundations.
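
The random-walk Metropolis-Hastings scheme described above can be sketched on a toy one-dimensional problem. The standard normal log-posterior here is a stand-in for an actual DSGE log-likelihood plus log-prior, and the proposal step size is an assumption chosen so the acceptance rate lands near the cited 0.23–0.30 range.

```python
# Random-walk Metropolis-Hastings sketch. The target below is a toy
# log-posterior; in DSGE estimation, log_post would call a Kalman filter
# to evaluate the likelihood and add the log-prior.

import math
import random

def log_post(theta):
    return -0.5 * theta ** 2  # toy stand-in: standard normal log-density (unnormalized)

def rw_metropolis(n_draws=20000, step=5.0, burn_in=5000, seed=0):
    rng = random.Random(seed)
    theta, lp = 0.0, log_post(0.0)
    draws, accepted = [], 0
    for _ in range(n_draws):
        prop = theta + rng.gauss(0.0, step)        # random-walk proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept w.p. min(1, ratio)
            theta, lp = prop, lp_prop
            accepted += 1
        draws.append(theta)
    return draws[burn_in:], accepted / n_draws     # discard burn-in

draws, acc_rate = rw_metropolis()
print(f"acceptance rate: {acc_rate:.2f}")
print(f"posterior mean estimate: {sum(draws) / len(draws):+.2f}")
```

Larger proposal steps lower the acceptance rate; tuning the step to hit a target acceptance band is the practical analogue of the 0.23–0.30 rule of thumb mentioned above.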

Criticisms and Debates

Methodological and Assumption-Based Critiques

Critics contend that the rational expectations assumption central to DSGE models, whereby agents form expectations consistent with the model's probabilistic structure, fails to capture real-world uncertainty and diverse beliefs, as evidenced by surveys of economic forecasters showing systematic disagreements rather than convergence to model-consistent predictions. This assumption, entrenched in macroeconomics following Lucas (1976), is seen as a mathematical convenience that precludes explanations of asset bubbles, such as the pre-2008 U.S. housing boom, where irrational exuberance and heterogeneous expectations played key roles. Empirical tests, including those replacing rational expectations with adaptive learning, often yield better out-of-sample fits, suggesting the hypothesis lacks robust support. The representative agent framework, which aggregates household and firm behavior into a single optimizing entity, overlooks distributional effects and heterogeneity, rendering models unable to address phenomena like debt-deflation dynamics where borrower-lender asymmetries amplify shocks. Under this setup, individual debts net to zero at the aggregate level, ignoring how credit constraints on subsets of agents—such as low-wealth households—propagate and alter macroeconomic outcomes, as demonstrated in studies of U.S. data from 1960–2010. The Sonnenschein-Mantel-Debreu theorem further undermines this framework by proving that, absent strong restrictions, aggregate excess demand functions can exhibit essentially arbitrary behavior, severing the link between individual rationality and reliable macro predictions. Assumptions of continuous market clearing and rapid equilibrium restoration conflict with observed persistent deviations, such as elevated unemployment rates post-2008 that lingered for years without self-correction, challenging the model's depiction of economies as always near steady-state paths. Methodologically, DSGE's "equilibrium discipline"—requiring all variables to satisfy optimality conditions—imposes undue restrictions that exclude non-equilibrium dynamics like debt deflation or financial accelerators, mechanisms the crisis literature links to amplification.
Calibration, often preferred over full structural estimation due to likelihood-fit failures, relies on ad hoc matching of moments to data, yielding parameters sensitive to arbitrary choices rather than falsifiable tests, as critiqued in analyses of New Keynesian variants. The insistence on deriving macro relations solely from intertemporal optimization, ostensibly to evade the Lucas critique, paradoxically enforces a narrow reductionism that dismisses emergent behaviors not traceable to individual optimization, such as herd effects or institutional feedbacks. This dogma, prevalent since the 1980s, prioritizes theoretical consistency over empirical adequacy, with models explaining at most 20% of variance in key macro series like output growth. Critics argue this approach embodies a form of groupthink in academic macroeconomics, where assumption violations are patched via add-on frictions rather than reconsidered fundamentally.

Empirical Failures and Post-Crisis Assessments

Dynamic stochastic general equilibrium (DSGE) models exhibited significant empirical shortcomings during the 2008 global financial crisis, primarily due to their neglect of key financial sector dynamics. Pre-crisis DSGE frameworks failed to anticipate the crisis because they did not adequately account for the rapid expansion of the shadow banking system and associated leverage buildup, which rendered the U.S. economy vulnerable to systemic shocks. These models assumed frictionless credit markets and representative agents with rational expectations, precluding phenomena such as bank runs, asymmetric information, and endogenous debt accumulation that amplified the collapse into a broader downturn. Consequently, DSGE models could not generate predictions of the housing bubble's endogenous origins or the severe balance-sheet deteriorations that propagated deleveraging and fire sales. In terms of shock amplification and persistence, standard DSGE models underestimated the role of financial accelerators, treating recessions as primarily driven by exogenous technology or demand shocks rather than credit constraints and leverage cycles. Empirical evidence from the period highlighted this gap, as postwar data prior to 2008 showed limited ties between financial disruptions and output drops, leading modelers to assign modest weights to financial frictions in baseline specifications like Bernanke, Gertler, and Gilchrist (1999). The models' linear approximations further exacerbated misspecification, rendering them ill-suited for capturing nonlinear tail events or regime shifts, such as the zero lower bound on interest rates, which prolonged the downturn beyond what equilibrium assumptions predicted. Forecasting performance assessments post-crisis revealed additional weaknesses. Estimated DSGE models, such as Smets-Wouters (2007), produced root mean squared errors (RMSEs) for output growth and inflation that were competitive with Bayesian vector autoregressions (BVARs) during the Great Moderation (1992–2004) but deteriorated sharply during the crisis, lagging professional surveys like Blue Chip due to unmodeled regime changes in monetary policy and financial conditions.
Overall absolute accuracy remained poor across methods, aligning with the models' theoretical emphasis on unforecastable transitory shocks under rational expectations, yet this underscored their limited utility for anticipating structural breaks like the 2008 leverage unwind. Post-crisis evaluations, including those from the Journal of Economic Perspectives, confirmed that DSGE models' core reliance on equilibrium assumptions and exogenous shocks contributed to their empirical fragility in crisis regimes, where endogenous financial instabilities dominated. Critics noted that while extensions incorporating financial frictions emerged after 2008, pre-crisis benchmarks systematically underperformed in matching data variances, covariances, and rare events, prompting broader scrutiny of their data-generating process assumptions. These assessments highlighted a disconnect between DSGE's microfounded focus and real-world disequilibria, with evidence against key components like uncovered interest parity further eroding confidence in their empirical robustness.

Heterodox and Alternative Viewpoints

Heterodox economists, particularly from Post-Keynesian, Austrian, and institutionalist traditions, contend that DSGE models embody flawed neoclassical foundations that abstract away from essential features of real economies, such as fundamental uncertainty, financial fragility, and historical contingency. These critiques emphasize that DSGE's reliance on rational expectations, representative agents, and continuous market clearing imposes an ideological commitment to equilibrium outcomes that precludes analysis of persistent disequilibria or endogenous crises. For instance, Post-Keynesian scholars argue that DSGE frameworks travesty Keynesian insights by assuming agents possess probabilistic knowledge of the future, thereby neglecting true Knightian uncertainty where outcomes are non-ergodic and unknowable in advance. Steve Keen has specifically highlighted DSGE's inability to incorporate private debt dynamics, a causal factor in the 2008 crisis, because the models' equilibrium conditions prohibit aggregate inconsistencies like rising leverage leading to systemic instability. Keen demonstrates through simulations that standard DSGE consumption equations, such as those in Smets-Wouters models, fail to replicate the debt-deflation spirals observed in historical downturns, as they enforce Walrasian clearing that neutralizes balance-sheet effects. Post-Keynesians advocate alternatives like stock-flow consistent (SFC) models, which track sectoral balance sheets and demonstrate how monetary endogeneity and effective demand drive fluctuations without presuming optimizing equilibrium. Austrian economists criticize DSGE for its mathematical formalism, which they view as incompatible with praxeological analysis of purposeful human action in a complex, non-equilibrium order. DSGE's exogenous shocks and general equilibrium solutions overlook malinvestment patterns arising from artificial credit expansion, central to Austrian business cycle theory, rendering the models ill-suited for causal explanations of booms and busts.
Instead, Austrians favor qualitative causal narratives over quantitative simulations, arguing that DSGE's aggregation assumes away entrepreneurial discovery and heterogeneous capital structures that amplify intertemporal distortions. Broader heterodox assessments, including from evolutionary and institutionalist perspectives, fault DSGE for path dependence and institutional lock-in, where policy prescriptions reinforce a paradigm that underperformed empirically during the 2008 crisis by omitting banking sector feedback loops. These views, often sidelined in mainstream journals due to methodological gatekeeping, prioritize agent-based models or empirical sectoral accounting that reveal DSGE's predictive shortfalls, such as underestimating crisis severity in pre-2008 calibrations.

Responses, Reforms, and Recent Advances

Defenses and Empirical Justifications

Dynamic stochastic general equilibrium (DSGE) models have been defended for their capacity to quantitatively replicate core business cycle stylized facts, providing empirical grounding for their microfounded structure. Foundational real business cycle variants, as developed by Kydland and Prescott, generate simulated moments aligning closely with U.S. data, including output volatility, the relative standard deviation of investment (approximately twice that of output), consumption smoothing (with lower volatility than output), high persistence in GDP fluctuations (autocorrelation near 0.9), and procyclical comovements in consumption and hours. These matches, achieved through calibration to external estimates of technology shock processes, justified the shift toward equilibrium models driven by supply-side disturbances rather than demand deficiencies. New Keynesian extensions incorporate nominal frictions and estimated parameters via Bayesian methods, enhancing fit to a broader set of observables. The Smets-Wouters medium-scale model, estimated on U.S. quarterly series (output, consumption, and investment growth, wages, interest rates, inflation, and hours) from 1964 to 2004, yields high marginal data densities, indicating superior explanatory power relative to restricted benchmarks lacking key frictions like price/wage stickiness, habit persistence, and variable capital utilization; posterior odds ratios confirm these features' empirical relevance. Similarly, Christiano, Eichenbaum, and Evans' framework reproduces hump-shaped output responses and delayed inflation peaks following identified monetary shocks, matching structural vector autoregression (SVAR) evidence from 1959 to 2003. Proponents emphasize DSGE's structural invariance to policy regimes, enabling counterfactual simulations immune to the Lucas critique, unlike atheoretical reduced-form approaches. Variance decompositions further validate the paradigm: extensions attribute roughly 60% of U.S.
fluctuations (1955–2007) to shocks affecting intertemporal investment margins, with technology and monetary disturbances explaining additional shares, consistent with long-run restrictions and micro-level evidence on firm-level behavior. Forecasting evaluations bolster these claims; the New York Fed's DSGE model delivered root-mean-square errors for output growth and inflation below professional-forecaster medians during 2009–2015, outperforming alternatives in low-interest-rate environments through forward-looking dynamics. Central bank implementations, such as those at the ECB and the Federal Reserve, rely on DSGE for scenario analysis precisely due to this blend of theoretical discipline and data-congruent predictions in non-crisis states. While statistical fit metrics like marginal likelihood sometimes favor unrestricted VARs, defenders prioritize DSGE's interpretive power and avoidance of overfitting, as hybrid DSGE-VARs with moderate theory weights (e.g., λ=0.5) balance the two.

Incorporation of Financial Frictions and Heterogeneity

Following the 2008 financial crisis, DSGE models were extended to include financial frictions, such as balance-sheet constraints and external finance premia, to account for the amplification of shocks through leverage and credit channels. These extensions, often building on the financial accelerator framework where firms' borrowing costs rise with deteriorating net worth, improved the models' ability to replicate the depth and persistence of recessions observed in the data. For instance, estimated DSGE models with financial frictions attributed a significant portion of the U.S. output decline during the crisis to disruptions in financial intermediation, with leverage dynamics explaining up to 40% of investment drops in some calibrations. Heterogeneity in agent characteristics, particularly wealth and income distributions, has been incorporated into DSGE frameworks via Heterogeneous Agent New Keynesian (HANK) models, which relax the representative-agent assumption to capture uninsurable idiosyncratic risks and varying marginal propensities to consume. Unlike standard representative-agent (RANK) models, HANK variants reveal that monetary policy transmission operates more through indirect effects—such as changes in labor income and asset prices affecting hand-to-mouth households—than direct interest-rate channels, altering policy multipliers by factors of 2-3 in response to aggregate demand shocks. These models, estimated using micro-level panel data, demonstrate superior fit for distributional outcomes, with heterogeneity amplifying fiscal multipliers during liquidity traps. Recent integrations combine financial frictions with agent heterogeneity, enabling analysis of how borrowing constraints interact with wealth distributions to propagate crises and influence policy design. For example, state-dependent frictions in HANK-DSGE setups, estimated via Bayesian methods, show that post-2020 low-interest environments exacerbate inequality through uneven access to credit, with highly indebted households facing amplified shocks.
Such models, applied to episodes like the COVID-19 pandemic, highlight the need for targeted fiscal interventions to mitigate heterogeneous impacts, outperforming homogeneous-friction benchmarks in forecasting recovery paths.
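
The RANK-versus-HANK transmission difference can be illustrated at its simplest with a two-type aggregation; the MPC values below are assumptions chosen only to show how the hand-to-mouth share moves the aggregate response to a transfer, not estimates from any HANK model.

```python
# Two-agent sketch of the HANK intuition: the aggregate marginal propensity
# to consume (MPC) out of a lump-sum transfer rises with the share of
# hand-to-mouth households, relative to a representative-agent benchmark
# in which everyone smooths consumption. MPC values are illustrative.

def aggregate_mpc(htm_share, mpc_htm=0.9, mpc_ricardian=0.05):
    """First-period consumption response to a one-unit transfer."""
    return htm_share * mpc_htm + (1.0 - htm_share) * mpc_ricardian

for share in (0.0, 0.3, 0.5):
    print(f"hand-to-mouth share {share:.0%}: aggregate MPC = "
          f"{aggregate_mpc(share):.3f}")
```

A zero hand-to-mouth share recovers the near-zero representative-agent response, while realistic shares raise the aggregate MPC substantially, which is the mechanism behind the larger fiscal multipliers discussed above.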

Developments Beyond Standard Assumptions (2020-2025)

Recent research has increasingly focused on relaxing the full-information rational expectations (FIRE) assumption in DSGE models, incorporating adaptive learning, bounded rationality, and heterogeneous expectations to better capture empirical dynamics. A 2025 survey by Levine et al. documents these extensions, emphasizing statistical learning mechanisms like recursive least-squares and constant-gain updating, which allow agents to update beliefs based on observed data rather than perfect foresight. Such models demonstrate improved within-sample fit for New Keynesian frameworks, particularly in explaining persistent deviations from equilibrium observed in post-2008 and COVID-era data. Adaptive learning variants, where agents form expectations via constant-gain or decreasing-gain algorithms, have shown superior performance in Bayesian estimation of medium-scale DSGE models. For instance, Warne (2023) estimates a Smets-Wouters-style model for the euro area (2001Q1–2019Q4) and finds adaptive learning yields a log marginal likelihood gain of approximately 22 units over rational expectations within-sample, attributing this to greater persistence in beliefs that aligns with observed inflation and output gaps. However, out-of-sample forecasts reveal trade-offs, with rational expectations edging out adaptive learning in short-term GDP growth predictions while adaptive learning excels at longer-horizon inflation forecasts. These findings underscore adaptive learning's role in addressing over-optimism in standard RE models during volatile periods. Heterogeneous expectations further extend these frameworks by allowing subpopulations of agents—such as sophisticated rational agents alongside learners using simple heuristics or k-level thinking—to coexist, leading to emergent equilibria with amplified volatility. Hommes et al. (2023) develop a New Keynesian DSGE with such heterogeneity, showing that belief diversity generates sunspot-driven fluctuations consistent with survey data on forecasters. Similarly, extensions to imperfect information models with heterogeneous agents, as in Levine et al. (2023), incorporate Kalman filtering for signal extraction, improving empirical matching to U.S.
data from 1984–2008 and suggesting robust monetary policies must account for informational frictions. To handle rare events like the COVID-19 pandemic, non-linear DSGE extensions introduce "unusual shocks" that load onto multiple wedges (e.g., labor, investment) with time-varying persistence informed by professional forecasts. Barnichon et al. (2024) apply this to a standard medium-scale model, estimating that unanticipated components of the COVID shock explained most of the 2020Q2 output drop and subsequent inflationary pressures, enabling seamless integration of pre- and post-pandemic data without regime switches. This approach preserves the structural core while accommodating non-Gaussian dynamics, with the shock acting as a persistent drag on activity through 2021. Empirical validation across U.S. and euro area series confirms its flexibility for policy scenario analysis.
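
The constant-gain versus decreasing-gain distinction described above can be shown in its simplest scalar form; the data series and gain value below are illustrative assumptions.

```python
# Constant-gain vs. decreasing-gain belief updating. With constant gain g,
# agents discount old observations and track structural change; with gain
# 1/t, the recursion reproduces the sample mean and adapts more slowly.

def update_beliefs(data, gain=0.05):
    """Run both recursions over the same data; return final beliefs."""
    const_belief, rec_belief = 0.0, 0.0
    for t, x in enumerate(data, start=1):
        const_belief += gain * (x - const_belief)   # constant gain
        rec_belief += (1.0 / t) * (x - rec_belief)  # decreasing gain = mean
    return const_belief, rec_belief

# Inflation steps up halfway through the sample; the constant-gain
# belief moves close to the new level while the decreasing-gain belief
# lingers near the full-sample average.
data = [2.0] * 50 + [4.0] * 50
cg, dg = update_beliefs(data)
print(f"constant gain: {cg:.2f}  decreasing gain: {dg:.2f}")
```

This faster tracking of regime change is the source of the belief persistence and volatility effects that the adaptive-learning estimation results above attribute to constant-gain learners.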

References

  1. [1]
    [PDF] On DSGE Models - Northwestern University
    “Wholesale Banking and Bank. Runs in Macroeconomic Modeling of Financial. Crises.” Chap. ... “Shocks and Frictions in US Business Cycles: A. Baysian DSGE Approach ...
  2. [2]
    [PDF] On DSGE Models - National Bureau of Economic Research
    Dynamic Stochastic General Equilibrium (DSGE) models are the leading framework that macroeconomists have for dealing with this challenge in an open and ...
  3. [3]
    How Useful are Estimated DSGE Model Forecasts?
    DSGE models are a prominent tool for forecasting at central banks and the competitive forecasting performance of these models relative to ...
  4. [4]
    [PDF] Solution and Estimation Methods for DSGE Models
    This paper provides an overview of solution and estimation techniques for dynamic stochastic general equilibrium (DSGE) models. We cover the foundations of ...
  5. [5]
    [PDF] Where Modern Macroeconomics Went Wrong
    A major criticism of standard DSGE models provided by Hendry and Muellbauer (2018) is that they ignore these constraints and assume “cash and other liquid ...
  6. [6]
    [PDF] Estimation and Evaluation of DSGE Models: Progress and Challenges
    Most DSGE models impose strict balanced growth path restrictions implying, for instance, that consumption-output, investment-output, government spending- ...<|separator|>
  7. [7]
    The Standard Economic Paradigm is Based on Bad Modeling
    Mar 8, 2021 · This is no ado about nothing—most [DSGE] models fail to coherently explain up to 80% of key macroeconomic variables.” As Paul Krugman (2016) ...
  8. [8]
    [PDF] Policy Analysis Using DSGE Models: An Introduction
    “Macroeconomic Modeling for Monetary Policy ... “Shocks and Frictions in U.S. Business Cycles: A Bayesian DSGE Approach.” American Economic Review 97, no.
  9. [9]
    [PDF] On DSGE Models - American Economic Association
    “Wholesale Banking and Bank. Runs in Macroeconomic Modeling of Financial. Crises.” Chap. ... “Shocks and Frictions in US Business Cycles: A. Baysian DSGE Approach ...
  10. [10]
    [PDF] Stata Dynamic Stochastic General Equilibrium Models Reference ...
    Apr 26, 2024 · In DSGE terminology, a model solution expresses the control variables as a function of the state vari- · ables alone and expresses the state ...
  11. [11]
    [PDF] 9. The Real Business Cycle Model and DSGE Modelling - Karl Whelan
    The Real Business Cycle (RBC) model assumes only 'real' shocks affect GDP, and monetary policy has no impact. It is the original DSGE model.
  12. [12]
    [PDF] General Introduction - Olivier Loisel
    Nov 19, 2024 · Like RBC models, NK models are DSGE models. As such, they are ... Main differences between RBC and NK models. Unlike in RBC models, in ...
  13. [13]
  14. [14]
    Time to Build and Aggregate Fluctuations - jstor
    Further, some technological change may be embodied in new capital, and only after the capital becomes productive ...
  15. [15]
    Evolution of Modern Business Cycle Models: Accounting for the ...
    Modern business cycle theory focuses on the study of dynamic stochastic general equilibrium (DSGE) models that generate aggregate fluctuations similar to those ...
  16. [16]
  17. [17]
    [PDF] An essay on the history of DSGE models - arXiv
    Feb 8, 2025 · Dynamic Stochastic General Equilibrium (DSGE) models are nowadays the standard theoretical framework for quantitative policy-making analyses.
  18. [18]
    Speculations on the stabilization and dissemination of the “DSGE ...
    Apr 3, 2017 · According to JSTOR, it was Robert King and Charles Plosser who, in their famous 1984 paper titled Real Business Cycles, used the term DSGE for ...
  19. [19]
  20. [20]
  21. [21]
    [PDF] Finn Kydland and Edward Prescott's Contribution to Dynamic ...
    Oct 11, 2004 · Kydland and Prescott's 1982 paper transformed macroeconomic analysis in several dimensions. Indeed, it provided a blueprint for rendering ...
  22. [22]
    [PDF] DSGE Models and the Lucas Critique. A Historical Appraisal.
    This interpretation is rooted in Kydland and Prescott (1977)—though this work precedes the development of RBC models (Kydland and Prescott, 1982)—where the ...
  23. [23]
    With inflation front and center, work that launched “rational ...
    Aug 29, 2022 · In 1961, economist John Muth published his academic proposal that models should feature firms forming expectations in a "rational" way—not just ...
  24. [24]
    Robert E. Lucas Jr., Nobel laureate and pioneering economist, 1937 ...
    May 16, 2023 · Though theories about rational expectations had previously existed, Lucas' pioneering work applied the theory to the economy as a whole.
  25. [25]
    [PDF] Expectational Data in DSGE Models - UC Irvine
    Expectational data, from surveys, is used to test DSGE models, identify misspecification, and as an observable variable to match estimations.
  26. [26]
    [PDF] Shocks and frictions in US business cycles: a Bayesian DSGE ...
    The model incorporates many types of real and nominal frictions and seven types of structural shocks. We show that this model is able to compete with Bayesian ...
  27. [27]
    [PDF] The Chicago Fed DSGE Model
    This last disturbance and the monetary policy shock are white noise, while the others follow mutually independent AR(1) processes. For estimation, we ...
  28. [28]
    [PDF] Lecture 3 Dynamic General equilibrium Models
    Dynamic General equilibrium Models. 1. Introduction. In macroeconomics, we ... The DGE model with uncertainty is known as Dynamic Stochastic General Equilibrium.
  29. [29]
    [PDF] DYNAMIC STOCHASTIC GENERAL-EQUILIBRIUM MODELS OF ...
    This analysis shows the main issues in moving to dynamic models of price-setting and illustrates the list of ingredients to choose from, but it does not ...
  30. [30]
    [PDF] Lecture 2 Dynamic stochastic general equilibrium (DSGE) models
    Jul 18, 2015 · DSGE models combine firm and household behavior, define equilibrium, and are quantified numerically. Firms maximize profits, households ...
  31. [31]
    [PDF] An estimated stochastic dynamic general equilibrium model of the ...
    In this paper we, first, develop a stochastic dynamic general equilibrium (SDGE) model for the euro area, which features a number of frictions that appear to ...
  32. [32]
    [PDF] DSGE Models: Practical Methodological Note and Recent Trends
    Apr 25, 2025 · Macroeconomic modeling has evolved significantly, with two main approaches emerging as pillars of economic analysis: DSGE (Dynamic ...
  33. [33]
    [PDF] DSGE Models for Monetary Policy∗ - European Central Bank
    Oct 28, 2009 · Thus, households would be induced to substitute away from a hump shape response, towards one in which the immediate response is much stronger.
  34. [34]
    [PDF] Chapter 7 - DSGE Models for Monetary Policy Analysis
    3 discusses the representative household's capital accumulation decision, and Section 4.2.4 states the representative household's optimization problem. 4.2 ...
  35. [35]
    [PDF] Optimal Monetary Policy in an Operational Medium-Sized DSGE ...
    The model economy consists of households, domestic goods firms, importing consumption and importing investment firms, exporting firms, a government, a central ...
  36. [36]
    [PDF] Documentation of the Estimated, Dynamic, Optimization-based ...
    This paper contains documentation for the large-scale estimated DSGE model of the U.S. economy currently used at the Federal Reserve Board for some forecasting.
  37. [37]
    [PDF] DSGE-Modelling: when agents are imperfectly informed
    DSGE models assume agents fully understand the model, but this paper relaxes this, using heuristics, and compares it to rational expectations.
  38. [38]
    [PDF] NBER WORKING PAPER SERIES DSGE MODELS IN A DATA ...
    We show that exploiting more information is important for accurate estimation of the model's concepts and shocks, and that it implies different conclusions ...
  39. [39]
    [PDF] The FRBNY DSGE Model - Federal Reserve Bank of New York
    Moreover, the clear specification of the stochastic shocks allows one to identify the source of economic fluctuations. The fact that DSGE modelers can readily ...
  40. [40]
    [PDF] Policy Analysis Using DSGE Models: An Introduction
    One of the fundamental features of DSGE models is the dynamic interaction between the [three interrelated] blocks—hence, the “dynamic” aspect of the DSGE ...
  41. [41]
    [PDF] technology shocks in a two-sector DSGE model
    market clearing conditions for labor and capital. Equations (A11) and (A12) are the market clearing conditions for C-sector goods and I-sector goods.
  42. [42]
    [PDF] Working Paper No. 279 - Trends and Cycles in Small Open Economies
    The partial equilibrium version of this model omits the world goods market clearing condition and adds a stochastic process to capture the evolution of the ...
  43. [43]
    [PDF] The Chicago Fed DSGE Model: Version 2
    Sep 26, 2023 · The Chicago Fed DSGE model is used for policy analysis and forecasting, describing its specification, estimation, and dynamic characteristics. ...
  44. [44]
    [PDF] New Keynesian DSGE models - Jonathan Benchimol
    What is a DSGE model? ▻ DSGE models are dynamic, stochastic, and characterize the general equilibrium of the economy. ▻ ...
  45. [45]
    [PDF] An Estimated Dynamic Stochastic General Equilibrium Model of the ...
    Abstract. This paper develops and estimates a dynamic stochastic general equilibrium (DSGE) model with sticky prices and wages for the euro area.
  46. [46]
    [PDF] Perturbation and Projection Methods for Solving DSGE Models
    Perturbation method uses Taylor series expansion (computed using implicit function theorem) to approximate model solution. – Advantage: can implement procedure ...
  47. [47]
    Perturbations in DSGE models: An odd derivatives theorem
    One popular way is with perturbation methods, which are quite fast and yet maintain reasonable accuracy (Aruoba, Fernandez-Villaverde, Rubio-Ramirez, 2006 ...
  48. [48]
    [PDF] Bayesian Estimation of DSGE Models
    Feb 2, 2012 · Abstract. We survey Bayesian methods for estimating dynamic stochastic general equilibrium (DSGE) models in this article.
  49. [49]
    [PDF] MA Advanced Macroeconomics: 10. Estimating DSGE Models
    So Kalman filter provides a way to do maximum likelihood estimation of DSGE models that mix observable and unobservable variables. You may have found the ...
  50. [50]
    [PDF] Solving and Estimating Dynamic General Equilibrium Models Using ...
    A log-linearization strategy for solving DSGE models. • Estimation. – Putting ... • Limited information methods. – Matching model impulse response functions with ...
  51. [51]
    [PDF] Formulating and Estimating DSGE Models: A Practical Guide
    Oct 11, 2021 · Applying Bayesian estimation to DSGE models requires us to define a prior p(θ) and a likelihood function p(O|θ). We also add a series of ...
  52. [52]
    [PDF] DSGE Models for Monetary Policy Analysis Lawrence J. Christiano ...
    This section analyzes versions of the standard Calvo-sticky price New Keynesian model without capital. In practice, the analysis of the standard New Keynesian ...
  53. [53]
    [PDF] BIS Working Papers - No 258 - DSGE models and central banks
    For instance, the Federal Reserve Board's SIGMA model has been used to analyse the impact of a wide variety of shocks such as those arising from monetary policy ...
  54. [54]
    Optimal Monetary Policy in a DSGE Model with Attenuated Forward ...
    Oct 19, 2018 · In this article, we explore the implications of attenuating the power of forward guidance for the optimal conduct of forward guidance policy in a quantitative ...
  55. [55]
    Nominal GDP growth targeting vs. Taylor rules in a model with ...
    Simulations with their model show that volatility of both the inflation rate and the output gap can be reduced more by a strict NGDP growth targeting rule than ...
  56. [56]
    [PDF] A Narrative Approach to a Fiscal DSGE Model
    A necessary condition for the VAR and DSGE models to agree on the shocks affecting the economy is for both models to be able to span the same economic shocks.
  57. [57]
    [PDF] Fiscal backing, inflation and US business cycles
    Jan 25, 2024 · ... business cycle are also visible in the historical decompositions ... Panel b of Table 2 contains the variance decomposition of this model.
  58. [58]
    Evaluating Historical Episodes using Shock Decompositions in the ...
    Mar 7, 2025 · Figure 2 illustrates the evolution of the shock decompositions over time using the shock decomposition formula in (6), where the shock ...
  59. [59]
    [PDF] On the sources of business cycles: implications for DSGE models
    The variance decompositions are an extremely useful tool and support our argument for model misspecification checks. As our discussion above indicates, the ...
  60. [60]
  61. [61]
    [PDF] DSGE Models Used by Policymakers: A Survey
    Oct 2, 2020 · First, we find that close to 80% of the DSGE policy models are developed by central banks, which shows that central banks are the main users of ...
  62. [62]
    [PDF] How Useful Are Estimated DSGE Model Forecasts for Central ...
    ABSTRACT Dynamic stochastic general equilibrium (DSGE) models are a prominent tool for forecasting at central banks, and the competitive.
  63. [63]
    The FRBNY DSGE Model - FEDERAL RESERVE BANK of NEW YORK
    Dynamic stochastic general equilibrium (DSGE) models have grown in importance at many of the world's central banks as tools to inform economic forecasting ...
  64. [64]
    The Chicago Fed DSGE Model: Version 2
    The Chicago Fed dynamic stochastic general equilibrium (DSGE) model is used for policy analysis and forecasting at the Federal Reserve Bank of Chicago.
  65. [65]
    [PDF] The St. Louis Fed DSGE Model - ECONOMIC RESEARCH
    This document contains a technical description of the dynamic stochastic general equilibrium (DSGE) model developed and maintained by the Research ...
  66. [66]
    The Fed - FRB/US Project
    Aug 16, 2022 · FRB/US is a large-scale estimated general equilibrium model of the US economy that was developed at the Federal Reserve Board.
  67. [67]
    [PDF] Forecasting with DSGE models - European Central Bank
    This paper reviews forecasting with DSGE models using Bayesian methods, focusing on predictive distributions and the NAWM model for the ECB.
  68. [68]
    A model-based assessment of the macroeconomic impact of the ...
    For these reasons, this assessment uses a suite of models: two structural DSGE models (NAWM II and MMR) and one large‑scale semi‑structural model (ECB-BASE).
  69. [69]
    DSGE models and their use at the ECB | SERIEs
    Feb 23, 2010 · Bayesian dynamic stochastic general equilibrium (DSGE) models combine microeconomic behavioural foundations with a full-system Bayesian likelihood estimation ...
  70. [70]
    A Medium-Scale DSGE Model for the Integrated Policy Framework
    Jan 28, 2022 · This paper jointly analyzes the optimal conduct of monetary policy, foreign exchange intervention, fiscal policy, macroprudential policy, and capital flow ...
  71. [71]
    An Estimated DSGE Model for Integrated Policy Analysis
    Jun 30, 2023 · We estimate a New Keynesian small open economy model which allows for foreign exchange (FX) market frictions and a potential role for FX interventions.
  72. [72]
    Training Program Course - International Monetary Fund (IMF)
    This course, presented by the Institute for Capacity Development, focuses on building, using, and interpreting DSGE models.
  73. [73]
    [PDF] DSGE Models and Central Banks - EconStor
    2Some central banks that have developed DSGE models are the Bank of Canada (ToTEM), Bank of England (BEQM), Central Bank of Chile (MAS), Central Reserve Bank ...
  74. [74]
    [PDF] DSGE Model-Based Forecasting - Federal Reserve Bank of New York
    The term DSGE model encompasses a broad class of macroeconomic models that spans the standard neoclassical growth model discussed in King, Plosser, and Rebelo ( ...
  75. [75]
    The Economic Model "Crisis" Has Changed - RIETI
    Dec 10, 2018 · This failure was mainly due to the fact that DSGE models at the time pretty much ignored the financial system. However, Professor Christiano ...
  76. [76]
    [PDF] DSGE Forecasts of the Lost Recovery
    The post-recession recovery was challenging, with a persistent output gap and no deflation. The NY Fed DSGE model's output growth forecasts were better than ...
  77. [77]
    [PDF] Accounting for Business Cycles
    Our business cycle accounting method is intended to shed light on promising classes of mechanisms through which primitive shocks lead to economic fluctuations. ...
  78. [78]
    American business cycles 1889–1913: An accounting approach
    Business Cycle Accounting decomposes economic fluctuations into their contributing factors. The results suggest that both the 1890s and the 1907 recessions ...
  79. [79]
    [PDF] Output falls and the international transmission of crises
    Apr 21, 2021 · Chari, V., Kehoe, P., & McGrattan, E. (2007). Business Cycle Accounting. Econometrica, 75, 781–836. ...
  80. [80]
    Business cycle accounting: What have we learned so far? - Brinca
    Aug 16, 2023 · This paper contributes with a software—a graphical user interface that allows practitioners to perform BCA exercises with minimal effort—and ...
  81. [81]
    Two Flaws In Business Cycle Accounting | NBER
    Oct 31, 2006 · First, small changes in the implementation of BCA overturn CKM's conclusions. Second, one way that shocks to the intertemporal wedge impact on ...
  82. [82]
    [PDF] BUSINESS CYCLE ACCOUNTING
    Business cycle accounting uses time-varying wedges (efficiency, labor, investment, government consumption) to model economic fluctuations, measured and fed ...
  83. [83]
    [PDF] The Econometrics of DSGE Models Jesús Fernández-Villaverde ...
    This paper reviews the formulation and estimation of DSGE models, especially using Bayesian methods, and their role as a cornerstone of modern macroeconomics.
  84. [84]
    [PDF] Posterior Predictive Analysis for Evaluating DSGE Models
    Explicitly Bayesian inference methods are now the norm in DSGE modelling. The methods used are, at a general level, a straightforward application of what we ...
  85. [85]
    [PDF] Methods to Estimate Dynamic Stochastic General Equilibrium Models
    Oct 1, 2002 · The procedures are: 1) Maximum Likelihood (with and without measurement errors and incorporating priors), 2) Generalized Method of Moments, 3) ...
  86. [86]
    [PDF] Impulse Response Matching Estimators for DSGE Models
    Sep 2, 2015 · Abstract. One of the leading methods of estimating the structural parameters of DSGE models is the VAR-based impulse response matching ...
  87. [87]
    [PDF] Targeted Testing of Dynamic Stochastic General Equilibrium Models
    Oct 6, 2023 · We develop targeted specification tests for Dynamic Stochastic General Equilibrium (DSGE) models, which can separately examine a model's ...
  88. [88]
    Where modern macroeconomics went wrong | Oxford
    Jan 5, 2018 · The first was taken by real business cycle (RBC) theory and its descendant, DSGE, which attempted to reformulate macroeconomics by taking ...
  89. [89]
    [PDF] DSGE model forecasting: rational expectations vs. adaptive learning
    Adaptive learning has better within-sample fit, while rational expectations predicts real GDP growth and inflation better, especially for short-term forecasts. ...
  90. [90]
  91. [91]
    Cordon of Conformity: Why DSGE Models Are Not the Future of ...
    Aug 12, 2021 · Empirical evidence shows that savings rates are higher for the higher-income classes (Taylor 2020). As a result, by raising savings ...
  92. [92]
    A Methodological Critique of New Keynesian and DSGE Models
    Jul 17, 2020 · This paper is to offer a critical review of the so-called New Consensus Macroeconomics related to the understanding of the theoretical underpinning in The ...
  93. [93]
    [PDF] The financial crisis and DSGE models. A critical evaluation
    [2009] argue that the failure of economists to anticipate and model the financial crisis has ... financial frictions, the current benchmark DSGE model fails ...
  94. [94]
    [PDF] How much progress has the mainstream made? Evaluating modern ...
    In contrast, other Post-Keynesians have been rather critical towards DSGE models. King (2012, p.3) for example calls DSGE models “in fact a travesty of Keynes, ...
  95. [95]
    I'm not Discreet, and Neither is Time - Steve Keen | Substack
    Apr 2, 2024 · For example, the Smets and Wouters DSGE model discussed in Chapter 1 has the following equation to represent consumption (Smets and Wouters 2007 ...
  96. [96]
    Post Keynesian Dynamic Stochastic General Equilibrium Theory
    Jan 26, 2017 · This paper explains the common elements between these seemingly disparate traditions. I make the case for unity between Post-Keynesian and ...
  97. [97]
    [PDF] The Austrian School and Mathematics: Reconsidering Methods in ...
    In mainstream economics, dynamic stochastic general equilibrium. (DSGE) models are the standard forecasting and policy analysis tool. These models apply general ...
  98. [98]
    What's right and wrong with Austrian macro? - Eli Dourado
    Aug 23, 2011 · This critique of DSGE-style macro is part of the core of Austrian theory. Furthermore, in the Austrian view, capital is heterogeneous and multi ...
  99. [99]
    DSGE models — a macroeconomic dead end
    Dec 11, 2022 · First of all, DSGE models have substantial difficulties in taking account of many important mechanisms that actually govern real economies, for ...
  100. [100]
    Rethinking Macroeconomic Theory Before the Next Crisis
    Sep 23, 2016 · The standard DSGE model, designed by New Keynesian economists, has illustrated 'how Panglossian even New Keynesian economics had become'. Figure ...
  101. [101]
    [PDF] Real Business Cycle Models: Past, Present, and Future*
    Kydland and Prescott (1982) find that simulated data from their model show the same patterns of volatility, persistence, and comovement as are present in U.S. ...
  102. [102]
    Shocks and Frictions in US Business Cycles: A Bayesian DSGE ...
    Using a Bayesian likelihood approach, we estimate a dynamic stochastic general equilibrium model for the US economy using seven macroeconomic time series.
  103. [103]
  104. [104]
    In defence of central bank DSGE modelling
    Mar 14, 2018 · For example, among the most prominent forerunners in central banks' modelling was a DSGE model featuring a role for financial intermediation and ...
  105. [105]
  106. [106]
    The role of financial frictions during the crisis: An estimated DSGE ...
    This paper provides a quantitative assessment of the impact of financial frictions on the US business cycle.
  107. [107]
    The empirical performance of the financial accelerator since 2008
    The reason is that in the estimated model with financial frictions, the drastic post-2008 collapse of investment causes firms' leverage to decline. Taking the ...
  108. [108]
    Understanding Heterogeneous Agent New Keynesian Models
    Feb 24, 2020 · Researchers have developed Heterogeneous Agent New Keynesian (HANK) models that incorporate heterogeneity and uninsurable idiosyncratic risk ...
  109. [109]
    [PDF] Fiscal and Monetary Policy with Heterogeneous Agents
    These Heterogeneous-Agent New Keynesian (“HANK”) models feature new transmission channels and allow for the joint study of aggregate and distributional effects.
  110. [110]
    Estimating linearized heterogeneous agent models using panel data
    With the rise of “HANK” (heterogeneous agent New Keynesian) models (Kaplan et al., 2018), heterogeneous agent models are now competing with standard DSGE models ...
  111. [111]
    [PDF] Risk and State-Dependent Financial Frictions - Bank of Canada
    Aug 9, 2022 · We also contribute to the literature that has analyzed developments in DSGE models before and after the 2008 financial crisis. Christiano et al.
  112. [112]
    [PDF] Estimating Nonlinear Heterogeneous Agent Models with Neural ...
    May 19, 2024 · Using simulated data, we show that the proposed method accurately estimates the parameters of a nonlinear Heterogeneous Agent New Keynesian ( ...
  113. [113]
    Recent Developments in DSGE Modelling: Beyond FIRE
    Jun 30, 2025 · The fourth section moves on to models with heterogeneous agents consisting of both RE and non-RE agents and examines a class of equilibria when ...
  114. [114]
    [PDF] Recent Developments in DSGE Modelling: Beyond FIRE
    The fourth section moves on to models with heterogeneous agents consisting of both RE and non-RE agents and examines a class of equilibria when the latter can ...
  115. [115]
    Unusual shocks in our usual models - ScienceDirect
    We propose a methodology to incorporate unusual shocks into our usual models and use it to study the COVID recession and recovery.