Modeling and simulation

Modeling and simulation is a computational discipline that involves developing abstract representations, known as models, of real-world systems or processes to predict and analyze their behavior under various conditions through simulated experiments. These models can be mathematical, logical, or physical abstractions that mimic the dynamics of complex entities, such as engineering systems, biological processes, or economic scenarios, allowing researchers to test hypotheses, optimize designs, and evaluate outcomes without the risks or expenses of real-world trials. At its core, simulation refers to the execution of these models with specific inputs to generate outputs that reflect potential system responses, often using computer software to handle elements like randomness and time advancement. The process of modeling and simulation typically encompasses several key stages: problem formulation to define objectives and scope, model construction using equations, algorithms, or data-driven approaches, validation to ensure accuracy against real-world data, experimentation to explore scenarios, and interpretation of results for decision-making. Models vary in type, including deterministic ones that produce fixed outcomes from given inputs and stochastic models incorporating probability to account for variability; they can also be static (time-independent) or dynamic (evolving over time), and either continuous or discrete-event based on how changes occur. Validation is critical, involving statistical tests and sensitivity analyses to confirm that the model faithfully represents the system's essential features from the intended perspective. In practice, modeling and simulation finds widespread application across disciplines, particularly in engineering and defense, where it supports the design and evaluation of complex systems like aircraft, manufacturing lines, and materials under extreme conditions. For instance, in aerospace, high-fidelity simulations enable the verification of spacecraft and mission performance, training in virtual environments, and subsystem analyses such as aerodynamics or structural loads, ensuring safety and efficiency in development. In broader scientific contexts, it facilitates optimization in fields like operations research for supply-chain management and environmental modeling for climate predictions, often integrating advanced computing techniques to handle large-scale data and interactions. This interdisciplinary tool has evolved with computational advancements, becoming indispensable for innovation in an era of increasingly intricate technologies. As of 2025, this evolution includes the convergence of modeling and simulation with artificial intelligence and machine learning, accelerating innovation through faster exploration and improved predictive accuracy.

Fundamentals

Definitions of modeling and simulation

Modeling refers to the process of creating abstract representations of systems, processes, or phenomena to facilitate understanding, prediction, or control of their behavior. These representations can take various forms, such as physical replicas, mathematical equations, or logical diagrams, simplifying complex realities while capturing essential features. Simulation, in contrast, involves the execution or manipulation of a model over time to observe and analyze the system's dynamic behavior under specified conditions, often using computational tools to mimic real-world scenarios. This process allows for the exploration of "what-if" situations without direct interference with the actual system, enabling safer and more efficient experimentation. Key characteristics of models include varying levels of abstraction, from high-level conceptual descriptions of entities and relationships to detailed mathematical formulations and computational implementations that incorporate specific assumptions about the system's structure and behavior. Fidelity refers to the degree of accuracy and realism in the representation, ranging from low-fidelity models that prioritize simplicity and speed for broad insights to high-fidelity ones that incorporate intricate details for precise replication. Models often rely on assumptions that simplify reality, such as treating systems as black-box (input-output relations without internal details) or white-box (explicit internal mechanisms). Models serve multiple purposes: descriptive (to explain current states), predictive (to forecast future outcomes), and prescriptive (to recommend optimal actions). The term "model" derives from the Latin modulus, meaning "a small measure" or standard unit, evolving through Old Italian modello and Middle French modelle to denote a pattern or scaled representation by the late 16th century. Similarly, "simulation" originates from the Latin simulare, meaning "to imitate" or "make like," with roots in similis ("like" or "similar"), entering English in the 14th century via Old French to signify imitation or feigned resemblance. A classic example of modeling is the construction of a scale model of an airplane, which abstracts the aircraft's geometry and aerodynamics for analysis. Simulation then applies this model in a wind tunnel test, where airflow is directed over it to evaluate performance under varying conditions, revealing behavioral insights without flying the full-scale aircraft.

Relationship between modeling and simulation

Modeling serves as the foundational static representation of a real-world system, capturing its essential entities, processes, and interactions through abstractions such as mathematical equations or logical structures, which in turn enable simulation as the dynamic execution of that representation to mimic system behavior over time. Without a well-defined model, simulation lacks the structured framework needed to generate meaningful predictions, as the model provides the rules and parameters that govern how inputs translate into outputs during simulated scenarios. The relationship between modeling and simulation is inherently iterative, forming a cycle that begins with model creation based on system knowledge, followed by simulation runs to test hypotheses, analysis of results to identify discrepancies, and subsequent model refinement to improve alignment with observed reality. This process repeats across phases of system development, allowing for progressive enhancement of both the model's representational accuracy and the simulation's predictive reliability, often involving validation steps to ensure the model's assumptions hold under varied conditions. While modeling emphasizes the accuracy of the system's abstract representation—focusing on capturing structural and functional fidelity—simulation extends this by prioritizing behavioral prediction and performance evaluation under specific input scenarios and temporal dynamics, highlighting their complementary scopes. Key linking concepts include model credibility, established through verification (ensuring the model is built correctly) and validation (confirming it reflects the real system), and simulation fidelity, which measures the degree to which the simulation replicates real-world details and responses. These factors bridge the gap, as higher credibility in the model directly enhances the fidelity of simulation outcomes, enabling trustworthy inferences about system performance. A conceptual representation of this linkage appears in the basic equation for simulation output: y(t) = f(x(t), \theta) where y(t) denotes the system's output at time t, x(t) the input at time t, and \theta the model parameters defining the function f, illustrating how the static model (f and \theta) drives dynamic simulation results.
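This separation can be made concrete with a minimal Python sketch (the model f and parameters \theta here are illustrative, not drawn from any specific system): the model supplies the static pair (f, \theta), and the simulation loop executes it dynamically over time:
def f(x, theta):
    # Illustrative model: first-order dynamics dx/dt = -a*x + b.
    a, b = theta
    return -a * x + b

def simulate(f, theta, x0, dt=0.1, steps=50):
    # Simulation: repeated execution of the static model (f, theta) over time.
    x, trajectory = x0, [x0]
    for _ in range(steps):
        x = x + dt * f(x, theta)   # advance the state using the model
        trajectory.append(x)
    return trajectory

print(simulate(f, theta=(0.5, 1.0), x0=0.0)[-1])  # tends toward b/a = 2.0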

History

Early developments

The earliest precursors to modern modeling and simulation emerged in ancient civilizations, where physical representations and rudimentary computational aids were used to predict astronomical phenomena. In ancient Mesopotamia during the Old Babylonian period (c. 2000–1600 BCE), astronomers employed clay tablets inscribed with mathematical models and systematic observations to forecast celestial movements, enabling predictions of planetary positions and eclipses; systematic ephemerides developed later in the first millennium BCE. These artifacts represent one of the first documented uses of scalable models to abstract and forecast complex natural systems, laying foundational principles for empirical simulation in astronomy. A significant advancement in analog simulation occurred in ancient Greece with the Antikythera mechanism, dated to approximately 100 BCE, which functioned as a mechanical device to model the motions of the Sun, Moon, and planets. This geared mechanism, recovered from a shipwreck, incorporated epicyclic gearing to replicate astronomical cycles, including the Metonic and Saros periods, providing predictive outputs for eclipses and calendar alignments. Its complexity highlights early engineering efforts to simulate dynamic systems through physical interconnections, bridging observational data with mechanical representation. In the 19th century, computational modeling gained traction with Charles Babbage's Difference Engine, proposed in 1822 as a mechanical device for automatically generating mathematical tables of polynomial functions. Designed to eliminate human error in logarithmic and astronomical calculations, the engine used finite difference methods and mechanical levers to perform iterative computations, marking a shift toward programmable simulation of numerical processes. Concurrently, naval architecture advanced through William Froude's development of scaled ship hull models in the 1860s, which enabled hydrodynamic simulations via towing tank experiments to predict full-scale vessel resistance and propulsion efficiency. Froude's similitude laws allowed these physical models to extrapolate fluid dynamic behaviors, revolutionizing ship design by simulating real-world interactions under controlled conditions. The early 20th century saw further progress in analog computing with Vannevar Bush's differential analyzer, completed in 1931 at MIT, which mechanically solved ordinary differential equations central to problems like structural vibrations and electrical networks. This room-sized analog computer integrated variables through wheel-and-disc mechanisms and shaft rotations, simulating continuous dynamic systems far beyond manual capabilities and influencing fields from electrical engineering to ballistics. During World War II, operations research (OR) teams applied simulation techniques to optimize military logistics, using mathematical models and probabilistic analyses to forecast supply chain efficiencies and convoy routing. These efforts, often involving scenario-based simulations of resource allocation under uncertainty, reduced shipping losses and improved strategic planning across Allied forces. In parallel, Stanislaw Ulam and John von Neumann pioneered the Monte Carlo method in the 1940s for nuclear simulations during the Manhattan Project, employing random sampling to model neutron diffusion and criticality in chain reactions where deterministic solutions were intractable. This probabilistic approach simulated particle behaviors through iterative statistical trials, providing essential insights into reactor design and atomic bomb feasibility. A pivotal theoretical contribution came from Norbert Wiener in 1948 with the introduction of cybernetics, which formalized feedback mechanisms in modeling complex systems across biology, engineering, and computation. Wiener's framework emphasized circular causality and self-regulation through feedback loops, enabling simulations of adaptive behaviors in both mechanical servos and living organisms, thus unifying disparate modeling traditions.

Emergence as a discipline

The advent of digital computers in the post-World War II era marked a pivotal shift toward formalizing modeling and simulation as an interdisciplinary field, transitioning from analog and manual methods to computationally driven approaches capable of handling complex, large-scale problems. The UNIVAC I, delivered to the U.S. Census Bureau in 1951, represented one of the first commercial general-purpose electronic computers, enabling early digital simulations in areas such as data processing and predictive modeling, which laid the groundwork for broader scientific applications. Concurrently, John von Neumann's foundational work on cellular automata during the late 1940s and 1950s provided a theoretical framework for self-reproducing systems and emergent behaviors, influencing computational modeling by demonstrating how simple rules could generate complex simulations; his ideas, initially presented at the 1948 Hixon Symposium and elaborated in posthumously published lectures, underscored the potential of automata for simulating biological and logical processes. These developments, amid the rapid proliferation of computing hardware, addressed the limitations of pre-digital methods and fostered the recognition of simulation as a tool for scientific inquiry beyond wartime applications. In the 1960s and 1970s, institutional efforts solidified modeling and simulation's status as a distinct field, with the formation of dedicated organizations and recurring forums for knowledge exchange. The Society for Computer Simulation (SCS), founded in 1952 by John McLeod as the Simulation Councils to promote analog and early digital simulation practices, expanded internationally and evolved into a key hub for interdisciplinary collaboration by the 1970s. The inaugural Winter Simulation Conference in 1967, initially focused on discrete-event simulation using tools like GPSS, became a cornerstone event, attracting practitioners from industry, academia, and government to share advancements and standardize methodologies. NATO's establishment of a Science Committee in 1958 further supported simulation research through funding and collaborative programs, facilitating cross-national efforts in computational modeling for defense and scientific purposes. These initiatives addressed the growing need for shared practices as simulation techniques proliferated across sectors. The 1980s and 1990s witnessed explosive growth driven by integrations with high-performance computing and artificial intelligence, expanding simulation's scope and scalability. Advances in parallel processing and supercomputers, such as those from Cray Research in the 1980s, enabled multifaceted simulations in physics and engineering, while AI techniques like expert systems began incorporating simulation for decision-making and training. The Winter Simulation Conference continued as a premier venue, evolving to cover these integrations and attracting thousands of participants by the 1990s. Standardization efforts intensified to manage interoperability, culminating in the U.S. Department of Defense's approval of the High Level Architecture (HLA) baseline in 1996, which defined a framework for distributed simulations across diverse systems. Entering the 2000s, modeling and simulation gained formal recognition as the "third pillar" of scientific inquiry, complementing theory and experimentation by enabling virtual exploration of complex phenomena unattainable through traditional means. This paradigm, emphasized in the literature, highlighted simulation's role in hypothesis testing and discovery across disciplines. IEEE's adoption of the HLA as Standard 1516 in 2000 further entrenched interoperability standards, promoting reusable components amid rapid technological change. Ongoing challenges, including the need for verifiable models and interoperable tools in high-performance environments, drove continued refinements to ensure reliability and reproducibility in an era of accelerating computational power.

Types of models

Physical and analog models

Physical and analog models are tangible representations of real-world systems, constructed using physical materials to replicate the essential properties and behaviors of the original system for analysis, testing, and demonstration. These models emphasize similarity principles to ensure that the scaled-down version behaves proportionally to the actual system: geometric similarity maintains proportional shapes and dimensions, kinematic similarity preserves motion patterns, and dynamic similarity balances forces such as inertia, gravity, and viscosity. Typically built from materials like wood, metal, plastics, or fluids, they allow direct interaction and visualization without relying on computational abstraction, making them particularly useful in disciplines where empirical validation is key. Common types of physical and analog models include structural models, fluid dynamic models, and biomechanical models. Structural models, such as scaled prototypes of bridges, are used to assess load-bearing capacity and deformation under stress; for instance, laboratory-scale bridge girders made from concrete or composite replicas help predict failure modes before full-scale construction. Fluid dynamic models, exemplified by wind tunnel setups, involve scaled aircraft or vehicle shapes tested in controlled airflow to study aerodynamic forces and drag. Biomechanical models, like anthropomorphic test devices (crash test dummies), simulate human body responses to impacts, incorporating sensors to measure forces on skeletal and soft tissues during collision tests. The design of these models relies on scaling laws derived from dimensional analysis to ensure similitude between model and prototype. A key principle is dynamic similarity, achieved by matching dimensionless numbers that govern the physics. For systems involving free-surface flows dominated by gravity, such as ship hulls in waves, the Froude number is critical: Fr = \frac{v}{\sqrt{gL}} where v is the characteristic velocity, g is gravitational acceleration, and L is the characteristic length. This number arises from balancing inertial forces (\rho v^2 L^2) against gravitational forces (\rho g L^3) in the momentum equation, ensuring that wave patterns and resistance scale correctly; velocities in the model are thus adjusted as v_m = v_p \sqrt{\lambda}, where \lambda is the length scale ratio. In ship model testing, maintaining equal Froude numbers allows prediction of full-scale wave-making resistance from tow-tank experiments. Physical and analog models offer intuitive visualization of complex phenomena, enabling hands-on experimentation and qualitative insights that are difficult to obtain from abstract representations, while also providing quantitative data through instrumentation. However, they are often costly to fabricate and modify due to material and labor requirements, and scalability is limited by practical constraints like facility size or material availability, restricting them to specific regimes where full similitude cannot always be achieved. A historical example is the Wright brothers' use of glider models in the early 1900s; they constructed and flew small-scale wooden gliders to test aerodynamic performance and control surfaces, iteratively refining designs based on observed flight behaviors before achieving powered flight in 1903.
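As a worked example of Froude scaling (the lengths and speeds below are illustrative, not from an actual test program), the following sketch computes the towing speed for a 1:25 model so that model and prototype share the same Froude number:
import math

g = 9.81                      # gravitational acceleration, m/s^2
L_p, v_p = 150.0, 10.0        # hypothetical prototype: length (m), speed (m/s)
scale = 1 / 25                # length scale ratio lambda = L_m / L_p

L_m = L_p * scale
v_m = v_p * math.sqrt(scale)  # equal Froude number requires v_m = v_p * sqrt(lambda)

Fr_p = v_p / math.sqrt(g * L_p)
Fr_m = v_m / math.sqrt(g * L_m)
print(f"model length {L_m:.1f} m, tow speed {v_m:.2f} m/s")
print(f"Froude numbers match: {Fr_p:.4f} vs {Fr_m:.4f}")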

Mathematical and computational models

Mathematical and computational models abstractly represent complex systems through equations, algorithms, and data structures, enabling analysis without physical prototypes. These models form a structured hierarchy, beginning with the conceptual model—a high-level, qualitative description of the system's key elements and relationships—progressing to the mathematical model, which formalizes these using precise equations and logical relations, and culminating in the computational model, where the mathematical formulation is translated into executable code or algorithms for simulation. This hierarchy ensures that abstractions remain grounded in real-world phenomena while facilitating iterative refinement. Mathematical models span deterministic approaches, which yield unique outputs for fixed inputs via equations without randomness, to stochastic processes that incorporate probabilistic elements to capture uncertainty and variability in system behavior. A prominent type is the differential equation model for continuous systems, expressed as \frac{dx}{dt} = f(x, t), where x denotes the state and t time; these are often solved numerically using methods like Euler's method, x_{n+1} = x_n + h f(x_n, t_n), with h as the time step, to approximate trajectories over time. Agent-based models, conversely, simulate decentralized systems by modeling individual agents with autonomous rules and interactions, leading to emergent macroscopic patterns from local behaviors. Central to these models are distinctions between variables, which represent dynamic states that evolve during simulation, and parameters, which are fixed coefficients defining the system's inherent properties. Sensitivity analysis evaluates how variations in parameters or inputs affect outputs, often employing partial derivatives to quantify local impacts, such as \frac{\partial y}{\partial p} for output y and parameter p. These models offer advantages in scalability, allowing simulation of vast, intricate systems beyond physical constraints, and repeatability, as identical inputs produce consistent results for verification. However, they rely on simplifying assumptions that may overlook nuanced real-world complexities, potentially leading to inaccurate predictions if not carefully calibrated. A classic example is the Lotka-Volterra predator-prey model, which uses coupled ordinary differential equations to describe population oscillations in ecological systems: \frac{dx}{dt} = \alpha x - \beta x y \qquad \frac{dy}{dt} = \delta x y - \gamma y Here, x(t) and y(t) are prey and predator populations, respectively; \alpha is the prey growth rate, \beta the predation rate, \delta the predator's growth efficiency from consuming prey, and \gamma the predator death rate. Developed independently by Alfred Lotka in 1920 (expanded in 1925) and Vito Volterra in 1926, this deterministic framework illustrates cyclic dynamics where prey surges boost predators, which then deplete prey, allowing recovery—providing foundational insights into biological interactions despite idealized assumptions like constant rates.
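The progression from mathematical to computational model can be illustrated by integrating the Lotka-Volterra equations with the Euler update above; a minimal sketch with illustrative rate constants:
# Euler integration of the Lotka-Volterra predator-prey model.
alpha, beta, delta, gamma = 1.1, 0.4, 0.1, 0.4   # illustrative rates
x, y = 10.0, 5.0                                 # initial prey, predators
h, steps = 0.001, 50000                          # step size, number of steps

for n in range(steps):
    dx = alpha * x - beta * x * y                # prey growth minus predation
    dy = delta * x * y - gamma * y               # predator growth minus death
    x, y = x + h * dx, y + h * dy                # Euler update for both states

print(f"after t = {h * steps:.0f}: prey = {x:.2f}, predators = {y:.2f}")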

Simulation methods

Discrete-event simulation

Discrete-event simulation (DES) models systems where changes in state occur only at discrete points in time, known as events, rather than continuously. Between events, the system state remains unchanged, allowing efficient simulation of irregular, event-driven dynamics such as queueing or stochastic processes. This approach is particularly suited for systems with asynchronous occurrences, where time advances in jumps to the next relevant event, ignoring idle periods. The core mechanism of DES relies on event list management, a data structure that stores scheduled future events ordered by their timestamps. The simulation clock advances discontinuously to the time of the earliest event in the list; upon processing, the event updates the system state instantaneously, and any consequent events (e.g., resource releases) are added to the list. This next-event time advance ensures chronological order and computational efficiency, as the clock does not increment during state updates. Key components of DES include entities, which represent dynamic objects like jobs or customers moving through the system; resources, such as servers or machines with limited capacity that process entities; and queues, which hold entities awaiting resource availability. Variability is introduced through random variates, often modeling arrivals as a Poisson process, where the probability of k arrivals in interval t follows P(k) = \frac{(\lambda t)^k e^{-\lambda t}}{k!}, with \lambda denoting the average arrival rate per unit time. This stochastic element captures real-world uncertainty in event timings. The next-event time advance algorithm drives the simulation loop. A basic pseudocode outline is:
Initialize: Set clock t = 0; schedule initial events; initialize event list and system state
While termination condition not met:
    Remove earliest event from event list
    Advance clock: t = event.time
    Execute event routine to update system state
    Schedule any new future events into the event list
End while
For a single-server queue example, the loop can be specialized as follows (adapted for arrivals and completions):
l = 0; t = 0.0; ta = GetArrival(); tc = ∞;
While (ta < τ or l > 0):
    t = min(ta, tc)
    If t == ta:  // Arrival event
        l = l + 1
        ta = GetArrival()
        If ta > τ: ta = ∞
        If l == 1: tc = t + GetService()
    Else:  // Completion event
        l = l - 1
        If l > 0: tc = t + GetService()
        Else: tc = ∞
    End if
End while
Here, l tracks the number of customers in the system, \tau is the run length, ta and tc are the next arrival and completion times, and GetArrival()/GetService() generate exponentially distributed times. DES finds application in manufacturing lines, where it simulates production sequences to identify bottlenecks, and in network traffic, modeling packet flows to evaluate congestion control. A representative example is the single-server queue under the M/M/1 model, with arrivals at rate \lambda and exponential service times at rate \mu > \lambda. The steady-state utilization \rho = \frac{\lambda}{\mu} represents the server's long-run busy fraction, derived from the birth-death balance equations: for state n (customers in system), \lambda p_{n-1} = \mu p_n for n \geq 1, yielding p_n = (1 - \rho) \rho^n, where the server's idle probability p_0 = 1 - \rho implies utilization \rho. This metric guides stability analysis, ensuring \rho < 1 for finite queues.
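A runnable Python rendering of the single-server loop above (a sketch assuming exponentially distributed interarrival and service times, per the M/M/1 example) also tracks server busy time so the empirical utilization can be compared with \rho = \lambda / \mu:
import math, random

def mm1_simulation(lam=1.0, mu=1.25, tau=10000.0, seed=42):
    # Next-event simulation of a single-server queue, following the pseudocode.
    rng = random.Random(seed)
    get_arrival = lambda t: t + rng.expovariate(lam)   # next arrival time
    get_service = lambda: rng.expovariate(mu)          # service duration
    l, t = 0, 0.0                  # number in system, simulation clock
    ta, tc = get_arrival(0.0), math.inf
    busy = 0.0                     # accumulated busy time for utilization
    while ta < tau or l > 0:
        t_next = min(ta, tc)
        if l > 0:
            busy += t_next - t     # server is busy whenever l > 0
        t = t_next
        if t == ta:                # arrival event
            l += 1
            ta = get_arrival(t)
            if ta > tau:
                ta = math.inf
            if l == 1:
                tc = t + get_service()
        else:                      # completion event
            l -= 1
            tc = t + get_service() if l > 0 else math.inf
    return busy / t                # empirical utilization, approx. lam/mu

print(mm1_simulation())  # close to rho = 1.0 / 1.25 = 0.8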

Continuous and hybrid simulation

Continuous simulation models systems that evolve smoothly over continuous time, typically represented by ordinary differential equations (ODEs) or partial differential equations (PDEs) derived from physical laws. These methods approximate solutions through numerical integration, advancing the system state in small time steps to capture gradual changes without abrupt events. A prominent technique is the fourth-order Runge-Kutta (RK4) method, which offers high accuracy for non-stiff ODEs of the form \frac{dy}{dt} = f(t, y) by evaluating the derivative at multiple intermediate points within each step. The algorithm proceeds as follows: k_1 = h f(t_n, y_n) k_2 = h f\left(t_n + \frac{h}{2}, y_n + \frac{k_1}{2}\right) k_3 = h f\left(t_n + \frac{h}{2}, y_n + \frac{k_2}{2}\right) k_4 = h f(t_n + h, y_n + k_3) y_{n+1} = y_n + \frac{1}{6}(k_1 + 2k_2 + 2k_3 + k_4) t_{n+1} = t_n + h Here, h is the step size, and the weighted average of the k_i terms provides a fourth-order approximation to the exact solution. This explicit method is favored for its simplicity and local error control, though adaptive step sizing may be incorporated to balance precision and efficiency. Hybrid simulation integrates continuous dynamics with discrete events, enabling the modeling of systems where smooth flows are interrupted by instantaneous transitions, such as threshold crossings or mode switches. In these approaches, continuous integration proceeds via time-stepping until a state-event detection algorithm identifies a discontinuity in the state derivatives, triggering an event handler to update the model parameters or structure. For instance, the event location is pinpointed by interpolating between steps and solving for the zero-crossing of a guard condition, ensuring accurate reinitialization of the ODE solver. This combination is essential for cyber-physical systems like control algorithms embedded in physical plants. Central to continuous and hybrid methods is the distinction between fixed time-stepping, which uniformly samples time for integration, and event-based advancement, which only updates at detected changes in hybrid cases. Stiffness arises in equations with disparate timescales, such as \frac{dy}{dt} = -k y where large k demands minuscule steps for stability in explicit solvers, potentially leading to inefficiency; implicit methods like backward differentiation formulas are often employed to mitigate this by allowing larger steps. These simulations excel in representing physical processes with inherent continuity, such as electrical circuits governed by Kirchhoff's laws or fluid flows via Navier-Stokes equations, providing detailed insights into transient behaviors unattainable through algebraic models alone. However, their computational demands are high, as accuracy requires fine discretization, particularly for PDEs discretized into large systems or in stiff scenarios, often necessitating specialized solvers or parallel computing. A representative application is the simulation of a continuous stirred-tank reactor (CSTR), where the mass balance for reactant concentration C follows from conservation principles: inflow and outflow terms balance the reaction consumption, yielding \frac{dC}{dt} = r(C) - \frac{F}{V} C, with r(C) as the reaction rate function, F the volumetric flow rate, and V the reactor volume. Integrating this equation numerically predicts steady-state conversion and dynamic responses to perturbations like feed changes.
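The RK4 update translates directly into code; the sketch below (applied to the decay equation dy/dt = -k y discussed above, chosen because its exact solution e^{-kt} allows an accuracy check) advances one step at a time with fixed step size h:
import math

def rk4_step(f, t, y, h):
    # One fourth-order Runge-Kutta step for dy/dt = f(t, y).
    k1 = h * f(t, y)
    k2 = h * f(t + h / 2, y + k1 / 2)
    k3 = h * f(t + h / 2, y + k2 / 2)
    k4 = h * f(t + h, y + k3)
    return y + (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Integrate dy/dt = -k*y from t = 0 to 1; exact solution is exp(-k*t).
k, y, t, h = 2.0, 1.0, 0.0, 0.01
f = lambda t, y: -k * y
while t < 1.0:
    y = rk4_step(f, t, y, h)
    t += h
print(y, math.exp(-2.0))  # numerical vs. exact solution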

Applications

In science and engineering

In physics and chemistry, modeling and simulation play pivotal roles in predicting complex phenomena that are difficult or impossible to observe directly. General circulation models (GCMs) are widely used in climate science to simulate atmospheric and oceanic dynamics, solving the Navier-Stokes equations of fluid motion to capture variables such as velocity, pressure, temperature, and density. These equations, expressed as \rho \left( \frac{\partial \mathbf{v}}{\partial t} + \mathbf{v} \cdot \nabla \mathbf{v} \right) = -\nabla p + \mu \nabla^2 \mathbf{v} + \mathbf{f}, form the core of full fluid dynamics simulations in GCMs, enabling forecasts of global climate patterns and responses to greenhouse gas emissions. In quantum chemistry, density functional theory (DFT) facilitates simulations of molecular electronic structures, approximating the many-body Schrödinger equation through electron density functionals to predict properties like energy levels and reaction pathways for systems up to thousands of atoms. In engineering disciplines, simulations support design optimization and performance evaluation under extreme conditions. Finite element analysis (FEA) is a cornerstone method for structural analysis, discretizing complex geometries into elements to solve partial differential equations governing material behavior, including the linear stress-strain relation \sigma = E \epsilon, where \sigma is stress, E is the modulus of elasticity, and \epsilon is strain, to assess material deformation and failure in bridges, vehicles, and machinery. In aerospace engineering, flight simulations model aircraft dynamics using six-degree-of-freedom equations integrated with aerodynamic, propulsion, and control models, allowing virtual testing of maneuvers, stability, and mission profiles before physical prototypes. Biological and medical applications leverage simulations for physiological processes and therapeutic design. Pharmacokinetic models, such as the one-compartment model, describe drug absorption, distribution, metabolism, and elimination by assuming instantaneous mixing in a single body compartment, governed by the first-order equation \frac{dC}{dt} = -k C with solution C(t) = C_0 e^{-kt}, where C is concentration, k is the elimination rate constant, and C_0 is the initial concentration; this aids in dosing regimens for antibiotics and chemotherapeutics. These simulations extend to hypothesis testing and accelerating scientific discoveries, particularly in high-energy physics. At CERN, simulations using toolkits like Geant4 model particle interactions in detectors, predicting signal signatures and backgrounds to validate hypotheses, such as the Higgs boson's decay channels, which contributed to its 2012 discovery by enabling precise comparison of simulated events with experimental data. Recent advances integrate machine learning to enhance simulation fidelity in drug discovery. Post-2020 developments, including AlphaFold's protein structure predictions, have revolutionized molecular simulations by achieving near-experimental accuracy in structure prediction, facilitating virtual screening of drug candidates against targets like kinases and enabling faster identification of novel inhibitors.
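The one-compartment pharmacokinetic model admits a short worked example (the dose and rate constant are illustrative, not clinical values), evaluating C(t) = C_0 e^{-kt} and the implied half-life t_{1/2} = \ln 2 / k:
import math

C0 = 10.0        # initial plasma concentration, mg/L (illustrative)
k = 0.3          # elimination rate constant, 1/h (illustrative)

half_life = math.log(2) / k            # t_1/2 = ln(2) / k
for t in range(0, 13, 4):              # concentrations over 12 hours
    C = C0 * math.exp(-k * t)          # C(t) = C0 * exp(-k t)
    print(f"t = {t:2d} h: C = {C:.2f} mg/L")
print(f"half-life = {half_life:.2f} h")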

In social sciences and business

In social sciences and business, modeling and simulation address the complexities of human behavior, economic systems, and organizational dynamics, often incorporating stochastic elements to handle uncertainty and test policy interventions. These approaches enable researchers and practitioners to explore emergent phenomena, such as market crashes or social contagions, without real-world experimentation, providing insights into decision-making under incomplete information. Agent-based models (ABMs), for instance, simulate interactions among heterogeneous agents to reveal macro-level patterns from micro-level rules, while equation-based models like dynamic stochastic general equilibrium (DSGE) frameworks capture equilibrium dynamics in economies. In economics, ABMs have been pivotal for studying market behaviors, particularly through the Santa Fe Institute's foundational work on emergent properties in artificial stock markets. Developed in the 1990s, the Santa Fe Artificial Stock Market simulates traders with adaptive expectations and learning algorithms, demonstrating how heterogeneous beliefs lead to stylized facts like fat-tailed price distributions and volatility clustering, without assuming rational expectations. This approach contrasts with traditional equilibrium models by emphasizing out-of-equilibrium dynamics and has influenced analyses of financial instability. Complementing ABMs, DSGE models in the New Keynesian framework integrate nominal rigidities and stochastic shocks to evaluate monetary policy. A core component is the production function Y_t = A_t K_t^\alpha L_t^{1-\alpha}, where Y_t is output, A_t total factor productivity, K_t capital, L_t labor, and \alpha the capital share; this Cobb-Douglas form underpins the aggregate supply relation in full New Keynesian setups, allowing simulations of business cycles and policy responses like interest rate rules. Seminal implementations, such as those incorporating habit formation and adjustment costs, have been estimated using Bayesian methods to forecast inflation and output gaps. Social sciences leverage simulation for modeling human interactions and societal processes, with epidemiological models exemplifying the simulation of disease spread amid behavioral uncertainties. The susceptible-infected-recovered (SIR) model, introduced by Kermack and McKendrick, divides a population into compartments and derives differential equations to predict epidemic trajectories. The key equations are: \frac{dS}{dt} = -\beta S I, \quad \frac{dI}{dt} = \beta S I - \gamma I, \quad \frac{dR}{dt} = \gamma I, where S, I, and R are the proportions susceptible, infected, and recovered; \beta the transmission rate; and \gamma the recovery rate. Derived from mass-action principles assuming homogeneous mixing, these equations yield the basic reproduction number R_0 = \beta / \gamma, enabling simulations to assess intervention efficacy, such as vaccination thresholds where herd immunity requires 1 - 1/R_0 coverage. Extensions incorporate population heterogeneity and behavioral responses, informing policies during outbreaks. In business contexts, simulations optimize operational flows and quantify risks in uncertain environments. Supply-chain models, often using discrete-event simulation, replicate inventory, logistics, and demand fluctuations to identify bottlenecks and test resilience strategies, such as just-in-time versus safety-stock policies. For instance, multi-agent frameworks simulate supplier-buyer interactions to minimize costs while accounting for disruptions like delivery delays, achieving up to 20% efficiency gains in calibrated scenarios.
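A compact sketch (with illustrative \beta and \gamma; state variables are population proportions) integrates the SIR equations with Euler steps and reports the basic reproduction number and final epidemic size:
# Euler integration of the SIR model with population proportions.
beta, gamma = 0.5, 0.1          # transmission and recovery rates (illustrative)
S, I, R = 0.99, 0.01, 0.0       # initial proportions, summing to one
h, days = 0.1, 160

print(f"R0 = {beta / gamma:.1f}")
for step in range(int(days / h)):
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    S, I, R = S + h * dS, I + h * dI, R + h * dR
print(f"final susceptible {S:.3f}, recovered {R:.3f}")  # epidemic size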
Risk analysis employs Monte Carlo methods to propagate uncertainties through probabilistic inputs, generating distributions of outcomes like project costs or portfolio losses; variance reduction techniques, including importance sampling and stratified sampling, enhance precision by focusing simulations on rare events, reducing computational variance by orders of magnitude compared to naive sampling. These tools support decision-making in volatile markets, from hedging derivatives to stress-testing corporate strategies. Notable applications include simulations of the 2008 financial crisis, where ABMs replicated fire-sale spirals and contagion effects among banks and investors, highlighting how leverage amplification led to systemic failures despite individually rational behavior. Agent-based reconstructions showed that interconnected balance sheets amplified shocks, informing post-crisis regulations like capital buffers. In urban planning, cellular automata (CA) models simulate land-use evolution as grid-based transitions driven by neighborhood rules, predicting sprawl patterns; for example, constrained CA variants incorporate zoning and transport accessibility to forecast sustainable growth, with validations showing over 80% accuracy in historical fits for large cities. Ethical concerns in these simulations have intensified by 2025, particularly biases in models incorporating machine learning for social forecasting, where underrepresentation of demographics—such as older women or ethnic minorities—leads to skewed predictions in labor market or policy simulations. Studies reveal that training data imbalances perpetuate algorithmic bias, resulting in higher error rates for marginalized groups in hiring and policy projections, underscoring the need for diverse datasets and fairness audits to mitigate discriminatory outcomes.
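A minimal Monte Carlo sketch of this kind of risk analysis (the three uncertain cost inputs and their distributions are hypothetical) propagates input uncertainty to an output distribution and reads off a tail percentile:
import random
import statistics

random.seed(0)
N = 100_000
totals = []
for _ in range(N):
    labor = random.gauss(500, 50)           # uncertain labor cost
    materials = random.uniform(200, 400)    # uncertain material cost
    delay = random.expovariate(1 / 30)      # occasional long delays, mean 30
    totals.append(labor + materials + delay)

totals.sort()
mean = statistics.fmean(totals)
p95 = totals[int(0.95 * N)]                 # 95th-percentile total cost
print(f"mean cost {mean:.0f}, 95% worst case {p95:.0f}")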

Tools and methodologies

Simulation software and languages

Simulation software provides the computational infrastructure for implementing mathematical and computational models, enabling the execution of simulations across diverse applications. General-purpose tools like MATLAB, developed by MathWorks, offer extensive capabilities for numerical computing and model analysis, while its companion product Simulink specializes in graphical modeling and simulation of continuous dynamical systems using block diagrams to represent multi-domain physical components such as electrical, mechanical, and hydraulic systems. Simulink's integration with MATLAB allows for seamless algorithm development and deployment, making it a staple in simulations where differential equations govern system behavior. In the open-source domain, Python has emerged as a versatile alternative, supported by libraries tailored to specific simulation needs. SciPy, a core scientific computing package, includes advanced solvers for ordinary differential equations (ODEs), such as the solve_ivp function based on methods like LSODA and Radau, facilitating the integration of continuous models in fields like physics and engineering. Complementing this, SimPy is a process-based library for discrete-event simulation (DES), allowing users to model systems through asynchronous processes, resources, and events without low-level threading management, ideal for queueing and logistics simulations. Specialized commercial software addresses domain-specific challenges with integrated environments. AnyLogic supports multimethod and hybrid simulations by combining DES, agent-based modeling, and system dynamics in a single platform, enabling users to experiment with complex interactions in supply chains, healthcare, and urban planning through Java-based extensibility and 3D visualization. Arena, from Rockwell Automation, focuses on DES for manufacturing and business processes, featuring drag-and-drop modules for modeling production flows, material handling, and statistical analysis to optimize throughput and reduce bottlenecks. Modeling languages standardize the representation of systems prior to implementation in software. The Unified Modeling Language (UML), maintained by the Object Management Group (OMG), facilitates conceptual modeling through diagrams like use case, class, and sequence views, serving as a bridge to translate high-level designs into simulatable artifacts. SysML extends UML for systems engineering, incorporating parametric diagrams for constraint-based modeling and allocation tables to link requirements with behavioral and structural elements, widely used in aerospace and automotive industries. For physical systems, Modelica is a non-proprietary, object-oriented language that employs acausal, equation-based modeling, where users declare relationships like f_1(x) = 0, f_2(x) = 0 without prescribing solution variables, supporting multi-domain simulations via tools like OpenModelica. The latest Modelica Standard Library (version 4.0.1, released May 2025) supports Modelica language version 3.6, with tools like OpenModelica 1.24.0 (Q4 2024) and Dymola 2025x providing advanced simulation capabilities. Contemporary trends in simulation software emphasize scalability and accessibility through cloud computing and open-source ecosystems. AWS SimSpace Weaver, introduced in 2022 but with end of support on May 20, 2026 (no new customers accepted after May 20, 2025), was a managed service for orchestrating large-scale, real-time spatial simulations in the cloud, integrating game engines like Unreal Engine for city-scale modeling while handling partitioning and synchronization automatically.
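As an illustration of SimPy's process-based style, the sketch below models a single-server queue with the library's Environment, Resource, and timeout primitives (arrival and service rates are illustrative):
import random
import simpy

def customer(env, name, counter, service_rate):
    # A customer requests the counter, is served, then leaves.
    arrival = env.now
    with counter.request() as req:
        yield req                                  # wait for the server
        wait = env.now - arrival
        yield env.timeout(random.expovariate(service_rate))
        print(f"{name}: waited {wait:.2f}, done at {env.now:.2f}")

def arrivals(env, counter, arrival_rate, service_rate):
    # Generate customers with exponential interarrival times.
    i = 0
    while True:
        yield env.timeout(random.expovariate(arrival_rate))
        i += 1
        env.process(customer(env, f"cust{i}", counter, service_rate))

random.seed(1)
env = simpy.Environment()
counter = simpy.Resource(env, capacity=1)       # single server
env.process(arrivals(env, counter, arrival_rate=1.0, service_rate=1.25))
env.run(until=20)                                # simulate 20 time units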
As of 2025, trends include AI-powered modeling for automated model generation and optimization, surrogate models for accelerated computations, digital twins with real-time data integration, and enhanced multimethod approaches. Following 2020, the proliferation of open-source tools has accelerated, with frameworks like Pyomo for optimization-based simulations and community-driven extensions to SimPy gaining traction due to cost-effectiveness and collaborative development, as evidenced by increased adoption in academic and industrial research. Selecting appropriate simulation software involves evaluating factors aligned with project demands. Ease of use is critical for intuitive interfaces that minimize learning curves, such as graphical editors in Simulink or AnyLogic, allowing non-programmers to build models quickly. Scalability ensures handling of model complexity, from small prototypes to enterprise-level computations, often via parallel processing or distributed architectures in cloud platforms. Integration with external data sources, including real-time APIs for sensor inputs or databases, enhances model fidelity and supports data-driven simulations, as seen in the Python ecosystem's compatibility with data-analysis tools.

Verification, validation, and accreditation

Verification ensures that a model or simulation implementation and its associated data accurately represent the developer's conceptual description and specifications. Validation determines the degree to which a model or simulation and its associated data represent the real world from the perspective of the intended uses. Accreditation certifies that a model or simulation and its associated data are acceptable for a specific purpose or use. Common methods for validation include face validation, statistical tests, and sensitivity analysis. Face validation involves subject matter experts reviewing the model's logic and output behavior for reasonableness. Statistical tests, such as the chi-square goodness-of-fit test, compare observed simulation outputs to expected real-world distributions, computed as \chi^2 = \sum \frac{(O_i - E_i)^2}{E_i}, where O_i are observed frequencies and E_i are expected frequencies. Sensitivity analysis examines how changes in input parameters or model assumptions affect outputs to identify critical factors and assess robustness. Standards guide these processes, including the U.S. Department of Defense (DoD) VV&A framework, first established in DoD Instruction 5000.61 in 1996 and updated in subsequent revisions such as the 2024 version, which prescribes policies and procedures for VV&A in military modeling and simulation. The IEEE 1012 standard, most recently revised in 2024, provides processes for system, software, and hardware verification and validation throughout the life cycle. Challenges in VV&A include uncertainty quantification, particularly in parameter estimation, where Bayesian approaches update probability distributions of model parameters based on observed data to propagate uncertainties through simulations. An example workflow in military contexts uses Turing-test analogs, where experts attempt to distinguish real system data from simulation outputs to confirm behavioral fidelity.
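As a sketch of the statistical comparison (the binned counts are synthetic, for illustration only), SciPy's chisquare function computes the \chi^2 statistic above for observed simulation outputs against expected real-world frequencies:
from scipy.stats import chisquare

# Observed counts from binned simulation output vs. expected real-world counts.
observed = [18, 32, 29, 21]          # synthetic example frequencies
expected = [20, 30, 30, 20]          # must sum to the same total (100)

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {stat:.3f}, p = {p_value:.3f}")
# A large p-value fails to reject agreement between model and data.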

Education and professional resources

Academic programs

Formal education in modeling and simulation (M&S) encompasses undergraduate, graduate, and doctoral programs that integrate computational, mathematical, and domain-specific principles to prepare students for interdisciplinary careers. At the bachelor's level, Old Dominion University (ODU) pioneered the first undergraduate major in modeling and simulation engineering within its engineering program, emphasizing practical applications in areas such as virtual environments. Graduate programs, such as the MS and PhD in Modeling, Virtual Environments, and Simulation (MOVES) at the Naval Postgraduate School (NPS), build on this foundation by offering advanced training tailored to defense and operational needs, with the MS typically spanning eight quarters and the PhD requiring a prior master's in a related field. Curricula in these programs center on foundational topics including probability and statistics for uncertainty modeling, programming for simulation implementation, and systems theory for holistic analysis, often progressing to electives in specialized domains such as healthcare simulation or visual analytics. For instance, the NPS MOVES MS curriculum covers fundamentals of M&S, data analysis, visual simulation, and intelligent systems, while ODU's MS in Modeling and Simulation Engineering includes an overview of M&S methodologies and domain-specific applications like engineering systems. These programs prioritize hands-on projects to develop skills in discrete-event, continuous, and hybrid simulations, ensuring graduates can address real-world complexities. Professional certifications complement degree programs by validating expertise for industry roles. The Certified Modeling & Simulation Professional (CMSP), introduced in 2002 by the National Training & Simulation Association (NTSA), requires candidates to demonstrate education, experience, and proficiency through examination, covering M&S fundamentals, standards, and applications. It remains the primary U.S. credential for M&S professionals. Globally, M&S education has expanded since 2010, with the U.S. hosting established programs at institutions like ODU and NPS, and programs in Europe and Asia offering degrees focused on computational modeling and multiscale analysis. This growth includes online options, such as Arizona State University's MS in Modeling and Simulation, which has increased accessibility for working professionals since its launch in the mid-2010s. Industry ties enhance these academic programs through co-ops and partnerships, particularly with agencies such as NASA for aerospace simulations and with defense contractors for secure modeling projects, fostering practical experience in high-stakes environments. As of 2025, programs are addressing skills gaps in AI integration for M&S, with collaborations emphasizing machine learning for predictive simulations in defense and space applications to meet evolving demands. Recent developments include expanded AI-focused curricula at institutions like NPS, incorporating machine learning modules for advanced simulation as of November 2025.

Modeling and Simulation Body of Knowledge

The Modeling and Simulation Body of Knowledge (MSBK) represents a structured initiative by the Society for Modeling and Simulation International (SCS) to consolidate and standardize the essential knowledge, concepts, and practices within the modeling and simulation (M&S) discipline. Launched as part of efforts to unify the field and support professional development, the MSBK organizes M&S knowledge into interconnected domains, including the model lifecycle, simulation techniques, and interdisciplinary applications. This framework serves as a foundational reference for practitioners, educators, and researchers, promoting consistency in terminology, methodologies, and competencies across diverse sectors. The initial development involved contributions from SCS members and affiliates, building on earlier prototypes to create a comprehensive compendium that addresses both theoretical underpinnings and practical implementation. At its core, the MSBK delineates key areas essential to M&S proficiency. Foundations encompass mathematical and statistical principles, such as probability theory, differential equations, and systems dynamics, which underpin model formulation and analysis. Processes cover the full spectrum of M&S activities, from requirements acquisition and model development to experimentation, analysis, and maintenance, emphasizing iterative lifecycles and agile adaptations. Applications span critical domains, including defense systems for training and wargaming, as well as healthcare for outcome prediction and optimization, highlighting M&S's role in decision support and innovation. These areas are interconnected, with foundations informing processes and enabling domain-specific adaptations. The MSBK employs a structure aligned with cognitive development frameworks like Bloom's taxonomy to facilitate progressive learning and application. Each level incorporates defined competencies, such as designing verifiable models or interpreting simulation outputs, accompanied by curated references to seminal works and standards for deeper exploration. This approach ensures scalability, allowing users to build expertise from introductory principles to specialized implementations. Subsequent revisions have expanded the MSBK to incorporate emerging technologies, with post-2020 updates integrating data science techniques—such as machine learning for model calibration—and virtual/augmented reality (VR/AR) for immersive simulation environments, reflecting the discipline's evolution toward hybrid and data-driven paradigms. The framework is freely accessible through the SCS website, where users can download the core index and supporting materials, including taxonomies and errata, to promote widespread adoption. Overall, the MSBK significantly impacts the field by guiding academic curricula and professional certifications, such as the Certified Modeling and Simulation Professional (CMSP), while bridging gaps between academia, industry, and government through standardized knowledge dissemination and collaborative updates.

  56. [56]
    [PDF] Chapter 5 – M/M/1 Queuing Systems - KTH
    Nov 11, 2012 · 1. We first derive the state distribution (steady-state) of this system through the solution of the balance equations. We define ρ = λ/(µC ...
  57. [57]
    [PDF] Continuous Systems Lecture: Numerical Integration of ODEs | EECS ...
    Generalization: Runge-Kutta Methods. A Runge-Kutta method is given by x1 = x0 + h(b1k1 + ... + bsks) where k1. = f(t0, x0) k2. = f(t0 + c2h, x0 + ha21k1) k3.
  58. [58]
    [PDF] A State Event Detection Algorithm for Numerically Simulating Hybrid ...
    This paper describes an algorithm for detecting the occurrence of events, which signify discontinu- ities in the first derivative of the state variables, ...
  59. [59]
    (PDF) Stiff Equations - ResearchGate
    Aug 6, 2025 · In the last time, stiff differential equations have been studied extensively and various methods for their solutions have been proposed.
  60. [60]
    Physical Simulation - an overview | ScienceDirect Topics
    Continuous modeling Continuous simulations deal with the modeling of physical events (processes, behaviors, conditions) that can be described by some set of ...
  61. [61]
    [PDF] CHAPTER 4:The Material Balance for Chemical Reactors
    Equation 4.48 is clearly a total mass balance, in which the total mass in the reactor changes in time due to the inflow and outflow of mass. Notice that ...Missing: VC | Show results with:VC
  62. [62]
    Q&A: How do climate models work? - Carbon Brief
    Jan 15, 2018 · The most important of these are the Navier-Stokes equations of fluid motion, which capture the speed, pressure, temperature and density of the ...
  63. [63]
    General Circulation Model - an overview | ScienceDirect Topics
    General circulation models (GCM)​​ A mathematical model of the general circulation of a planetary atmosphere or ocean. It is based on the Navier–Stokes equations ...
  64. [64]
    [PDF] An Introduction to Density Functional Theory
    For the past 30 years density functional theory has been the dominant method for the quantum mechanical simulation of periodic systems.
  65. [65]
    A Quantum Chemical View of Density Functional Theory
    A comparison is made between traditional quantum chemical approaches to the electron correlation problem and the one taken in density functional theory (DFT).
  66. [66]
    [PDF] Finite Element Analysis (FEA) or Finite Element Method (FEM)
    The Finite Element Analysis (FEA) is a numerical method for solving problems of engineering and mathematical physics. Useful for problems with complicated.
  67. [67]
    [PDF] Method of Finite Elements I
    Apr 30, 2010 · The course covers 2D elements, continuum elements (plane stress/strain), structural elements (plate elements), and 2D vs 3D formulations.
  68. [68]
    Modeling and Simulation of Aerospace Vehicle Dynamics, Fourth ...
    Aug 1, 2025 · The simulations include three-, five-, and six-DoF models of air-to-air, air-to-ground and surface-to-air missiles, UAVs, aircraft, hypersonic ...
  69. [69]
    Interactive Simulations | Glenn Research Center - NASA
    Jul 17, 2024 · NASA Glenn Research Center developed this collection of interactive simulation exercises to accompany our Beginners Guide to Aeronautics educational content.
  70. [70]
    [PDF] Useful Pharmacokinetic Equations - UF College of Pharmacy
    Useful Pharmacokinetic Equations. Symbols. D = dose τ = dosing interval ... For One Compartment Body Model. If the dosing involves the use of I.V. bolus.
  71. [71]
    One-compartment kinetics - PubMed
    The mathematical development of the equations needed to determine the plasma concentrations of drug in a one-compartment pharmacokinetic model
  72. [72]
    12 steps - From idea to discovery | CERN
    To understand precisely how new particles could show up and how the detectors need to perform, scientists simulate a very large number of collisions. These ...
  73. [73]
    Impact of detector simulation in particle physics collider experiments
    What a simulation does is to teach physicists what mark the Higgs boson would leave in the real detector if it were present in the data sample. For instance, ...
  74. [74]
    Geant4 - CERN
    Geant4 is a toolkit for simulating particle passage through matter, used in high energy, nuclear, accelerator, medical, and space science.Download Geant4-11.3.2 · Geant4 Documentation · Collaboration · Getting StartedMissing: discoveries | Show results with:discoveries
  75. [75]
    AlphaFold - Google DeepMind
    AlphaFold has revealed millions of intricate 3D protein structures, and is helping scientists understand how all of life's molecules interact.Missing: post- | Show results with:post-
  76. [76]
    Chemistry Nobel goes to developers of AlphaFold AI that predicts ...
    Oct 9, 2024 · This year's prize celebrates computational tools that have transformed biology and have the potential to revolutionize drug discovery.
  77. [77]
    efficient discovery of a novel CDK20 small molecule inhibitor - NIH
    In 2020, the AlphaFold computer program predicted protein structures for the whole human genome, which has been considered a remarkable breakthrough in both AI ...Missing: simulations post-
  78. [78]
    An Explanation of Generic Behavior in an Evolving Financial Market
    The Santa Fe Artificial Stock Market [13, 4] is an agent-based artificial model in which agents continually explore and develop expectational models, ...
  79. [79]
    [PDF] DSGE Model-Based Forecasting - Federal Reserve Bank of New York
    The term DSGE model encompasses a broad class of macroeconomic models that spans the standard neoclassical growth model discussed in King, Plosser, and Rebelo ( ...Missing: A_t K_t^ L_t^{
  80. [80]
    Agent-based models move into the economic mainstream
    Oct 27, 2025 · In the 1980s and '90s, SFI played laboratory to a promising new method known as agent-based modeling. Instead of relying on averages and ...
  81. [81]
    [PDF] Building the Santa Fe Artificial Stock Market - Economics
    This paper decribes the Santa Fe Artificial Stock market from the perspective of one of the builders. Section two covers some of the early history of agent- ...
  82. [82]
    [PDF] On the Mechanics of New-Keynesian models
    The New-Keynesian model—a dynamic stochastic general equilibrium (DSGE) model with sticky prices—has become a workhorse in the analysis of monetary policy. It ...Missing: A_t K_t^ L_t^{
  83. [83]
    A contribution to the mathematical theory of epidemics - Journals
    The disease spreads from the affected to the unaffected by contact infection. Each infected person runs through the course of his sickness, and finally is ...
  84. [84]
    Mathematical epidemiology: Past, present, and future - PMC
    The basic compartmental models to describe the transmission of communicable diseases are contained in a sequence of three papers by Kermack and McKendrick, 1927 ...
  85. [85]
    [PDF] INTRODUCTION TO SUPPLY CHAIN SIMULATION
    This tutorial will detail the reasons why one would want to use simulation as the analysis methodology to evaluate supply chains, its advantages and ...
  86. [86]
    [PDF] Modeling Supply Chain Dynamics: A Multiagent Approach*
    Our aim in this paper is to provide a flexible and reusable modeling and simulation framework that enables rapid development of customized decision support ...
  87. [87]
    [PDF] Monte-Carlo Methods for Risk Management - Martin Haugh
    We focus on importance sampling and stratified sampling, both of which are variance reduction techniques that can be very useful in estimating risk measures ...Missing: seminal | Show results with:seminal<|separator|>
  88. [88]
    [PDF] Using Agent-Based Models for Analyzing Threats to Financial Stability
    Dec 21, 2012 · This paper argues that agent-based models (ABMs)—which seek to explain how the behavior of individual firms or “agents” can affect outcomes in ...
  89. [89]
    Cellular automata models for simulation and prediction of urban ...
    This review provides a comprehensive examination of the development of urban cellular automata (UCA) models, presenting a new framework to enhance individual ...
  90. [90]
    Researchers uncover AI bias against older working women
    Oct 17, 2025 · AI is perpetuating inaccurate gender and age stereotypes, influencing everything from hiring practices to workplace perceptions.<|control11|><|separator|>
  91. [91]
    The Sociodemographic Biases in Machine Learning Algorithms - NIH
    May 21, 2024 · The biases include those related to some sociodemographic characteristics such as race, ethnicity, gender, age, insurance, and socioeconomic status.
  92. [92]
    Simulation Infrastructure Management – AWS SimSpace Weaver
    AWS SimSpace Weaver makes it easier to build and run large-scale spatial simulations in the cloud—letting AWS manage the scale and complexity so you can focus ...
  93. [93]
    Verification, Validation, and Accreditation | www.dau.edu
    - Accreditation: The official certification that a model or simulation and its associated data are acceptable for a specific purpose or use. Acronym. VV&A.
  94. [94]
    [PDF] Verification, Validation and Accreditation of Simulation Models
    ABSTRACT. This paper presents guidelines for conducting verifica- tion, validation and accreditation (VV&A) of simulation models.
  95. [95]
    [PDF] Automated Multi-Agent Simulation Generation and Validation
    Dec 9, 2010 · The chi-square test is one of the most used for simulation analysis since it has very few requirements.
  96. [96]
    [PDF] DoD Instruction 5000.61, "DoD Modeling and Simulation Verification ...
    Sep 17, 2024 · Establishes policy, assigns responsibilities, and prescribes procedures for the verification, validation, and accreditation (VV&A) of models, ...Missing: 1996 | Show results with:1996
  97. [97]
    IEEE 1012-2024 - IEEE SA
    Aug 22, 2025 · The Verification and Validation (V&V) processes are used to determine whether the development products of a given activity conform to the ...
  98. [98]
    A Bayesian approach for quantification of model uncertainty
    In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a ...
  99. [99]
    [PDF] Modeling and Simulation Behavior Validation Methodology ... - DTIC
    Mar 25, 2015 · Turing Tests. Individuals who are knowledgeable about the operations of the system being modeled are asked if they can discriminate between ...Missing: analogs | Show results with:analogs
  100. [100]
    Bachelor of Science in Computer Engineering - IDP Education
    Old Dominion University is the first and only university in the nation to offer an undergraduate major in modeling and simulation engineering. The ECE ...
  101. [101]
    Modeling and Simulation Engineering Old Dominion University
    Old Dominion University is the first and only university in the nation to offer an undergraduate degree in modeling and simulation engineering. Average ...
  102. [102]
    Degree Programs - MOVES Institute - Naval Postgraduate School
    The Modeling, Virtual Environments and Simulation (MOVES) Academic Program at the Naval Postgraduate School grants Masters of Science and Ph.D. degrees.
  103. [103]
    Doctor of Philosophy - MOVES Institute - Naval Postgraduate School
    The MOVES Ph.D. program generally builds on the scientific knowledge gained from the MOVES MS program. An applicant should have a master's degree in modeling, ...
  104. [104]
    Modeling, Virtual Environments, and Simulation (MOVES) PhD
    The Modeling, Virtual Environments and Simulation (MOVES) Academic Program of the Naval Postgraduate School provides the Ph.D. student both fundamental and ...Missing: master's | Show results with:master's
  105. [105]
    Modeling, Virtual Environments, and Simulation (MOVES)
    The MS program is an eight-quarter program whose core covers the fundamentals of modeling and simulation, data analysis, visual simulation, intelligent systems ...
  106. [106]
    Modeling & Simulation Engineering (Engineering, M.S.)
    The program prepares students for careers in simulation, with thesis and course options. It has a strong core, and is designed for students with engineering,  ...
  107. [107]
    Links: M&S (Modeling and Simulation), profession, associations ...
    Mar 16, 2008 · , Rationale, Other Sources. Certification for CMSP (Certified Modeling and Simulation Professional) designation: ... SCS - M&S Magazine; SISO ...
  108. [108]
    CMSP - Certified Modeling & Simulation Professional Certification ...
    The CMSP designation recognizes professionals with work, education, knowledge, skills, and abilities in M&S.
  109. [109]
    CMSP Certification - National Training & Simulation Association
    NTSA manages the Certified Modeling & Simulation Professional (CMSP) certification program, which recognizes professionals with extensive experience and ...
  110. [110]
    [PDF] Certified Modeling and Simulation Professional
    Dec 10, 2023 · ➢ CMSP is the only comprehensive M&S professional certification in the U.S. ... ➢ Submit articles on CMSP to M&S publications (SISO, SCS, Etc.).
  111. [111]
    Modeling and Simulation | University of Central Florida - Orlando, FL
    Ranked among the top programs in the nation, UCF modeling and simulation degrees provide students with the knowledge needed to engineer a successful future.Missing: Europe Asia examples
  112. [112]
    7 Universities offering Modelling / Simulation Systems courses abroad
    7 Universities offering Modelling / Simulation Systems courses abroad · UCL (University College London) · Toronto Metropolitan University · The Hong Kong ...
  113. [113]
    Multiscale Analysis, Modelling and Simulation
    Waseda University and Darmstadt University of Technology have been running a joint program in education and research in mathematical fluid dynamics for ...
  114. [114]
    Online modeling and simulation degree program opening doors to ...
    The program is designed to give students the technical training to gain advanced skills to create, formulate, develop and prove the effectiveness of solutions.Missing: 2010 | Show results with:2010
  115. [115]
    [PDF] Report #1 The Future of Modeling and Simulation
    Sep 16, 2024 · Examples of universities offering modeling and simulation as a degree-granting academic discipline include the. University of Central Florida ...
  116. [116]
    NASA Transition Continues to Spur University, Industry Partnerships
    Through the partnership, high school students can interact with and be mentored by NASA Langely researchers on college-level STEM projects while learning about ...Missing: simulation defense skills
  117. [117]
    Welcome to MOVES - MOVES Institute - Naval Postgraduate School
    MOVES is an interdisciplinary research and academic program dedicated to education and research in all areas of defense modeling and simulation.Degree Programs · Healthcare Modeling and · News and Events · ProjectsMissing: PhD | Show results with:PhD
  118. [118]
  119. [119]
    [PDF] 2025-forum-program.pdf - AIAA
    Jan 5, 2025 · AI/ML and Autonomy Software Tools, Modelling and Simulation, and Scenarios ... more than 960 industry partners. ONR, through its commands ...
  120. [120]
    [PDF] Modeling and Simulation: Body of Knowledge - VMASC
    Realistic Roadmap for Developing a Modeling and Simulation. Body of Knowledge Index. Proceedings of SISO (Simulation. Interoperability Standards Organization) ...Missing: MSBK | Show results with:MSBK<|control11|><|separator|>
  121. [121]
    None
    ### Summary of Modeling and Simulation Body of Knowledge (M&S BoK) Index
  122. [122]
    Body of Knowledge for Modeling and Simulation - SpringerLink
    In stockThis seminal handbook addresses fundamental and core topics of the discipline of modeling and simulation, covering key foundations and application domains.Missing: MSBK | Show results with:MSBK
  123. [123]
  124. [124]
    Modeling and Simulation Body of Knowledge - VMASC
    This valuable new handbook provides intellectual support for all disciplines in analysis, design and optimization.Missing: MSBK | Show results with:MSBK