
Phenomenological model

A phenomenological model is a scientific construct that captures the empirical relationships among observed phenomena, focusing on macroscopic behaviors and trends derived from experimental data rather than from fundamental first principles or detailed microscopic explanations. These models prioritize descriptive accuracy for practical predictions, often incorporating adjustable parameters fitted to observations to represent essential behaviors without resolving underlying causal mechanisms. In physics, phenomenological models serve as vital intermediaries between abstract theoretical frameworks and experimental outcomes, enabling quantitative forecasts of observable effects. For instance, in particle physics, they are employed to interpret collider data by parameterizing interactions within the Standard Model, such as predicting cross-sections for particle collisions or decay rates, thereby testing theoretical predictions against real-world measurements. Similarly, in condensed matter physics and materials science, these models describe phenomena like creep deformation in alloys or gas transport in porous media, using simplified equations to replicate nonlinear and chaotic behaviors observed in experiments. Their development traces back to early 20th-century efforts, such as the Bohr model of the atom, which empirically fitted spectral lines without full quantum mechanical foundations, evolving into sophisticated tools in post-World War II high-energy physics. The strengths of phenomenological models lie in their computational efficiency and ability to handle complex, data-rich scenarios where full mechanistic derivations are infeasible or overly resource-intensive. They facilitate rapid simulation and validation in fields such as detonation physics, where models capture explosion dynamics through probabilistic or rate-based approximations.
However, limitations include their reliance on empirical fitting, which can hinder extrapolation to untested regimes, and their inability to provide deep causal insights, potentially masking fundamental inconsistencies if the underlying theory evolves. Despite these constraints, such models remain indispensable for advancing scientific understanding by grounding theoretical abstractions in tangible evidence.

Definition and Characteristics

Definition

A phenomenological model is a scientific construct that describes empirical relationships between observable macroscopic phenomena, often without detailed invocation of underlying microscopic mechanisms, though it may draw on simplified aspects of fundamental theories. This approach prioritizes capturing the behavior of complex systems through simplified mathematical representations derived directly from experimental observations, rather than deriving equations from first principles. In physics, the broader concept of phenomenology involves applying theoretical frameworks to interpret and predict experimental data, and phenomenological models contribute to this by focusing on empirical fitting and parameterization, often incorporating elements from established theories to describe effects without full causal detail. In this sense, they serve as practical approximations that emphasize descriptive accuracy over explanatory depth, distinguishing them from fully mechanistic analyses that derive behavior from fundamental laws. The basic principles of phenomenological modeling center on fitting experimental data to approximate system behavior using simplified equations, often through parameter optimization techniques that ensure the model reproduces key empirical trends. This data-driven process allows for effective representation of system behavior in scenarios where microscopic details are inaccessible or computationally prohibitive, enabling broader applicability across scientific domains.
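As a minimal sketch of this fitting step, the snippet below calibrates a two-parameter power law, y = a x^b, to synthetic observations by linear least squares in log-log space. The data and recovered parameter values are purely illustrative, not drawn from any experiment.

```python
import numpy as np

# Hypothetical measurements assumed to follow a power law y = a * x**b.
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = 2.0 * x**1.5  # synthetic "observations" for illustration

# Fit the two phenomenological parameters (log a, b) by linear least
# squares in log-log space -- a common parameterization step.
b, log_a = np.polyfit(np.log(x), np.log(y), 1)
a = np.exp(log_a)

print(f"fitted a = {a:.3f}, b = {b:.3f}")  # prints: fitted a = 2.000, b = 1.500
```

Because the model's structure (a power law) is assumed rather than derived, the fitted a and b are phenomenological parameters: they summarize the data without explaining why the exponent takes that value.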

Key Features

Phenomenological models are fundamentally empirical in nature, with their parameters determined through data-driven parameterization rather than derivation from underlying physical theory. This approach relies on observed data to establish relationships between variables, allowing the models to capture real-world behaviors without requiring a complete theoretical foundation. For instance, parameters are often fitted to experimental datasets to represent system responses accurately under specific conditions. A key distinguishing feature is their macroscopic focus, which emphasizes emergent, large-scale phenomena while deliberately ignoring microscopic details such as atomic or molecular interactions. This enables the models to describe aggregate effects, such as material properties or system-level responses, in a way that aligns with experimental observations without delving into sub-scale complexities. The resulting simplicity is another hallmark, as these models typically involve fewer parameters than their mechanistic counterparts, reducing computational demands and enhancing practicality for applications. Methodologically, phenomenological models utilize curve-fitting techniques to align their equations with empirical data, ensuring a close match to measured outcomes. They often incorporate scaling laws to extend predictions across varying scales or conditions, providing a basis for extrapolation grounded in proportional relationships derived from experiments. Validation centers on predictive accuracy, where the models are assessed by their ability to forecast responses to unseen data, confirming reliability for practical use. In structure, these models are commonly formulated as algebraic or differential equations that directly link inputs to outputs, facilitating straightforward implementation. A representative example is the stress-strain relation in solid mechanics, where equations describe the overall mechanical deformation of a material under load, parameterized from tensile tests without reference to atomic bonding.
This form allows for efficient simulation of macroscopic responses in fields like structural engineering.
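The validation step described above can be sketched with a toy linear stress-strain dataset. The modulus value and data points are assumptions chosen for the example: the model is calibrated on part of the data and then judged by its prediction on a held-out measurement.

```python
import numpy as np

# Made-up tensile data (stress in MPa, strain dimensionless), following
# a purely elastic response with an assumed modulus E = 70 GPa.
stress = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
strain = stress / 70000.0

# Calibrate Young's modulus on the first four points only ...
E_fit = 1.0 / np.polyfit(stress[:4], strain[:4], 1)[0]

# ... then validate against the held-out fifth measurement.
predicted = stress[4] / E_fit
rel_error = abs(predicted - strain[4]) / strain[4]
print(f"E_fit = {E_fit:.0f} MPa, held-out relative error = {rel_error:.2e}")
```

With real, noisy data the held-out error quantifies how far the fitted relation can be trusted, which is the core of predictive-accuracy validation.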

Historical Development

Origins in Physics

The origins of phenomenological models in physics can be traced to the early 19th century, particularly within thermodynamics, where scientists sought to describe the macroscopic behavior of heat engines through empirical relations without relying on microscopic mechanisms. A seminal example is the Carnot cycle, proposed by Sadi Carnot in 1824, which modeled the efficiency of ideal engines operating between two temperature reservoirs using reversible processes of isothermal expansion, adiabatic expansion, isothermal compression, and adiabatic compression. This approach treated heat as a fluid-like substance and derived efficiency limits based on observed performance data from steam engines, predating the statistical mechanics of Boltzmann and Gibbs by decades. A key milestone in the application of phenomenological modeling occurred in optics during the 1810s and 1820s, as researchers developed equations to capture light propagation and interference patterns empirically, without a complete underlying theory. Augustin-Jean Fresnel's equations, formulated around 1823, described the reflection and transmission coefficients at interfaces between media by fitting experimental observations of polarization and intensity, assuming light to be a transverse wave in an elastic ether. These relations successfully predicted phenomena like Brewster's angle and total internal reflection, bridging empirical data with wave theory before Maxwell's full electromagnetic unification. In the early 20th century, phenomenological models gained further prominence as precursors to quantum theory, particularly in explaining transport properties of solids through data-fitting approximations. The Drude model, introduced by Paul Drude in 1900, treated electrical conduction in metals as arising from a classical gas of free electrons scattering off ionic lattices, empirically matching resistivity measurements as a function of temperature and material properties. This semi-classical framework, while oversimplifying quantum effects, provided a foundational empirical tool for understanding metallic conduction until refined by Sommerfeld's quantum statistical approach in 1927.
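The Drude estimate sigma = n e^2 tau / m can be evaluated directly. The carrier density and relaxation time below are illustrative round numbers of the right order of magnitude for copper, not literature constants.

```python
# Drude-model estimate of DC conductivity, sigma = n * e**2 * tau / m.
# n and tau are illustrative copper-like values, not fitted constants.
n = 8.5e28        # conduction electrons per m^3
e = 1.602e-19     # elementary charge, C
tau = 2.5e-14     # relaxation time, s
m = 9.109e-31     # electron mass, kg

sigma = n * e**2 * tau / m       # conductivity, S/m
rho = 1.0 / sigma                # resistivity, Ohm*m
print(f"sigma ~= {sigma:.2e} S/m, rho ~= {rho:.2e} Ohm*m")
```

The phenomenological character lies in tau, which is fitted so the formula matches measured resistivities rather than derived from a microscopic scattering calculation.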

Evolution in Other Fields

The phenomenological modeling paradigm, initially rooted in physics, extended to chemistry during the mid-20th century, particularly in the study of reaction kinetics. In this context, equations like the Arrhenius relation, originally proposed in 1889, were applied phenomenologically to parameterize rate constants as functions of temperature, capturing empirical dependencies without incorporating the underlying quantum mechanical processes. This approach facilitated practical predictions of reaction rates in complex systems, such as catalytic and combustion processes, by focusing on observable macroscopic behaviors rather than microscopic mechanisms. From the mid-20th century onward, phenomenological models gained traction in biology and economics, adapting the method to describe emergent patterns from empirical data. In ecology, the Lotka-Volterra equations, proposed in the 1920s, exemplified this approach and saw increased application with the rise of computational tools to simulate predator-prey dynamics based solely on observed population cycles and interaction rates, without deriving parameters from first-principle ecological mechanisms. Similarly, in economics, the approach informed growth and development models during this period, such as those analyzing aggregate production and demographic trends through fitted functional forms that mirrored historical patterns, enabling forecasts of macroeconomic trajectories. The late 20th and early 21st centuries marked a surge in interdisciplinary applications, driven by the integration of phenomenological models with computational tools from the 1980s to 2000s. This evolution enabled the creation of hybrid frameworks in climate science, where simple phenomenological components—such as energy balance models—were embedded within larger simulations to fit and predict patterns like global temperature anomalies and hydrological cycles. For instance, Budyko-type models, which empirically relate absorbed solar radiation and outgoing longwave emission to surface temperature, were computationally scaled to assess long-term climate responses, bridging observational data with broader dynamical simulations.
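For illustration, the Lotka-Volterra system dx/dt = alpha*x - beta*x*y, dy/dt = delta*x*y - gamma*y can be integrated with a simple forward-Euler step. The rate constants here are arbitrary textbook-style values, not parameters fitted to observed cycles.

```python
# Minimal Lotka-Volterra predator-prey integration (forward Euler).
# All rates and initial populations are illustrative assumptions.
alpha, beta, delta, gamma = 1.0, 0.1, 0.075, 1.5
prey, pred = 10.0, 5.0
dt, steps = 0.001, 20000   # integrate to t = 20 in small steps

for _ in range(steps):
    dprey = (alpha * prey - beta * prey * pred) * dt
    dpred = (delta * prey * pred - gamma * pred) * dt
    prey += dprey
    pred += dpred

print(f"after t = {dt * steps:.0f}: prey = {prey:.2f}, predators = {pred:.2f}")
```

In phenomenological use, alpha, beta, delta, and gamma would be fitted so the simulated cycles match observed population counts, with no derivation from individual-level ecology.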

Applications

In Physics

In physics, phenomenological models play a crucial role in describing complex phenomena where microscopic details are intractable, by incorporating empirical parameters into simplified theoretical frameworks that align observed behavior with fundamental principles. These models often serve as effective theories, valid within particular energy or length scales, allowing predictions without full derivation from underlying quantum or kinetic descriptions. A prominent example is the Ginzburg-Landau theory of superconductivity, developed in 1950, which provides a phenomenological description of the superconducting phase transition near the critical temperature. This theory introduces an order parameter \psi, representing the macroscopic wave function of Cooper pairs, and formulates the free energy as a functional of \psi and the magnetic vector potential \mathbf{A}. The key equation is the Ginzburg-Landau free energy density: F = \alpha |\psi|^2 + \frac{\beta}{2} |\psi|^4 + \frac{1}{2m} |(-i\hbar \nabla - 2e\mathbf{A})\psi|^2 + \frac{B^2}{8\pi}, where \alpha and \beta are phenomenological coefficients determined from experimental data, such as specific heat measurements, enabling the model to predict properties like the penetration depth and coherence length without relying on the full microscopic Bardeen-Cooper-Schrieffer theory. Minimizing this functional yields the nonlinear differential equations governing the spatial variation of \psi and \mathbf{A}, which successfully explain phenomena like the intermediate state in type-I superconductors and vortex lattices in type-II materials. In plasma physics, magnetohydrodynamics (MHD) exemplifies a phenomenological approach by treating the plasma as a single conducting fluid, combining fluid dynamics equations with Maxwell's electromagnetism while incorporating empirical transport coefficients to account for microscopic effects like collisions and resistivity.
The ideal MHD equations assume infinite conductivity but are often extended with phenomenological terms, such as a resistive term \eta \mathbf{J} in Ohm's law, where \eta is fitted from experimental transport data rather than derived from kinetic theory. This approximation captures the collective behavior of plasmas in fusion devices and astrophysical settings, such as magnetic reconnection in solar flares, by bridging macroscopic fluid motion with electromagnetic forces without solving the full Vlasov-Maxwell system for particle distributions. In particle physics, effective field theories provide a systematic phenomenological framework for low-energy phenomena, parameterizing interactions with coefficients constrained by experimental data and symmetry principles. Chiral perturbation theory, for instance, describes hadronic interactions in quantum chromodynamics (QCD) at energies below 1 GeV, expanding the effective Lagrangian in powers of momenta and quark masses around the chiral limit where quarks are massless. The leading-order Lagrangian includes terms like \frac{f^2}{4} \langle \partial^\mu U \partial_\mu U^\dagger + \chi U^\dagger + U \chi^\dagger \rangle, with f (the pion decay constant) and other low-energy constants fitted from scattering lengths and form factors measured in pion-pion and pion-nucleon scattering. This approach reproduces QCD predictions in the non-perturbative regime, offering quantitative insights into processes like \pi^0 \to \gamma\gamma decay rates.
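For concreteness, taking the variation of the Ginzburg-Landau free energy above with respect to \psi^* (a standard textbook step, sketched here in the same notation and unit conventions as the functional) gives the first Ginzburg-Landau equation and, in the uniform field-free case, the equilibrium order parameter and coherence length:

```latex
% First Ginzburg-Landau equation (stationarity of F with respect to \psi^*):
\alpha \psi + \beta |\psi|^2 \psi
  + \frac{1}{2m} \left( -i\hbar \nabla - 2e\mathbf{A} \right)^2 \psi = 0

% Uniform, field-free case (\mathbf{A} = 0, \nabla\psi = 0, \alpha < 0):
|\psi|^2 = -\frac{\alpha}{\beta},
\qquad
\xi = \sqrt{\frac{\hbar^2}{2m|\alpha|}}
```

Here \xi is the coherence length over which \psi can vary appreciably, one of the measurable quantities that the experimentally fitted coefficients \alpha and \beta determine.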

In Engineering and Materials Science

In engineering and materials science, phenomenological models are widely employed to capture complex material behaviors through empirical relationships derived from experimental data, enabling practical simulations and design without delving into microscopic mechanisms. These models prioritize predictive accuracy for engineering applications, such as structural analysis and failure prediction, by parameterizing observed phenomena like nonlinearity and hysteresis. A prominent example is the Ramberg-Osgood equation, introduced in 1943, which describes the nonlinear stress-strain response of metals beyond the elastic limit. This model combines linear elastic behavior with a power-law term for plastic deformation, fitted directly to experimental stress-strain curves from tensile tests on materials like aluminum alloys, without relying on underlying dislocation dynamics. The equation is given by \epsilon = \frac{\sigma}{E} + \alpha \frac{\sigma}{E} \left( \frac{\sigma}{\sigma_0} \right)^{n-1}, where \epsilon is the total strain, \sigma is the stress, E is Young's modulus, \sigma_0 is a reference yield stress (commonly the 0.2% offset yield strength), \alpha is a dimensionless constant, and n is the hardening exponent. This approach has been integral to structural analysis for predicting material yielding and plastic deformation in components under monotonic loading. In fluid dynamics, phenomenological turbulence models such as the k-ε model are essential for computational fluid dynamics (CFD) simulations of engineering flows, like those in pipelines, aircraft wings, and heat exchangers. Developed in 1974, the k-ε model parameterizes turbulent eddy viscosity using two transport equations—one for turbulent kinetic energy (k) and one for its dissipation rate (ε)—calibrated against experimental data from various flow regimes, including boundary layers and jets, rather than resolving individual eddies. This semi-empirical framework approximates the Reynolds stresses via the Boussinesq hypothesis, enabling efficient predictions of mean flow characteristics and forces in industrial designs.
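Evaluating a Ramberg-Osgood relation of the common form \epsilon = \sigma/E + \alpha (\sigma/E)(\sigma/\sigma_0)^{n-1} is straightforward. The parameter values below are illustrative aluminum-like numbers, chosen so the plastic strain at \sigma_0 equals 0.2%; they are not constants fitted to a real tensile test.

```python
# Ramberg-Osgood strain evaluation with assumed aluminum-like parameters.
E = 70_000.0                   # Young's modulus, MPa
sigma_0 = 300.0                # reference yield stress, MPa
alpha = 0.002 * E / sigma_0    # chosen so plastic strain at sigma_0 is 0.2%
n = 10.0                       # hardening exponent (illustrative)

def ramberg_osgood_strain(sigma):
    """Total strain = elastic part + phenomenological plastic part."""
    elastic = sigma / E
    plastic = alpha * (sigma / E) * (sigma / sigma_0) ** (n - 1)
    return elastic + plastic

for s in (150.0, 300.0, 350.0):
    print(f"sigma = {s:5.0f} MPa -> strain = {ramberg_osgood_strain(s):.5f}")
```

Note the characteristic behavior: well below sigma_0 the plastic term is negligible, while above it the power law dominates, reproducing the smooth elastic-plastic transition seen in tensile data.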
For phase-transition materials, phenomenological hysteresis models for shape memory alloys (SMAs), such as NiTi, use empirical transformation kinetics to forecast deformation paths during martensitic transformations. A foundational 1986 model captures the thermomechanical behavior by defining transformation surfaces in stress-temperature space with empirically determined critical stresses and hysteresis loops, derived from calorimetric and mechanical tests on polycrystalline specimens. These models predict pseudoelastic recovery and the shape memory effect for applications in actuators and stents, capturing path-dependent strain without explicit tracking of phase variants.

Comparison with Other Modeling Approaches

Versus Mechanistic Models

Mechanistic models represent a bottom-up approach to modeling complex systems, deriving macroscopic behavior from fundamental physical laws and detailed descriptions of microscopic interactions. These models aim to provide causal explanations by explicitly incorporating the underlying mechanisms, such as conservation principles or inter-particle forces. A classic example is molecular dynamics simulations, which track the trajectories of atoms and molecules governed by Newton's laws and interatomic potential functions to predict material properties like viscosity or elasticity. In contrast, phenomenological models adopt a top-down perspective, focusing on macroscopic phenomena and fitting parameters directly to experimental data without resolving the full causal chain. This approach sacrifices detailed mechanistic insight for simplicity and computational efficiency, making it suitable for systems where full microscopic resolution is impractical. Mechanistic models, while offering deeper explanatory power from micro- to macro-scales, often demand extensive computational resources, precise input parameters, and comprehensive knowledge of microscopic interactions. A clear distinction appears in fluid dynamics: the Navier-Stokes equations form a mechanistic framework, derived from conservation of mass and momentum to describe fluid motion at the continuum level based on first principles. Conversely, drag coefficients in aerodynamics, such as those used in empirical formulas for the resistance of bodies moving through a fluid, represent phenomenological elements, calibrated from observed flow behaviors rather than derived from atomic-scale interactions. This allows quick approximations in engineering design but limits understanding of underlying flow structures or boundary effects.
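The phenomenological character of a drag coefficient is clearest in use: C_d enters the standard drag formula F = 0.5 * rho * v^2 * C_d * A as a measured input rather than a derived quantity. The values below (roughly a smooth sphere in air at a moderate Reynolds number) are illustrative assumptions.

```python
# Phenomenological drag estimate: C_d is calibrated from experiments,
# not derived from molecular-scale interactions. Illustrative values only.
rho = 1.225   # air density, kg/m^3
v = 10.0      # flow speed, m/s
A = 0.01      # frontal area, m^2
C_d = 0.47    # measured drag coefficient for a smooth sphere (assumed regime)

F_drag = 0.5 * rho * v**2 * C_d * A
print(f"drag force ~= {F_drag:.3f} N")  # prints: drag force ~= 0.288 N
```

A mechanistic alternative would resolve the flow field around the body via the Navier-Stokes equations; the single calibrated coefficient trades that detail for a one-line estimate valid only in the regime where C_d was measured.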

Versus Empirical Models

Empirical models represent data-interpolation techniques that establish relationships between inputs and outputs purely from observed data, without invoking underlying physical mechanisms; examples include lookup tables and black-box regressions such as neural networks trained exclusively on input-output pairs. These models prioritize predictive accuracy within the scope of available data but often treat parameters as abstract fitting coefficients lacking physical significance. Phenomenological models differ by imposing physical interpretability through parameterized equations that capture causal links between phenomena, such as power laws that describe relationships observed in natural systems. Unlike empirical approaches, which rely solely on statistical correlations, phenomenological models derive their structure from partial knowledge of the system's behavior, enabling parameters to reflect interpretable quantities like rates or exponents tied to real-world processes. This structured foundation contrasts with the data-bound nature of empirical models, which may overfit and fail to generalize beyond their training datasets. The trade-offs between these approaches highlight their complementary roles: phenomenological models provide superior extrapolation to unseen conditions within the validity of their embedded relations, as the causal structure supports predictions outside interpolated regimes. In contrast, empirical models excel at fitting high-dimensional or noisy data without imposing restrictive assumptions, leveraging abundant observations to achieve high fidelity in interpolation tasks where mechanistic details are unknown or complex.
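This extrapolation trade-off can be sketched numerically: on synthetic data following y = 2 sqrt(x), a power law with the correct assumed structure extrapolates well, while a degree-4 polynomial that fits the calibration data exactly fails far outside it. All data and values are illustrative.

```python
import numpy as np

# Synthetic "observations" following y = 2 * x**0.5 on x in [1, 5].
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * np.sqrt(x)

# Empirical stand-in: a degree-4 polynomial interpolates the 5 points exactly.
poly = np.polyfit(x, y, 4)

# Phenomenological stand-in: assume y = a * x**b and fit in log-log space.
b, log_a = np.polyfit(np.log(x), np.log(y), 1)

x_new = 20.0                       # well outside the calibration range
truth = 2.0 * np.sqrt(x_new)
poly_pred = np.polyval(poly, x_new)
pheno_pred = np.exp(log_a) * x_new**b

print(f"truth = {truth:.2f}, polynomial = {poly_pred:.2f}, "
      f"power law = {pheno_pred:.2f}")
```

Both fits are perfect on the calibration data; only the model whose structure encodes the right scaling survives extrapolation, which is the complementary-roles argument in miniature.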

Advantages and Limitations

Advantages

Phenomenological models offer significant computational efficiency due to their reduced complexity, as they abstract away detailed microscopic mechanisms in favor of macroscopic descriptions, enabling faster simulations and real-time computations in demanding applications. For instance, in control systems, these models facilitate rapid processing by requiring fewer equations and parameters compared to mechanistic counterparts, making them suitable for online optimization and control loops in engineering processes. This is particularly evident in model reductions using methods like the Manifold Boundary Approximation, which can simplify high-dimensional models—such as those in EGFR signaling pathways—from 48 parameters to as few as 4, drastically lowering simulation times while preserving key behavioral features. The ease of parameterization is another key advantage, stemming from the inherently low parameter count of phenomenological models, which minimizes the need for extensive data to fit variables and reduces issues like overfitting. With fewer identifiable parameters, often expressed as combinations of underlying microscopic ones, these models are more adaptable to new experimental datasets, supporting iterative refinement in design workflows. For example, in materials simulations, phenomenological constitutive equations allow straightforward calibration without resolving microstructural details, enhancing their utility in predictive tasks. Phenomenological models play a crucial bridging role by providing quick, practical approximations in areas where complete mechanistic theories are unavailable or computationally prohibitive, thereby accelerating progress in emerging fields. In nanotechnology, for instance, they enable effective modeling of heat transport in nano-systems through scaling relations that capture boundary effects without atomic-level simulations, aiding the thermal design of nanoscale devices. This intermediary approach balances essential physics with simplicity, as seen in process simulators where phenomenological breakage models guide equipment design and scale-up without full mechanistic resolution.

Limitations and Criticisms

Phenomenological models often lack mechanistic insight, as they describe observed phenomena through empirical relations without elucidating the underlying causal processes responsible for those phenomena. This limitation means they fail to explain why certain relationships hold, such as the ideal gas law's prediction of volume changes with temperature without revealing the molecular interactions involved. Consequently, their predictive power diminishes outside the calibrated regimes, leading to breakdowns in extreme conditions like high pressures or non-equilibrium states where unaccounted factors dominate. The heavy reliance on experimental data for parameter fitting introduces sensitivity issues, including overfitting to specific datasets and non-uniqueness of parameters, where multiple parameter sets can yield similar outputs without capturing true system behavior. In the philosophy of science, this over-dependence is criticized for undermining explanatory depth, as models prioritize descriptive accuracy over generalizable understanding, reducing their role to mere curve-fitting rather than genuine scientific explanation. Post-2000 discussions have intensified critiques regarding their validity in complex systems, where hidden variables—such as unobserved interactions or environmental influences—undermine the empirical assumptions of phenomenological approaches. For instance, in biological or social systems, these models struggle to account for emergent behaviors driven by latent factors, leading to unreliable generalizations and highlighting the need for more robust theoretical frameworks. Philosophers like Woodward (2017) and Rescorla (2018) have debated their explanatory status, arguing that while they may support counterfactual reasoning in simple cases, they falter in capturing constitutive mechanisms in multifaceted environments.
