
Normalcy bias

Normalcy bias, also known as normality bias, is a cognitive bias in which individuals underestimate the likelihood and potential consequences of a disaster or crisis, assuming that normal conditions will persist despite evidence to the contrary. This mental tendency leads people to deny or minimize threat warnings, often resulting in delayed reactions or inadequate preparation when disruptions occur. The bias originates from sociological observations of human responses to emergencies, as first detailed by Thomas E. Drabek in his inventory of findings on disaster behaviors, where he described the initial denial of warnings as a common pattern. Evolutionarily, it may arise from the brain's adaptation to focus on immediate, routine threats rather than rare, high-impact events, promoting psychological stability but hindering awareness of gradual or unprecedented risks. Related cognitive mechanisms, such as confirmation bias and resistance to change, reinforce this by interpreting anomalies as temporary or insignificant. In practice, normalcy bias has contributed to real-world incidents, such as the 1999 Paddington rail disaster in the UK, where repeated signal failures were dismissed as minor due to their familiarity, leading to a fatal collision. It also manifests in underestimating broader threats like pandemics or climate change, where people overlook accumulating evidence of harm in favor of maintaining routine behaviors. The consequences include heightened vulnerability in crises, as seen in delayed evacuations during tsunamis or fires, ultimately exacerbating damage to life and property by impeding timely adaptive actions.

Definition and Overview

Definition

Normalcy bias, also known as normality bias, is a cognitive bias that leads individuals to underestimate or disbelieve the severity of potential threats, believing that things will continue to function as they always have. This mental state causes people to minimize the likelihood or impact of negative events, such as disasters, by assuming normal conditions will persist despite mounting evidence to the contrary. The bias is estimated to affect approximately 70% of people in disaster scenarios, often resulting in delayed or absent protective responses that exacerbate harm. Unlike optimism bias, which entails a broad tendency to overestimate positive outcomes and underestimate personal risks in general, normalcy bias specifically involves a refusal to acknowledge and adapt to observable disruptions in the status quo. The term gained prominence in disaster psychology literature, emerging from empirical field observations and evacuation studies during crises. It often manifests progressively through phases beginning with initial denial, highlighting its dynamic role in crisis response.

Key Characteristics

Normalcy bias manifests through a core set of behavioral traits that prioritize continuity and familiarity in the face of potential threats. Individuals exhibiting this bias tend to interpret warnings or signals of danger as false alarms, dismissing them to preserve a sense of stability. This is often accompanied by a strong preference for adhering to familiar routines rather than engaging in adaptive or disruptive actions, as the bias reinforces the belief that disruptions will be minimal or temporary. Anomalous events, such as unusual sounds or environmental changes, are frequently minimized as isolated glitches or non-threats, further entrenching inaction. Perceptually, normalcy bias is marked by signs of internal distress masked by outward composure, commonly referred to as "frozen calm." In this state, individuals appear outwardly serene while internally struggling to process the threat, leading to delayed decision-making. This is coupled with selective attention, where normalcy cues—such as the absence of prior similar incidents—are amplified, while threat indicators are systematically ignored or rationalized away. Empirical studies in simulated emergencies demonstrate that normalcy bias significantly prolongs response times, contributing to evacuation delays as individuals underestimate the urgency of cues. For instance, research on fire evacuations shows this bias reduces perceived risk and extends the pre-evacuation phase, slowing overall response. Unlike ideological denialism, which involves conscious rejection of evidence based on preconceived beliefs, normalcy bias is an automatic cognitive process driven by a preference for continuity and underestimation of change. This distinction highlights its roots in perceptual filtering rather than deliberate opposition.
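The pre-evacuation delay effect described above can be made concrete with a toy Monte Carlo sketch in Python. The baseline figures (roughly 165 ± 71 s, as reported for office fire-drill evacuations) come from drill studies; the bias multiplier is a purely illustrative assumption, not an empirical estimate.

```python
import random

def simulate_pre_evacuation(n_trials=10_000, mean_delay=165.0, sd_delay=71.0,
                            bias_factor=1.5, seed=42):
    """Toy sketch: compare pre-evacuation delays with and without a
    normalcy-bias multiplier. The mean/sd defaults follow figures reported
    for office fire drills; bias_factor is an illustrative assumption."""
    rng = random.Random(seed)
    baseline, biased = [], []
    for _ in range(n_trials):
        delay = max(0.0, rng.gauss(mean_delay, sd_delay))  # truncate at zero
        baseline.append(delay)
        biased.append(delay * bias_factor)  # bias prolongs hesitation
    avg = lambda xs: sum(xs) / len(xs)
    return avg(baseline), avg(biased)

base_avg, biased_avg = simulate_pre_evacuation()
print(f"baseline mean delay: {base_avg:.0f} s, biased mean delay: {biased_avg:.0f} s")
```

Even a modest multiplier on each individual's hesitation shifts the whole distribution of start times, which is why simulation models treat pre-evacuation delay, not walking speed, as the dominant driver of total egress time.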

Psychological Foundations

Hypothesized Causes

One primary hypothesized cause of normalcy bias is the brain's intrinsic preference for information compatible with familiar patterns and prior experiences, driven by associative structures that prioritize coherence and ease of processing. This compatibility principle underlies biases like normalcy bias, where individuals favor the status quo to avoid the dissonance posed by novel threats. Under conditions of threat or stress, cognitive biases including normalcy bias may be amplified as elevated cortisol levels impair prefrontal function, reducing the capacity for deliberate, rational threat assessment and favoring automatic, habitual responses instead. This shift promotes reliance on intuitive processing, where the brain defaults to minimizing perceived disruptions to maintain psychological stability. From an evolutionary perspective, normalcy bias likely originated as an adaptive heuristic in ancestral environments, where assuming stability conserved energy and enhanced survival by avoiding overreactions to minor anomalies in low-risk settings. However, this mechanism becomes maladaptive in modern scenarios involving rapid or unprecedented changes, such as disasters, due to a mismatch between evolved cognitive shortcuts and contemporary complexities. Recent research integrates normalcy bias with dual-process theory, positing it as a product of overreliance on System 1 thinking—fast, intuitive, and pattern-based—particularly in contexts where gradual threats like climate change are downplayed. Studies from 2023 highlight how this bias hinders proactive environmental decision-making by favoring immediate, familiar interpretations over analytical System 2 evaluation of long-term risks.

Cognitive Processes

Normalcy bias involves a processing sequence in which initial threat detection is filtered through mental constructs of familiarity and routine, often resulting in the minimization of anomalous signals that deviate from established norms. This filtering mechanism prioritizes continuity with prior experiences, leading individuals to interpret emerging danger cues as temporary aberrations rather than indicators of genuine risk. When new information conflicts with these baseline expectations, cognitive dissonance emerges, prompting efforts to resolve the inconsistency by downplaying the threat or seeking confirmatory evidence that aligns with the status quo. Heuristics play a central role in sustaining this bias, with the availability heuristic contributing to the underweighting of rare or unprecedented events by favoring more readily recalled, commonplace scenarios over low-probability disasters. Similarly, anchoring on "normal" baselines resists belief updating, as initial perceptions of stability anchor subsequent judgments and impede the integration of disconfirming evidence about escalating dangers. These shortcuts enable rapid judgments but systematically distort threat assessment in uncertain environments. From a neurocognitive perspective, under acute stress, heightened emotional arousal can impair prefrontal function and reduce working-memory capacity, thereby limiting the thorough evaluation of threats and reinforcing biased interpretations of normalcy. This model aligns with ergonomics frameworks like the cognitive bias-incorporated SHEL (Software, Hardware, Environment, Liveware) approach, which highlights how stress-induced cognitive limitations exacerbate failures in human-system interaction during crises. Normalcy bias interacts with other cognitive tendencies, such as amplifying status quo bias by entrenching preferences for maintaining current conditions, though it remains distinct in its acute manifestation during crisis contexts where threats challenge perceptual normality.

Phases of Response

Denial Phase

The denial phase represents the initial response to emerging threats, characterized by the immediate dismissal of warnings or unusual signals as exaggerations or false alarms. Individuals often rationalize early indicators of danger by fitting them into familiar, non-threatening patterns, such as interpreting an alarm as a routine drill or attributing unusual environmental cues to benign causes. This stems from normalcy bias, where the brain prioritizes consistency with routine experiences over novel risks. According to Amanda Ripley's model outlined in her 2008 book The Unthinkable, this phase is triggered by the first anomalous cues, such as distant sirens or subtle changes in surroundings, and typically lasts from seconds to minutes—such as the 3-5 minute delays observed on lower floors during the September 11 World Trade Center evacuation—before escalating awareness forces a transition. During this brief window, the bias anchors responses in expectations of continuity, preventing proactive measures and contributing to delayed evacuations in high-stakes scenarios. Psychologically, the denial phase is rooted in pre-existing worldviews that emphasize stability and past normalcy, leading individuals to underestimate threats that contradict established beliefs. In evacuation simulations, many participants ignore initial alerts, as they cling to assumptions of safety rather than interpreting signals as genuine dangers. Recent 2025 research highlights how this phase manifests in patterns of climate denial, where public surveys reveal widespread minimization of environmental warnings due to entrenched views of continuity, mirroring broader resistance to policy changes.

Deliberation Phase

Following the denial phase, where the threat is initially dismissed, the deliberation phase emerges as individuals gradually acknowledge the danger but struggle to reconcile it with their ingrained sense of normalcy. This intermediate stage involves active but often stalled processing of the threat, marked by an internal conflict between mounting evidence of risk—such as alarms, smoke, or visible chaos—and the psychological pull to preserve routines and expectations of safety. As described in Ripley's model of disaster response, this phase arises when the brain shifts from outright rejection to tentative assessment, yet the pull toward normalcy creates cognitive dissonance, leading to a form of paralysis where action feels disruptive or unnecessary. A hallmark of this phase is the onset of stress-induced physiological symptoms that further complicate threat evaluation. Elevated heart rates trigger the sympathetic stress response, resulting in narrowed peripheral vision known as tunnel vision, which limits awareness of surroundings, and auditory exclusion, where non-critical sounds are filtered out, potentially causing individuals to miss vital cues like evacuation instructions. These responses, part of the fight-or-flight reaction, intensify the conflict by impairing rational deliberation and reinforcing fixation on familiar patterns rather than adaptive escape strategies. Research on high-stress scenarios confirms that such symptoms peak during this weighing process, diverting cognitive resources away from effective decision-making. Behavioral outcomes typically manifest as delayed mobilization, with individuals hesitating to initiate protective actions, often freezing or performing irrelevant tasks like collecting personal items. In emergency drills simulating high-stress conditions, this hesitation contributes to prolonged response times, averaging several minutes before movement begins, which can escalate risks in time-sensitive situations.
Influencing factors include social cues from peers still exhibiting denial, which reinforce collective inaction and extend the paralysis; for instance, observing others remain calm or unhurried signals that the situation may not warrant immediate change. Ripley's model further emphasizes failures of attention, where mental energy is misdirected toward maintaining normalcy—such as mentally rehearsing daily obligations—rather than reallocating to priorities like route assessment or alerting others. Modern research underscores the phase's role in transport safety, particularly in rail incidents, where normalcy bias during deliberation has been linked to significant delays in passenger response, exacerbating outcomes in collisions or derailments. A 2024 analysis of cognitive biases in transport behavior highlights how this stage contributes to inadequate preparation and slowed evacuations in rare but severe events like the 1999 Paddington rail disaster, where prior signals of risk were downplayed in favor of routine operations.

Decisive Moment

The decisive moment, as termed in Ripley's survival framework, constitutes the culminating phase of the response to a disaster, where individuals must execute rapid, resolute action to evade peril after navigating denial and deliberation. This stage arrives as the buildup from prior indecisiveness reaches its peak, demanding an override of lingering normalcy bias to initiate escape or protective measures. Failure to act decisively here often results in severe outcomes, including death or injury, as the window for effective intervention closes abruptly in high-stakes scenarios. In acute threats such as fires or attacks, this moment typically arrives within a brief, time-sensitive window—often described as seconds to minutes—after which hesitation hardens into entrapment, rendering prior opportunities for survival untenable. Overcoming the bias at this juncture frequently involves physiological triggers like an adrenaline surge, which can propel flight responses, or external stimuli such as direct verbal commands from others, prompting immediate compliance. Ripley's analysis underscores the necessity of instinct override, where trained mental scripts or rehearsed drills disrupt the freeze response, enabling proactive engagement rather than paralysis. Studies indicate that persistence of normalcy bias through this phase correlates with high non-survival rates, leading to delayed or absent evacuation in simulations and real events. This vulnerability highlights the phase's lethality, as those unable to break free remain exposed while threats escalate. Recent applications extend the concept beyond personal crises to organizational crisis management, where 2025 analyses frame the decisive moment as a pivotal juncture in responding to systemic disruptions, such as overlooked early indicators of market volatility or operational failures. In these contexts, leaders who fail to act decisively risk cascading institutional failures, mirroring individual dynamics on a broader scale.

Impacts and Manifestations

Individual Effects

Normalcy bias manifests in short-term behavioral consequences during crises, particularly through delayed recognition and response to threats, which heightens individual injury risk. Individuals affected by this bias often misinterpret danger cues as non-emergent, leading to prolonged hesitation before evacuating hazardous environments such as burning buildings or flood zones. For instance, in fire evacuation studies, normalcy bias contributes to extended pre-evacuation times by causing people to dismiss cues like smoke or alarms, thereby increasing exposure to danger and the likelihood of physical harm or fatalities. This delay is exacerbated in unfamiliar scenarios, where the bias reinforces an assumption of safety, overriding instinctive protective actions. Post-event, normalcy bias can intensify psychological distress for survivors, including feelings of guilt stemming from perceived inaction during the crisis. Those who initially underestimated the threat may later grapple with the emotional fallout of delayed decisions, amplifying distress as they confront the gap between their normalcy assumptions and the actual harm incurred. This guilt often arises in the aftermath of disasters, prolonging exposure to the relational and emotional costs of survival. In the long term, normalcy bias is associated with elevated risks of anxiety disorders among disaster survivors, as the bias's role in inadequate threat assessment contributes to more severe post-traumatic responses. It also erodes self-efficacy, diminishing individuals' confidence in managing future risks by fostering a pattern of underpreparedness and lowered motivation for protective behaviors. Health correlations reveal that disaster survivors exhibit higher incidences of stress-related illnesses, with up to 33% meeting criteria for post-disaster psychiatric diagnoses such as PTSD, potentially worsened by biases that delay protective action and heighten vulnerability. Survivors may experience recurring guilt over missed actions, perpetuating emotional strain.

Societal Consequences

Normalcy bias at the societal level manifests in collective failures during emergencies, where widespread underestimation of threats leads to mass non-compliance with evacuation orders, thereby amplifying casualties and disrupting response efforts. In evacuation scenarios, this bias prolongs the pre-evacuation phase as individuals interpret alarms or cues as routine occurrences rather than indicators of imminent danger, reducing overall compliance and extending exposure to hazards. Such delays have been observed to contribute to higher-than-necessary death tolls in disasters, as populations fail to act promptly despite clear warnings. The economic repercussions of normalcy bias are substantial, as delayed societal responses to foreseeable risks exacerbate losses from hazards such as pandemics and climate-related incidents. By underestimating the likelihood and severity of disruptions, communities and governments often postpone investments in resilience, resulting in escalated recovery costs and lost productivity. For instance, in the context of climate change, this bias fosters inaction on mitigation measures, leading to amplified financial burdens from intensified weather extremes, with estimates indicating potential global economic losses in the trillions of dollars due to inaction on climate risks. Institutionally, normalcy bias undermines emergency planning and fosters policy inertia, as organizations and policymakers prioritize maintaining the status quo over proactive reforms in areas such as pandemic preparedness and disaster management. This reluctance to deviate from established norms hampers the development of robust contingency frameworks, perpetuating vulnerabilities across sectors such as public health and infrastructure. When aggregated from individual tendencies, these institutional shortcomings result in systemic delays that compromise national resilience. Culturally, normalcy bias is reinforced through media portrayals that emphasize continuity and "business as usual," embedding expectations of stability across generations and diminishing collective awareness of emerging threats.
Such narratives normalize underestimation of rare but high-impact events, sustaining a societal preference for short-term comfort over anticipatory preparedness.

Examples in Practice

Historical Cases

One of the earliest documented manifestations of normalcy bias occurred during the sinking of the RMS Titanic on April 15, 1912. Despite multiple iceberg warnings and the ship's collision with an iceberg at 11:40 p.m., many passengers and crew initially dismissed the danger, clinging to the widespread belief that the vessel was "unsinkable" due to its advanced design and watertight compartments. This delayed the full loading of lifeboats, with only about half the capacity utilized initially, as people continued normal activities like dressing formally or retrieving belongings. The bias progressed through phases of denial and deliberation, contributing to the loss of over 1,500 lives out of 2,224 aboard, far exceeding what might have occurred with prompt evacuation. The Japanese attack on Pearl Harbor on December 7, 1941, provides another historical illustration, particularly in the military's response to early warnings. Radar operators at Opana Point detected the incoming fleet at 7:02 a.m. but attributed the large blips to a scheduled flight of U.S. B-17 bombers arriving from the mainland, reflecting a normalcy bias that prioritized expected routines over anomalous threats. This underestimation, combined with broader disbelief in an imminent assault despite intelligence hints, delayed alert notifications to commanders, allowing the surprise strike to inflict severe damage on the U.S. Pacific Fleet, including the sinking or crippling of eight battleships and the deaths of 2,403 Americans. The event exemplifies how normalcy bias can hinder rapid response in high-stakes scenarios. These cases demonstrate the progression through response phases—denial, deliberation, and eventual decisive action—exacerbated by normalcy bias, as analyzed in disaster psychology studies.

Contemporary Instances

During Hurricane Katrina in 2005, normalcy bias contributed to widespread failure to evacuate New Orleans, with approximately 20% of residents remaining despite warnings about levee vulnerabilities, as individuals downplayed the risk based on prior storm experiences. Survivor accounts highlighted underestimation of the storm's severity, reflecting individual effects where people clung to routines amid escalating threats. During the COVID-19 pandemic from 2020 to 2023, normalcy bias manifested in initial public dismissal of mask mandates and lockdowns as unnecessary overreactions, with many underestimating transmission risks by comparing the virus to routine illnesses. This bias led to delayed adoption of protective measures, exacerbating early spread in communities reluctant to disrupt daily life. The 2023 Maui wildfires exemplified normalcy bias through delayed evacuations in Lahaina, where residents underestimated fire spread in familiar coastal environments, assuming containment based on initial reports. Blocked routes and communication failures compounded this, but personal hesitation rooted in expecting "business as usual" prolonged exposure to danger. Recent 2024-2025 climate events, including intensified floods and heatwaves, have shown normalcy bias in public responses, with affected individuals normalizing extreme weather as "typical" despite increasing frequency linked to climate change. For instance, during the July 2025 Texas flash floods, which caused over 100 deaths, residents downplayed the risks by viewing such events as something that "happens all the time," despite their unprecedented intensity. Sustainability reports note this bias hinders adaptive behaviors in vulnerable areas. Similarly, studies on transport safety attribute delays in adopting safety protocols to normalcy bias, as seen in historical rail incidents like the 1999 Paddington crash, where operators overlooked severe risks such as derailments amid routine operations.

Mitigation Strategies

Personal Techniques

Individuals can engage in awareness training to recognize and counteract normalcy bias through daily mental exercises, such as scenario rehearsal, where one imagines disruptions to routine and develops plans for potential threats. This approach, drawn from debiasing strategies, encourages broadening future thinking by considering multiple outcome estimates or conducting premortems to anticipate failures. Mobile applications designed for bias awareness, like those providing checklists for common mental shortcuts, facilitate regular self-reflection to identify patterns of underestimating risks. Behavioral hacks offer practical ways to pre-commit to action in high-stress situations, such as creating detailed evacuation drills and rehearsing them periodically to build automatic responses. Mindfulness practices, including brief daily meditation, help reduce stress-induced perceptual narrowing by enhancing present-moment awareness and interrupting automatic dismissal of threats. These techniques target different response phases, from initial denial to deliberation, by fostering proactive habits. Self-assessment tools, such as questionnaires evaluating susceptibility to cognitive biases in decision-making, enable individuals to gauge their normalcy bias levels through structured prompts. For instance, the Assessment of Biases in Cognition (ABC) inventory uses multiple-choice items to measure cognitive and behavioral tendencies toward general cognitive biases. Studies on debiasing indicate that training interventions can improve decision-making speed and accuracy.
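As an illustration of how such a self-assessment questionnaire might be scored, the sketch below uses hypothetical prompts and cutoffs; it is not the published ABC inventory, and both the items and the thresholds are invented for demonstration.

```python
# Hypothetical self-assessment sketch: the items and scoring bands below
# are illustrative inventions, not the published ABC inventory.
PROMPTS = [
    "I assume alarms are drills or malfunctions unless proven otherwise.",
    "I rarely rehearse evacuation routes for places I visit often.",
    "I expect tomorrow to look like today, even after warning signs.",
    "I delay protective steps until others around me start acting.",
]

def score_responses(ratings):
    """Average 1-5 agreement ratings into a coarse susceptibility band."""
    if len(ratings) != len(PROMPTS) or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("expected one 1-5 rating per prompt")
    mean = sum(ratings) / len(ratings)
    if mean >= 4.0:
        band = "high"
    elif mean >= 2.5:
        band = "moderate"
    else:
        band = "low"
    return mean, band

print(score_responses([4, 5, 3, 4]))  # -> (4.0, 'high')
```

The point of such a tool is not the exact cutoffs but the structured prompt: forcing explicit ratings of one's own continuity assumptions is itself a debiasing exercise.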

Systemic Interventions

Systemic interventions to mitigate normalcy bias involve institutional, organizational, and governmental measures designed to counteract collective underestimation of risks by embedding structured processes, training, and policy reforms into operational frameworks. These approaches aim to foster proactive risk management in areas like disaster preparedness, cybersecurity, and public health, where normalcy bias can lead to inadequate planning and response. By prioritizing evidence-based strategies, organizations reduce the tendency to assume continuity of normal conditions during threats. One key strategy is the implementation of scenario-planning and simulation exercises, which challenge assumptions of normalcy by requiring participants to envision and prepare for low-probability, high-impact events. For instance, broad forecasting techniques—considering low, medium, and high outcome estimates—help organizations avoid overreliance on baseline expectations, as outlined in debiasing frameworks that address errors in uncertain environments. Red teaming, a structured adversarial exercise, further counters normalcy bias by assigning teams to critique plans and expose blind spots, promoting dissent and alternative perspectives in military and business contexts. Premortem exercises, where teams prospectively identify potential failures, have been shown to enhance foresight in organizational settings, mitigating underestimation of disruptions. Mandatory training programs represent another systemic tool, particularly in high-stakes sectors like aviation and cybersecurity, where repeated drills and simulations build institutional resilience against complacency. In disaster management, governments can integrate bias-awareness modules into emergency protocols to address normalcy bias at scale, ensuring that response teams and policymakers routinely evaluate unconventional threats. For example, following the 2021 Winter Storm Uri in Texas, an audit revealed that underestimation of severe weather risks—rooted in plans assuming only minor disruptions—led to critical shortages in staffing, equipment, and facilities.
In response, recommendations included prioritizing funding for resilience hubs, tracking implementation of past corrective actions, and enhancing cross-departmental coordination to institutionalize adaptive planning. Structured tools, such as bias-mitigation checklists, provide organizations with standardized processes to identify and counter underestimation biases during risk assessments. One validated risk-management checklist incorporates reference-class comparison—measuring projects against historical analogs—and diverse input methods to reduce optimism and planning fallacies akin to normalcy bias, with experts rating it moderately to highly effective for improving risk identification. Policy reforms at the governmental level, including equitable communication plans and mandates for vulnerability assessments, further embed these interventions, ensuring that systemic biases do not exacerbate societal vulnerabilities during crises.
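The broad forecasting technique described above (considering low, medium, and high outcome estimates) can be sketched with a standard three-point PERT calculation. The 4x weighting of the most-likely case is a well-established convention; applying it to debias continuity assumptions is an illustrative choice here, not a method drawn from the cited audits.

```python
def pert_estimate(low, likely, high):
    """Three-point (PERT) expected value and spread.

    Weighting the most-likely case 4x is the standard PERT convention;
    the debiasing idea is that the low/high bounds force planners to
    look past the 'normal' baseline they would otherwise anchor on.
    """
    expected = (low + 4 * likely + high) / 6
    spread = (high - low) / 6  # conventional PERT standard deviation
    return expected, spread

# Planning for an outage duration in hours: anchoring on the 'likely'
# value alone (6 h) hides the tail; the broad forecast surfaces it.
exp, sd = pert_estimate(low=2, likely=6, high=72)
print(f"expected: {exp:.1f} h, sigma: {sd:.1f} h")
```

Here the expected duration (about 16 h) is nearly triple the most-likely estimate, which is exactly the gap a normalcy-anchored plan would miss.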

References

  1. [1]
    Cognitive bias and how to improve sustainable decision making - NIH
    Normalcy bias: the tendency to underestimate both the likelihood of a disaster and its possible consequences, and to believe that things will always function ...
  2. [2]
    Safety Requires a State of Mindfulness - PMC - NIH
    A normalcy bias causes us to assume that, although a catastrophic event has happened to others, it will not happen to me. If it does, we are shocked and unable ...
  3. [3]
    Recognised cognitive biases: How far do they explain transport ...
    Normalcy Bias is the refusal to plan for a disaster which has never or very rarely happened before (Drabek, 1986). In the Paddington rail disaster in ...
  4. [4]
    Influence of Cognitive Biases in Distorting Decision Making and ...
    Normalcy biases represent our propensity to regard minor abnormalities as normal. By this phenomenon, we try to prevent ourselves from reacting excessively to ...
  5. [5]
    Normalcy Bias - The Decision Lab
    The normalcy bias describes our tendency to underestimate the possibility of disaster and believe that life will continue as normal.
  6. [6]
    What Is Normalcy Bias? | Definition & Example
    Mar 24, 2023 · Normalcy bias denotes our tendency to minimize or ignore threat warnings and to believe that nothing can seriously disrupt our everyday life.What is normalcy bias? · What causes normalcy bias? · Normalcy bias example
  7. [7]
    Explaining a collective false alarm: Context and cognition in the ...
    Aug 22, 2024 · The analysis suggests that crowd behaviour in false alarms has more in common with the meaningful behaviour typically found in real emergencies.2 Methods · 3 Analysis · 4 Discussion<|separator|>
  8. [8]
  9. [9]
    Structural equation modeling of negative emotion and walking ...
    Okabe and Mikami (1982) defined the normalcy bias as the tendency to interpret cues as indicative of everyday events and underestimate the likelihood and ...
  10. [10]
    Risk perception in fire evacuation behavior revisited - PubMed Central
    Normalcy bias reduces perceived risk and refers to a tendency to attribute ... Response Phase Behaviours and Response Time Predictors of the 9/11 World Trade ...
  11. [11]
  12. [12]
  13. [13]
    A Neural Network Framework for Cognitive Bias - PubMed Central
    ... normalcy bias, illusion of truth, and the 'not invented here' bias all have ... Mapping mental function to brain structure: how can cognitive neuroimaging succeed ...
  14. [14]
    Proposal of cognitive bias (CB)-incorporated SHEL model to prevent ...
    Mar 5, 2025 · We proposed a cognitive bias (CB)-incorporated SHEL model for the prevention of crashes or disasters from the viewpoints of inadequate interactions.Missing: normalcy | Show results with:normalcy
  15. [15]
    Risk perception in fire evacuation behavior revisited: definitions ...
    Jan 8, 2015 · The use of heuristics may explain another type of bias known as normalcy bias, which refers to the tendency to interpret cues as indicative for ...Missing: alerts | Show results with:alerts
  16. [16]
    Three Stages of Disaster Response - Avoid. Deny. Defend.
    Denial, Deliberation, The Decisive Moment. ​In her book on disaster survival, Amanda Ripley ... Ripley attributes this to normalcy bias. That is, our brains tend ...
  17. [17]
    (PDF) A Critical Review Of Emergency Evacuation Simulation Models
    ... normalcy bias, in which people misunderstand the signs of dangers produced by the hazards and. developing disasters and interpret them as normal features of ...
  18. [18]
    (PDF) Climate Change Denial and Cognitive Biases - ResearchGate
    Mar 7, 2025 · This research examines the psychological and social determinants of climate denial, ie, cognitive bias, political identity, and misinformation.Missing: normalcy | Show results with:normalcy
  19. [19]
    How We Really Respond In A Crisis - Forbes
    Sep 2, 2008 · ... Amanda Ripley, a Time magazine reporter who covers homeland security ... In the deliberation phase, as people's bodies begin responding ...
  20. [20]
    Stress-Activity Mapping: Physiological Responses During General ...
    Within these scenarios, stress reactivity can result in perceptual distortions (e.g., tunnel vision, auditory exclusion) as well as increased performance errors ...
  21. [21]
    Tunnel vision and chronic stress: How to manage your physiological ...
    Oct 13, 2017 · Auditory exclusion is a stress response associated with tunnel vision and involves temporarily not hearing nearby noises or voices. To avoid ...
  22. [22]
    [PDF] Overall and local movement speeds during fire drill evacuations in ...
    Feb 4, 2012 · Values reported for office occupancies average 165 ± 71 s (uncer- tainty is expressed as standard deviation). The cue received had a significant ...
  23. [23]
    Excerpt: 'The Unthinkable' - NPR
    Jul 22, 2008 · The three chronological phases—denial, deliberation, and the decisive moment—make up the structure of this book. Real life doesn't usually ...
  24. [24]
    Most people freeze in a crisis. Here's why — and how to stop it - Big ...
    May 15, 2025 · So there's something called a normalcy bias, which you see in all ... cortisol, you start to deteriorate. You lose eye hand control ...
  25. [25]
    [PDF] who survives disasters and why - RIE Toronto
    PHASES OF RESPONSE TO A DISASTER. – Decisive moment. • We've accepted that we are in danger; we've deliberated over our options. Now we take action. • Panic ...
  26. [26]
    The frozen calm of normalcy bias - Gizmodo
    May 2, 2013 · Rounding out the theories about normalcy bias is the idea that people need information in order to act. If people don't know how to deal with a ...
  27. [27]
    Normalcy Bias, Complacency, and the POP-DOC Lens:
    Oct 12, 2025 · It means warning signs are dismissed as false alarms until they are not. ... In doing so, it dismantles the false peace of normalcy bias ...
  28. [28]
    Cognitive Biases Within Decision Making During Fire Evacuations
    In the literature, this common pattern of responses is often attributed to a normalcy bias ... false alarm: Context and cognition in the Oxford Street ...
  29. [29]
    Guidance for the Model User on Representing Human Behavior in ...
    The building is an open plan office so occupants are likely to see and hear others evacuating – including fire wardens. Therefore, normalcy bias and optimistic ...
  30. [30]
    The continuity principle: A unified approach to disaster and trauma
    ... normalcy bias” which results in underestimating the probability or extent of expected disruption. This article clarifies these biases and details the ...
  31. [31]
    Examining a Comprehensive Model of Disaster-Related ...
    Sep 12, 2012 · One third (33%) of the survivors met criteria for a postdisaster diagnosis. PTSD was the most prevalent postdisaster disorder (20%), followed in ...
  32. [32]
    Why You Spot Trouble Yet Do Nothing | Psychology Today
    Mar 31, 2025 · Normalcy bias (assuming things will continue as they always have). ... Inaction, by contrast, offers short-term relief but long-term regret.
  33. [33]
    Understanding managers' motivation in adopting protective measures
    Furthermore, normalcy bias lowers deliberative risk perception and indirectly affects protection motivation, whereas, short-termism increases perceived resource ...
  34. [34]
    Cognitive bias and how to improve sustainable decision making
    In the present study, we try to explain how systematic tendencies or distortions in human judgment and decision-making, known as “cognitive biases,” contribute ...
  35. [35]
    [PDF] Creative Project Management
    evidence to predict the Pearl Harbor attack, but picking salient information ... The first consequence of the normalcy bias is failing to prepare adequately ...
  36. [36]
    Revisiting the concept of normalcy bias. - APA PsycNet
    The concept of normalcy bias occupies a central position in research on disaster psychology. This paper reexamined the theoretical validity of this concept ...
  37. [37]
    [PDF] Hurricane Katrina And The Perception Of Risk - ucf stars
    • Sources of bad news tend to be more credible than sources of good news. • Distrust, once initiated, tends to reinforce and perpetuate distrust. He asserts ...
  38. [38]
    [PDF] Not My Fault: Having that evacuation conversation again
    Sep 5, 2021 · The Japanese call this the normalcy bias – we assume that nothing unusual is happening. A study by Japanese scientists after the 2011 earthquake ...
  39. [39]
    Back to normal: Why we must accept it won't happen - CNN
    Sep 30, 2020 · Those who refuse to wear masks may be guilty of normalcy bias, Davenport said, since they perceive this intrusion into lives as a passing fad ...
  40. [40]
    [PDF] Polarized Reactions Towards COVID-19: A Behavioral Analysis
    The normalcy bias leads people to minimize the threats and their warnings, as they underestimate the probability of a disaster happening. As much as they want ...
  41. [41]
    Why weren't Maui residents warned about the fire sooner? - BBC
    Aug 14, 2023 · As the scale of the fire's destruction in Maui becomes clear, questions are mounting over whether officials warned residents fast enough.
  42. [42]
    Preliminary After-Action Report: 2023 Maui Wildfire
    Feb 8, 2024 · The preliminary report makes 32 recommendations to improve Maui's police response to future natural disaster response efforts.
  43. [43]
    The US faces more frequent extreme weather events, but attitudes ...
    Jul 9, 2025 · Climate change is making extreme weather events more frequent and intense, according to climate scientists and government data.
  45. [45]
    Normalcy Bias The Big Lie Do You Want To Fail? - NW Survival LLC
    Sep 21, 2024 · Normalcy Bias refers to the tendency for people to believe that things will continue as they always have, which can lead to underestimating risks or ignoring ...
  46. [46]
    Three Ways Mindfulness Can Make You Less Biased
    May 15, 2017 · Cognitive biases may be partly to blame for prejudice, and research suggests that mindfulness can help us correct them.
  47. [47]
    [PDF] The Assessment of Biases in Cognition - MITRE Corporation
    The RD scales consist primarily of multiple- choice items and are intended to assess declarative knowledge of the biases.
  48. [48]
    Improved Decision Making With a Single Training Intervention
    Aug 6, 2025 · The results suggest that a single training intervention can improve decision making. We suggest its use alongside improved incentives, information presentation ...
  49. [49]
    A User's Guide to Debiasing - Wiley Online Library
    Dec 18, 2015 · This chapter provides a guide to these strategies. It begins with a brief discussion of the sources of bias in decision making.
  50. [50]
    [PDF] THE RED TEAM HANDBOOK - Army.mil
    SWOT helps to holistically reduce personal and cultural biases. The Method. SWOT is a framework that adds value by essentially forcing the Red Team to think ...
  51. [51]
    Summary: Normalcy Bias and Disaster Preparedness in Austin During Winter Storm Uri
  52. [52]