Pilot error

Pilot error refers to an action, decision, or failure to act by a pilot that causes or substantially contributes to an aviation accident or incident, often involving a chain of events influenced by multiple factors. It is the most common cause of accidents in the United States, with pilot-related factors contributing to approximately 70 percent of accidents as of 2022, according to data from the Aircraft Owners and Pilots Association (AOPA) and the National Transportation Safety Board (NTSB). This prevalence is especially pronounced in general aviation, where pilot-related factors directly contribute to around 70 percent of accidents in recent years, compared to about 25 percent in commercial air carrier operations as of the early 2000s, a gap attributed to enhanced training and operational oversight. Common manifestations of pilot error include loss of aircraft control, controlled flight into terrain (CFIT), and inadvertent flight into instrument meteorological conditions (IMC) under visual flight rules (VFR). These errors often stem from inadequate decision-making, such as misjudging weather conditions or exceeding aircraft performance limits, as well as lapses in preflight planning. Underlying contributors frequently involve human factors like fatigue, stress, distraction, and complacency, which can impair judgment and situational awareness during critical phases of flight.

Efforts to mitigate pilot error emphasize aeronautical decision-making (ADM) training, which promotes hazard identification and risk mitigation to break the "error chain" before it leads to an accident. Regulatory bodies like the Federal Aviation Administration (FAA) integrate human factors education into pilot certification, while advanced cockpit technologies, such as terrain awareness and warning systems (TAWS), provide real-time alerts to prevent errors. Despite these measures, ongoing research highlights the need for continued focus on crew resource management (CRM) in multi-pilot environments to address latent organizational influences on individual performance.

Description

Definition and Characteristics

Pilot error is defined as any action, inaction, or decision by a pilot that leads to or substantially contributes to an accident or incident, encompassing human factors such as misjudgment, procedural deviations, or failures in situational awareness. According to the National Transportation Safety Board (NTSB), this includes any pilot performance identified as the probable cause or a contributing factor in accident investigations, distinguishing it from purely mechanical or environmental issues where pilot involvement is not primary. The International Civil Aviation Organization (ICAO) aligns with this through Annex 13, which frames accident investigations to focus on prevention rather than blame, requiring pilot actions to be a direct causal element separate from external failures like equipment malfunctions or weather conditions.

Key characteristics of pilot error involve classifications rooted in human factors models, emphasizing it as a systemic issue rather than individual fault. These include skill-based errors, such as slips (unintended actions, like incorrect switch activation due to inattention) and lapses (failures in memory or attention, like omitting a step during routine tasks); mistakes, which are planning errors divided into rule-based (misapplying a known procedure to the wrong context) and knowledge-based (flawed decision-making from inadequate information); and violations, deliberate departures from established procedures or regulations. This taxonomy, drawn from James Reason's seminal framework, highlights that pilot errors often arise from interactions between cognitive processes and operational demands, not inherent incompetence, and underscores the need to view them within broader safety systems.

The term "pilot error" emerged in investigations following World War II, as analyses shifted from wartime mechanical assessments to scrutinizing human performance in peacetime operations, with early reports attributing nearly half of accidents to pilot actions. It was formalized in the 1950s through ICAO standards under Annex 13 (first adopted in 1951), which standardized global protocols for identifying human contributions to accidents while promoting non-punitive inquiries to enhance safety. This evolution marked a transition from simplistic blame to recognizing pilot error as an inevitable human factor amenable to mitigation through training and design improvements.
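Because the taxonomy above is essentially a small decision structure, a short code sketch can make the category boundaries concrete. The following Python fragment is illustrative only: the boolean flags and classification logic are simplified assumptions, not an investigative method from the NTSB, ICAO, or Reason's own work.

```python
from enum import Enum

class PilotError(Enum):
    SLIP = "skill-based: action not executed as intended (e.g., wrong switch)"
    LAPSE = "skill-based: memory/attention failure (e.g., omitted step)"
    RULE_MISTAKE = "planning: known procedure applied to the wrong context"
    KNOWLEDGE_MISTAKE = "planning: flawed decision from inadequate information"
    VIOLATION = "deliberate departure from procedure or regulation"

def classify(deliberate: bool, planning_stage: bool,
             rule_existed: bool, omission: bool) -> PilotError:
    # Simplified boundaries for illustration; real classification relies on
    # structured evidence and interviews, not four boolean flags.
    if deliberate:
        return PilotError.VIOLATION
    if not planning_stage:  # execution-stage (skill-based) errors
        return PilotError.LAPSE if omission else PilotError.SLIP
    return PilotError.RULE_MISTAKE if rule_existed else PilotError.KNOWLEDGE_MISTAKE

# Example: forgetting a checklist step is an execution-stage omission.
print(classify(deliberate=False, planning_stage=False,
               rule_existed=False, omission=True))  # PilotError.LAPSE
```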

Prevalence and Statistics

Pilot error remains the leading cause of aviation accidents. According to the International Civil Aviation Organization (ICAO), human error more broadly contributes to 70-80% of accidents across all sectors, a range echoed by FAA and other analyses. In general aviation, the pilot-related share is nearly 70%, highlighting the heightened vulnerability of non-commercial operations, where pilots often fly without the support structures of scheduled airlines. Historical data from Boeing's Statistical Summary of Commercial Jet Airplane Accidents (1959-2024) indicates a long-term decline in overall accident rates, with a 40% reduction in total accidents and a 65% drop in fatal accidents over the past two decades, partly due to advancements mitigating pilot-related risks that previously dominated at around 80% in the 1970s. In 2024, the Aviation Safety Network recorded 217 aviation occurrences resulting in 404 fatalities, reflecting a slight uptick in incidents amid recovering global traffic volumes post-pandemic. The ICAO Safety Report 2025 documented 95 accidents worldwide in 2024, with 296 fatalities, an increase from 66 accidents and 72 fatalities in 2023. Accidents involving pilot error are disproportionately concentrated in certain flight phases: Boeing's analysis of 2015-2024 fatal accidents places approximately 50% during takeoff, initial climb, and landing (13% in takeoff and initial climb, 37% in final approach and landing). Globally, ICAO's 2025 Safety Report notes significant regional variations in accident rates, with higher occurrences in developing areas like parts of the Asia-Pacific due to infrastructure challenges, though exact figures vary by subregion.

Causes of Pilot Error

Threats

In the Threat and Error Management (TEM) model, threats are defined as external events or conditions that occur beyond the direct control of flight crews, which increase the complexity of flight operations and require proactive management to maintain safety margins. These threats can arise from environmental factors, such as adverse weather, challenging terrain, or bird strikes; technical issues, including equipment malfunctions or air traffic control (ATC) communication problems; and operational challenges, like high workload during peak traffic periods or schedules that induce fatigue. Threats in the TEM model are categorized as anticipated, unanticipated, or latent. Latent threats are systemic and often not immediately apparent, stemming from organizational or design flaws such as poor cockpit layouts or inadequate procedure design that can erode safety margins over time. In contrast, anticipated and unanticipated threats are immediate and observable, like unexpected turbulence or sudden ATC delays, demanding real-time responses from pilots. An emerging example of a threat in 2025 is Global Navigation Satellite System (GNSS) radio frequency interference (RFI), including jamming and spoofing, which disrupts navigation signals and poses risks to positioning, as highlighted by the International Civil Aviation Organization (ICAO). These threats heighten pilots' cognitive load by demanding additional attention and resources, thereby increasing vulnerability to subsequent errors if not addressed. Research indicates that unmanaged threats contribute to a substantial portion of aviation accidents, with human factors related to threat management implicated in approximately 70-80% of incidents overall. Historically, early aviation threats often involved unreliable instruments, such as rudimentary altimeters and compasses prone to failure in poor visibility, leading to spatial disorientation and a high accident rate—around 40 fatal incidents per million departures in the U.S. during that era. In modern contexts, cyber threats to avionics systems represent a growing concern, with potential vulnerabilities in digital flight controls and software that could be exploited to cause system disruptions or false data inputs.

Errors

Pilot errors in aviation are typically classified using human factors models such as Rasmussen's Skill-Rule-Knowledge (SRK) framework, which categorizes errors based on the level of cognitive processing involved. Skill-based errors occur during routine, automatic actions and include slips (unintended actions, such as activating the wrong switch due to inattention) and lapses (failures in memory or attention, like forgetting to complete a checklist step). Rule-based mistakes arise when pilots apply the incorrect procedure or misinterpret a situation, often in familiar scenarios requiring adherence to standard operating procedures. Knowledge-based errors happen in novel or unfamiliar situations, stemming from flawed mental models or incomplete understanding, leading to incorrect problem-solving.

Common mechanisms contributing to these errors include fatigue, which impairs attention and reaction times and accounts for up to 20% of accidents according to NASA research. Workload overload can overwhelm cognitive capacity, causing slips during high-demand phases like takeoff or approach, while poor communication between crew members often leads to rule-based mistakes through misunderstandings of instructions. Perceptual errors, such as spatial disorientation, result from sensory illusions that mislead pilots about the aircraft's attitude or position relative to the horizon, contributing to 5-10% of general aviation accidents. Line Operations Safety Audit (LOSA) data reveal that pilot errors occur on nearly every flight, with an average of two errors observed per flight, but they are successfully managed through countermeasures in over 99% of cases; the resulting undesired aircraft states (UAS), such as stalls or deviations, affect less than 1% of flights. Unmanaged errors can escalate to safety risks, but the high management rate underscores the effectiveness of crew vigilance in normal operations.

Psychological factors exacerbate error susceptibility, including cognitive biases like confirmation bias, where pilots selectively interpret information to affirm preconceived notions, such as misidentifying a navigation waypoint. Stress responses, triggered by unexpected events, can induce physiological arousal (e.g., elevated heart rate) and channelized attention, narrowing focus and increasing the likelihood of skill-based slips or knowledge-based errors. These factors highlight the need for targeted training to mitigate internal vulnerabilities in pilot performance.

Decision Making

Decision making in pilot error refers to higher-level cognitive failures where pilots exhibit flawed judgment and risk assessment, often under uncertainty, leading to strategic missteps rather than procedural lapses. This process encompasses situation assessment, evaluation of available options, and selection of a course of action in dynamic environments characterized by time pressure and incomplete information. Failures in these areas can manifest as persistent commitment to an original plan despite emerging risks, or narrow focus on a single problem at the expense of broader situational awareness.

A prominent example of such a failure is "get-there-itis," a hazardous attitude in which pilots prioritize reaching their destination over safety, often driven by schedule pressures or personal commitments, resulting in continued flight into deteriorating conditions. Similarly, fixation, or channelized attention, occurs when pilots become overly absorbed in one aspect of the flight—such as a minor system issue—neglecting critical cues elsewhere, which degrades overall performance and situational awareness. These cognitive traps are exacerbated in high-stress scenarios, where pilots may overlook alternative options or fail to adapt to changing conditions.

The Naturalistic Decision Making (NDM) framework provides insight into how pilots make these judgments, emphasizing intuitive, experience-based strategies like Recognition-Primed Decision Making (RPD), where experts rapidly assess situations using pattern recognition from past flights and mentally simulate options. While experience enables faster and more effective decisions in familiar contexts, it can also introduce biases, such as over-reliance on familiar templates that do not match the current scenario, leading to incomplete information processing. Studies indicate that poor risk assessment contributes significantly to decision errors, with pilots sometimes underestimating hazards due to optimistic assessments; for instance, FAA analyses link inadequate risk management to a substantial portion of accidents.

Contributing factors include overconfidence, where pilots overestimate their ability to handle uncertainties, leading to dismissal of warnings or incomplete evaluation of threats like deteriorating weather or mechanical issues. In multi-crew settings, groupthink can further impair judgment, as crew members suppress dissenting views to maintain harmony, resulting in unchallenged flawed decisions, as seen in investigations of fuel exhaustion incidents where adaptations to procedures went unquestioned. Incomplete information processing, often tied to confirmation bias, compounds these issues by causing pilots to selectively interpret data that supports continuation rather than deviation. Such decision errors frequently escalate to undesired aircraft states (UAS), such as deviations in altitude, speed, or flight path, that compromise safety margins, potentially leading to loss of control or terrain collision. Research shows that poor aeronautical decision-making accounts for more than half of fatal pilot-error accidents in general aviation, which constitute the majority of fatal accidents.

Threat and Error Management (TEM)

Overview of TEM

Threat and Error Management (TEM) is a proactive safety framework in aviation that emphasizes the systematic identification, mitigation, and recovery from operational risks to maintain safety margins during flight operations. Developed in the mid-1990s through collaborative efforts between the University of Texas at Austin Human Factors Research Project, led by researchers such as Robert Helmreich, with funding from the Federal Aviation Administration (FAA), and major airlines such as Continental Airlines, TEM provides a structured model for understanding how external challenges and human actions interact to affect flight safety. The core TEM model posits that pilots and crews actively detect threats and errors, respond appropriately, and recover to prevent deviations from safe flight paths, thereby transforming potential hazards into managed elements of routine operations.

The TEM framework comprises three primary components: threats, errors, and undesired aircraft states (UAS). Threats are external events or conditions beyond the crew's direct control that increase operational complexity, such as adverse weather or complex ATC instructions. Errors refer to crew-induced actions or inactions that deviate from crew intentions or standard operating procedures, potentially compromising safety margins. UAS represent intermediate states where safety margins are reduced, such as unintended altitude excursions, but which can be corrected before escalating to incidents or accidents. Success in TEM is achieved through effective error trapping and recovery, with line operations data indicating that crews successfully manage approximately 95% of encountered threats and errors to preserve safe outcomes.

As the fifth generation of Crew Resource Management (CRM) training, TEM evolved in the late 1990s and became integrated into pilot training programs worldwide during the 2000s, shifting focus from error avoidance to inevitable error management within a systems approach. By 2025, advancements in TEM include the incorporation of artificial intelligence (AI) for real-time threat prediction within flight management systems, enabling automated alerts for potential risks like system anomalies or environmental hazards. The benefits of TEM adoption are evident in reduced operational risks, with airlines implementing TEM-based programs through tools like the Line Operations Safety Audit (LOSA) reporting significant improvements in error management and overall safety performance, including up to 70% reductions in specific error-related metrics in follow-up audits. TEM's foundational assumption—that errors are inevitable but manageable—has contributed to aviation's low accident rates by prioritizing resilience over perfection.
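Because the model is layered—each component offers a chance to return to normal operations before the next stage is reached—its logic can be summarized in a few lines of code. The sketch below is a simplified illustration of that escalation chain, not an implementation from ICAO or airline TEM materials; the stage names and boolean outcomes are assumptions.

```python
def tem_outcome(threat_mitigated: bool, error_trapped: bool,
                uas_recovered: bool) -> str:
    """Walk the TEM chain: a mismanaged threat can induce an error, an
    untrapped error can produce an undesired aircraft state (UAS), and an
    unrecovered UAS can end in an incident or accident."""
    if threat_mitigated:
        return "normal operations (threat managed)"
    if error_trapped:
        return "normal operations (error trapped before consequence)"
    if uas_recovered:
        return "normal operations (UAS corrected)"
    return "incident/accident exposure (all defenses missed)"

# Example: a missed threat and an untrapped error, but a recovered UAS,
# still ends in normal operations—recovery is possible at any layer.
print(tem_outcome(threat_mitigated=False, error_trapped=False,
                  uas_recovered=True))
```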

Line Operations Safety Audit (LOSA)

The Line Operations Safety Audit (LOSA) is a voluntary, non-punitive observational program designed to collect data on flight crew performance and safety risks during normal commercial flight operations. Developed in the mid-1990s by the University of Texas at Austin's Human Factors Research Project in collaboration with the Federal Aviation Administration (FAA) and Continental Airlines, with the first LOSA conducted in 1996, LOSA employs trained observers—typically experienced pilots—who ride in the cockpit to document threats, errors, and undesired aircraft states (UAS) without influencing crew actions. The process emphasizes confidentiality and peer-to-peer observation, with audits typically spanning several months and involving hundreds to thousands of flights per cycle to ensure representative sampling across routes, times, and conditions. This approach builds on Threat and Error Management (TEM) principles by capturing real-time operational realities rather than simulated scenarios.

Key findings from LOSA audits reveal systemic vulnerabilities, such as latent organizational threats like scheduling pressures that contribute to fatigue and errors in up to 20% of observed cases, alongside environmental factors like adverse weather. Adopted by more than 100 airlines globally since its endorsement by the International Civil Aviation Organization (ICAO) in 1999, LOSA has driven targeted interventions that enhance safety margins; for instance, one early implementation reduced checklist deviations by 40% and unstabilized approaches by 62% through revised training and procedures. These outcomes underscore LOSA's role in identifying both inconsequential errors (observed in 85% of flights, with most successfully managed) and higher-risk patterns that could escalate without mitigation.

The methodology centers on standardized, anonymous data collection using LOSA forms that code observations via the TEM taxonomy, categorizing threats (e.g., external factors), errors (e.g., procedural lapses), and UAS (e.g., deviations from safe flight paths), followed by aggregate analysis to prioritize countermeasures. As of 2025, updates incorporate digital tools, including FAA-approved software for data entry, automated coding, and analysis, which streamline processing and enable quicker feedback loops compared to traditional paper-based systems. LOSA's proactive nature distinguishes it from reactive accident investigations, as it focuses on everyday operations to preempt incidents by addressing root causes early. Notable outcomes include procedure revisions, such as enhanced stabilized approach criteria at multiple operators, which have reduced approach-and-landing risks by improving compliance rates from 15% to over 50% in follow-up audits. Tens of thousands of observations worldwide as of 2025 have yielded no disciplinary actions, reinforcing trust and participation while contributing to broader industry safety enhancements.
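To make the coding-and-aggregation step concrete, the following Python sketch shows one plausible shape for a LOSA-style observation record and a management-rate summary. The field names and codes are hypothetical; actual LOSA forms and software differ.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Observation:
    flight_id: str
    category: str   # "threat", "error", or "UAS" per the TEM taxonomy
    code: str       # e.g., "adverse_weather", "checklist_deviation"
    managed: bool   # whether the crew trapped or mitigated it

def management_rates(observations: list[Observation]) -> dict[str, float]:
    """Aggregate how often each category was successfully managed—the kind
    of summary an operator would use to prioritize countermeasures."""
    totals: Counter = Counter()
    managed: Counter = Counter()
    for obs in observations:
        totals[obs.category] += 1
        managed[obs.category] += int(obs.managed)
    return {cat: managed[cat] / totals[cat] for cat in totals}

audit = [
    Observation("F100", "threat", "adverse_weather", managed=True),
    Observation("F100", "error", "checklist_deviation", managed=True),
    Observation("F101", "error", "unstabilized_approach", managed=False),
]
print(management_rates(audit))  # {'threat': 1.0, 'error': 0.5}
```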

Crew Resource Management (CRM)

Crew Resource Management (CRM) originated in the aviation industry following the 1977 Tenerife airport disaster, which highlighted critical failures in crew communication and coordination as contributing factors to the accident. In response, NASA sponsored a pivotal workshop in 1979 titled "Resource Management on the Flightdeck," which laid the groundwork for CRM as a training program to improve non-technical skills among flight crews. This initiative evolved through several generations, starting with first-generation CRM in the early 1980s, which focused on individual behaviors and psychological aspects drawn from management training. Subsequent generations expanded to emphasize teamwork, situational awareness, and error prevention, culminating in the fifth generation by the early 2000s, which integrated elements of Threat and Error Management (TEM) to address systemic risks proactively. By the 1990s, CRM became mandatory for commercial aviation operators under regulations from the Federal Aviation Administration (FAA) and the International Civil Aviation Organization (ICAO), requiring initial, recurrent, and upgrade training for all flight crew members.

The core elements of CRM training center on enhancing interpersonal and team-based skills to mitigate pilot errors, including effective communication through techniques like assertiveness training to encourage open dialogue and challenge unsafe decisions, leadership and followership to balance authority gradients in the cockpit, and workload management to prioritize tasks under pressure. These components target crew coordination failures, which contribute to approximately 70-80% of incidents involving human factors, as poor teamwork often amplifies individual errors into safety threats. CRM training is primarily delivered through simulator-based scenarios, such as Line-Oriented Flight Training (LOFT), where crews practice realistic flight operations to apply these skills in dynamic environments, fostering better decision-making and error detection. Studies, including those from the University of Texas Human Factors Research Project, have demonstrated CRM's effectiveness, with data indicating a significant decline in crew-related accidents from the 1990s onward, aligning with a broader 50% reduction in such incidents over the 1990-2020 period as CRM training became widespread.

As of 2025, advancements in CRM incorporate virtual reality (VR) simulations to replicate stress-induced decision errors more immersively, allowing crews to experience high-pressure scenarios that traditional simulators may not fully capture, thereby improving responses to fatigue and cognitive overload. These VR tools, enhanced by AI for adaptive feedback, have shown promise in reducing human error rates in controlled studies by up to 30% through targeted practice. However, challenges persist, particularly cultural barriers in diverse multinational crews, where differing norms around hierarchy and direct communication can hinder assertiveness and cohesion, necessitating tailored training modules to bridge these gaps. Data from Line Operations Safety Audits (LOSA) further validate CRM's role by observing real-world applications that correlate with lower error rates in coordinated teams.

Cockpit Task Management (CTM)

Cockpit Task Management (CTM) emerged as a critical component of cockpit human factors research in the 1990s and became integrated into Threat and Error Management (TEM) approaches during the early 2000s, emphasizing pilots' ability to handle multiple concurrent tasks under varying workloads. It encompasses the initiation, monitoring, prioritization, resource allocation, and termination of cockpit tasks, particularly during high-workload phases such as descent or approach, where pilots must queue non-urgent activities, delegate responsibilities, and avoid error-prone multitasking. By focusing on procedural task flow, CTM complements crew coordination strategies, enabling individual pilots to maintain focus on primary flight duties while addressing secondary demands like system reconfiguration.

Key techniques in CTM include the Task Management Queue (TMQ) model, which conceptualizes tasks as a dynamic queue ordered by urgency and resource availability, drawing from early human factors research on multitasking. This approach directly counters cockpit interruptions, which contribute to approximately 25% of CTM-related errors identified in analyses of National Transportation Safety Board (NTSB) accident reports from 1960 to 1989. Prioritization rules, such as "aviate, navigate, communicate," guide pilots in sequencing tasks, while delegation leverages crew roles to distribute workload without compromising oversight. In practice, CTM principles are embedded within Crew Resource Management (CRM) training programs, where pilots learn to apply them through scenario-based simulations. Practical tools include strict limits on heads-down time—typically no more than 10-15 seconds during critical phases—to minimize distraction from visual flight path monitoring, as recommended in Aviation Safety Reporting System (ASRS) guidelines. As of 2025, emerging technologies such as AI-powered cockpit assistants, like Skyryse's Skylar system, provide real-time task alerts and prioritization cues, integrating with flight management systems to automate routine queuing and reduce cognitive overload.

Studies demonstrate that effective CTM implementation enhances error detection and mitigation; for instance, simulator-based training on task prioritization has been shown to reduce task prioritization errors by up to 54% among novice pilots. A representative application involves managing flight management system (FMS) data entry during descent: by queuing non-essential programming until workload eases, pilots prevent navigation deviations, as evidenced in line operations observations where poor FMS handling contributed to 15-20% of approach-phase incidents. Overall, CTM addresses task-handling deficiencies that underlie a significant portion of pilot errors, fostering safer operations through disciplined task prioritization.
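Since the TMQ model describes tasks as a priority-ordered queue, its core idea can be illustrated with a short sketch. The Python below uses the "aviate, navigate, communicate" ordering; the priority tiers, task names, and class interface are assumptions for illustration, not a published CTM implementation.

```python
import heapq

# Lower number = higher priority, following "aviate, navigate, communicate";
# a fourth tier for systems management is an added assumption.
PRIORITY = {"aviate": 0, "navigate": 1, "communicate": 2, "manage_systems": 3}

class TaskQueue:
    """Minimal TMQ-style queue: urgent flight-control tasks preempt
    lower-tier tasks, which wait until workload permits."""
    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._seq = 0  # preserves first-in order within the same tier

    def add(self, tier: str, task: str) -> None:
        heapq.heappush(self._heap, (PRIORITY[tier], self._seq, task))
        self._seq += 1

    def next_task(self) -> str:
        return heapq.heappop(self._heap)[2]

q = TaskQueue()
q.add("communicate", "read back ATC frequency change")
q.add("navigate", "verify FMS waypoint sequence")
q.add("aviate", "correct glidepath deviation")
print(q.next_task())  # "correct glidepath deviation" — aviate outranks the rest
```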

Checklists

Checklists emerged as a fundamental safeguard in aviation following the 1935 crash of the Boeing Model 299 prototype during a U.S. Army Air Corps evaluation flight at Wright Field, where the crew's failure to release the elevator control lock before takeoff—amid the aircraft's unprecedented complexity—led to a stall and fatal accident, killing two crew members, including the Army's chief test pilot, Major Ployer P. Hill. The incident prompted Army and Boeing pilots to develop the first modern pre-flight checklist as a simple, standardized tool to ensure all critical steps were verified, transforming aviation practices worldwide by emphasizing procedural discipline over reliance on memory alone.

In their role within pilot error prevention, checklists act as cognitive aids that mitigate oversight, procedural lapses, and distractions during high-workload phases, serving as a core component of threat and error management by enabling systematic error trapping and recovery. Aviation checklists are classified into three primary types based on operational context: normal checklists for routine procedures, abnormal checklists for non-standard but non-emergency situations (such as system malfunctions), and emergency checklists for immediate threats to flight safety. Normal checklists typically employ a "do-confirm" or challenge-response format, where the pilot flying executes actions from memory guided by a cockpit flow pattern, and the pilot monitoring challenges each item aloud for verbal confirmation of completion, thereby distributing workload and enhancing cross-verification. In contrast, abnormal and emergency checklists use a "read-do" format, where items are read sequentially and performed step-by-step, often with built-in preconditions and crew agreement to ensure precise execution under stress, as these scenarios demand explicit guidance to avoid omissions. These formats are bundled in quick reference handbooks (QRH) for rapid access, with electronic versions increasingly auto-sensing completed items through color changes or sensors to reduce errors.

Design principles for checklists prioritize human factors to minimize workload and maximize reliability, with the challenge-response format allowing crews to maintain situational awareness by confirming rather than interrupting flows with constant reading. International standards from the International Civil Aviation Organization (ICAO) emphasize logical grouping by system, sequential alignment with cockpit panel layouts, and concise phrasing to facilitate quick completion, ensuring checklists support rather than hinder crews in dynamic environments. Critical items, such as flap settings or control checks, are positioned early to prioritize verification, while vague responses like "as required" are avoided in favor of specific verifications to prevent ambiguity. NASA research underscores their effectiveness as primary defenses against pilot errors and equipment issues, with proper use enabling crews to trap deviations in configuration and procedures, though lapses in monitoring can undermine this when multitasking or fatigue intervenes. Advancements by 2025 have integrated digital checklists with voice activation and automation, allowing hands-free interaction via automatic speech recognition and natural language processing to call out items, confirm responses, and even automate non-critical tasks, thereby reducing pilot workload in business and general aviation.

Despite their proven value, limitations arise from "checklist complacency," where over-familiarity leads to skipped items or rote recitation without true verification, potentially exacerbating errors in complex scenarios. Best practices to counter this include adherence to the sterile cockpit rule, which mandates a distraction-free flight deck—prohibiting non-essential conversation or activities during critical phases like taxi, takeoff, and landing, including checklist execution—to maintain focus and ensure thorough engagement.
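As a concrete illustration of the challenge-response flow described above, here is a minimal Python sketch. The checklist items, expected responses, and the confirm callback are invented for illustration; real electronic checklists are certified systems tied to aircraft sensors.

```python
BEFORE_LANDING = [  # hypothetical challenge/expected-response pairs
    ("Landing gear", "Down, three green"),
    ("Flaps", "Set for landing"),
    ("Speedbrakes", "Armed"),
]

def run_challenge_response(items, confirm):
    """Pilot monitoring reads each challenge; the response must match the
    expected state. A mismatch halts the flow rather than being skipped,
    which is the format's defense against checklist complacency."""
    for challenge, expected in items:
        response = confirm(challenge)  # crew verbal reply or sensed switch state
        if response != expected:
            return f"HOLD at '{challenge}': expected '{expected}', got '{response}'"
    return "Before-landing checklist complete"

# Example: an electronic checklist could supply confirm() from switch sensors.
states = {"Landing gear": "Down, three green",
          "Flaps": "Set for landing",
          "Speedbrakes": "Armed"}
print(run_challenge_response(BEFORE_LANDING, states.get))
```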

Notable Examples

Historical Incidents

In December 1972, Eastern Air Lines Flight 401, a Lockheed L-1011 TriStar, crashed into the Florida Everglades while the crew was troubleshooting a minor issue with the nose landing gear indicator light. The flight crew became fixated on the malfunctioning bulb, inadvertently disengaging the autopilot's altitude hold function and allowing the aircraft to descend unnoticed from 2,000 feet until it impacted the swamp, killing 99 of the 176 people on board. The National Transportation Safety Board (NTSB) determined the primary cause as the crew's failure to monitor the flight instruments due to distraction and complacency with automation, marking an early illustration of how overreliance on technology can lead to critical oversights.

The Tenerife airport disaster on March 27, 1977, remains the deadliest aviation accident in history, involving a collision between KLM Flight 4805 and Pan Am Flight 1736, two Boeing 747s, on the runway at Los Rodeos Airport amid dense fog. Miscommunication arose when the KLM captain initiated takeoff without full clearance, interpreting an ambiguous response from air traffic control as approval, while the Pan Am crew was still taxiing on the active runway; this decision error, compounded by non-standard phraseology and the flight engineer's hesitation to challenge the captain, resulted in 583 fatalities—all 248 on the KLM jet and 335 on the Pan Am aircraft. Investigators emphasized how hierarchical cockpit dynamics and unclear radio exchanges contributed to the tragedy, directly spurring the widespread adoption of crew resource management (CRM) training to improve communication and assertiveness.

A similar pattern of fixation emerged in the crash of United Airlines Flight 173 on December 28, 1978, a McDonnell Douglas DC-8-61 that ran out of fuel while circling near Portland, Oregon, due to concerns over the landing gear. The captain's preoccupation with verifying the gear's extension prevented effective monitoring of the fuel state, despite warnings from the flight engineer and first officer, leading to all four engines flaming out at 5,000 feet and the aircraft crashing 6 miles short of the runway, with 10 fatalities among the 189 on board. The NTSB report highlighted this as a failure in crew coordination and task management, serving as a precursor to Threat and Error Management (TEM) frameworks by demonstrating the need for better task prioritization and crew intervention protocols.

In 1989, Varig Flight 254, a Boeing 737-241, suffered a catastrophic navigation error during a domestic flight from Marabá to Belém, Brazil. The captain misprogrammed the navigation heading by entering 270° instead of the intended 027.0°, causing the aircraft to veer westward into the Amazon jungle rather than northward; compounded by inadequate cross-checking from the first officer and failure to verify position against radio aids or visual cues like the sun's position, the crew exhausted their fuel supply after three hours, forcing a crash landing that killed 12 of the 54 people on board. Brazil's Centro de Investigação e Prevenção de Acidentes Aeronáuticos (CENIPA) report attributed the accident to careless navigation programming and overconfidence in automated systems without sufficient verification, reinforcing the importance of rigorous training in error detection and contingency planning.

Modern Cases

In the years following the widespread adoption of Crew Resource Management (CRM) and Threat and Error Management (TEM) frameworks, several high-profile accidents have underscored ongoing challenges in pilot decision-making, automation reliance, and human factors, even as these interventions have reduced overall error rates. For instance, the crash of Korean Air Flight 801 in 1997, though predating full CRM implementation at the airline, became a pivotal case in post-CRM era analyses that demonstrated the value of tools like Line Operations Safety Audits (LOSA) in identifying and mitigating approach errors. The Boeing 747-300 struck Nimitz Hill near Guam International Airport during a non-precision approach in poor weather, killing 228 of 254 aboard due to the crew's failure to monitor altitude adequately amid fatigue and communication breakdowns. Subsequent LOSA audits at Korean Air revealed persistent threats in crew coordination, leading to enhanced training that contributed to the airline's dramatic safety improvements in the 2000s.

The 2009 crash of Air France Flight 447 highlighted gaps in high-altitude stall recovery training despite CRM advancements. An Airbus A330 en route from Rio de Janeiro to Paris stalled over the Atlantic Ocean after pitot tubes iced over, causing unreliable airspeed indications that prompted inappropriate pilot inputs, including sustained nose-up commands that deepened the stall. All 228 occupants perished, and the French Bureau of Enquiry and Analysis for Civil Aviation Safety (BEA) final report attributed the accident primarily to the crew's failure to recognize the stall and apply recovery procedures, exacerbated by inadequate simulator training for such scenarios. This incident prompted global regulatory updates to TEM protocols, emphasizing surprise and startle responses in high-workload environments.

Automation over-reliance and Cockpit Task Management (CTM) deficiencies were central to the 2013 Asiana Airlines Flight 214 accident. The Boeing 777-200ER struck a seawall short of the runway at San Francisco International Airport during a visual approach, resulting in three fatalities and injuring dozens among the 307 aboard. The National Transportation Safety Board (NTSB) determined that the flight crew mismanaged the approach by deactivating the autothrottle without configuring it properly, leading to a low-speed stall; contributing factors included the captain's lack of recent manual flight experience and the first officer's failure to monitor airspeed or call out deviations, violating standard checklist procedures. This case illustrated how TEM recovery mechanisms faltered under automation dependency, spurring airlines to refine CTM training for low-visibility landings.

As of November 2025, NTSB data records over 1,200 accidents in the United States for the year, with general aviation accounting for the vast majority. Pilot error continues to be the leading cause, contributing to approximately 70 percent of accidents, including factors such as loss of control and controlled flight into terrain. Fatigue has been identified as a contributing element in several fatal incidents, emphasizing the persistent need for aeronautical decision-making training in non-commercial operations despite advancements in CRM and TEM.
