Triage
Triage is the process of sorting patients into priority categories based on the severity of their medical conditions to allocate limited resources efficiently, particularly in emergencies where demand exceeds capacity.[1] The term derives from the French verb trier, meaning "to sort" or "to cull," with early non-medical uses in classifying commodities like wool, evolving into medical application during periods of high casualties.[2] Pioneered in military contexts during the Napoleonic Wars, triage was systematized by French surgeon Dominique Jean Larrey, who by 1812 implemented rapid assessment and categorization of wounded soldiers into three groups—those requiring immediate care, those who could wait, and those unlikely to survive—to facilitate evacuation via "flying ambulances" and reduce battlefield mortality.[2][3] This approach marked a shift from rank-based treatment to urgency-driven prioritization, laying the foundation for modern systems that emphasize utilitarian outcomes: directing interventions to patients with the highest likelihood of survival to maximize overall lives saved.[4]

In contemporary practice, triage is applied in emergency departments using validated scales like the Emergency Severity Index (ESI), in mass casualty incidents via protocols such as Simple Triage and Rapid Treatment (START), and during pandemics for ventilator allocation, where empirical data from historical implementations demonstrate improved resource utilization and survival rates despite ethical tensions over excluding lower-prognosis cases.[1][4] Controversies persist regarding criteria—such as incorporating age, frailty, or comorbidities—which can appear discriminatory but are defended on causal grounds of probable benefit, as peer-reviewed analyses underscore the moral imperative to prioritize collective utility in scarcity over equal treatment.[5][6]

Fundamentals
Etymology and Origins
The term "triage" originates from the French word triage, denoting the act of sorting or selecting items, initially applied to agricultural processes such as separating coffee beans or wool by quality.[7] This derives from the Old French verb trier, meaning "to pick" or "to cull," with roots traceable to the 18th century in non-medical contexts.[7] The medical adaptation of "triage" emerged around 1792, coinciding with its integration into French military medical manuals for prioritizing casualties amid overwhelming numbers.[8] The practice of triage in medicine traces its formal origins to late 18th-century military medicine, primarily through the innovations of French surgeon Dominique Jean Larrey (1766–1842), Napoleon's chief physician. Larrey developed a systematic classification of wounded soldiers into three categories: those needing immediate intervention to prevent death, those stable enough to delay treatment, and those with injuries deemed irrecoverable, thereby optimizing limited resources on chaotic battlefields.[2] This approach was embedded in his "flying ambulance" system, which emphasized rapid field assessment and evacuation, marking a shift from treating patients in arrival order to urgency-based prioritization.[9] Larrey's methods, refined during Napoleonic campaigns from 1792 onward, established triage as a foundational emergency response strategy, influencing subsequent military and civilian applications despite predating the term's widespread English adoption in World War I.[10][11]Core Principles and Objectives
The primary objective of triage is to maximize overall survival rates by allocating limited medical resources to patients most likely to benefit from immediate intervention, thereby achieving the greatest good for the greatest number in resource-constrained environments such as mass casualty incidents or overwhelmed emergency departments.[4] This approach prioritizes clinical potential for recovery over egalitarian or first-arrival principles, focusing on those whose treatment yields the highest probability of positive outcomes while deferring or withholding care from others to preserve resources.[10] Historical military applications added the secondary goals of conserving manpower and equipment, ensuring sustained operational capacity amid surges in demand.[10]

Core principles emphasize rapid, accurate patient categorization to enable swift decision-making, typically completed in under 60 seconds per individual during primary assessments.[1] Key elements include evaluating vital signs (e.g., respiratory rate, pulse, and perfusion), mental status, and gross injuries to assign priority levels, often via color-coded systems: red for immediate life-threatening conditions requiring intervention within minutes, yellow for delayed but serious cases amenable to treatment after stabilization, green for minor injuries needing minimal resources, and black for expectant patients with low salvageability despite care.[12] These principles demand high accuracy, since under-triage can miss salvageable cases and over-triage squanders resources, while maintaining brevity to handle high volumes without compromising equity in resource distribution.[12][13]

Triage operates as a dynamic, iterative process, incorporating reassessment as patient conditions evolve or additional resources become available, to optimize outcomes in fluid scenarios like disasters.[1] Effective implementation requires experienced personnel with the clinical judgment to oversee resource control, ensuring decisions align with population-level survival rather than individual advocacy.[4] This framework underpins both prehospital and facility-based applications, adapting to contexts such as chemical incidents where exposure risks further influence prioritization.[12]

Primary Assessment Techniques
The primary assessment in triage constitutes the initial, rapid evaluation of patients to detect and address immediate life-threatening conditions, typically completed within 60 seconds to minutes depending on the setting. This process prioritizes physiological stability over detailed history-taking, employing a structured sequence to minimize mortality by intervening in threats to vital functions. Core techniques include visual inspection for gross injuries or distress, verbal queries for responsiveness, and manual checks of key vital signs, adapted for both individual and mass casualty scenarios.[1]

A foundational method is the ABCDE approach, which systematically evaluates Airway patency, Breathing adequacy, Circulation status, Disability or neurological function, and Exposure for concealed injuries while mitigating hypothermia. Airway assessment begins by tilting the head or using a jaw thrust to ensure unobstruction, checking for foreign bodies, stridor, or cyanosis; interventions like suctioning or basic maneuvers precede advanced airway management if needed. Breathing evaluation involves observing chest rise, counting respiratory rate (normal adult range: 12-20 breaths per minute), and auscultating for bilateral air entry, with rates exceeding 30 or below 10 indicating potential immediate category assignment in field triage.[14][15]

Circulation assessment focuses on palpable pulses (e.g., radial or carotid), skin perfusion via capillary refill time (normal <2 seconds), and hemorrhage control, as uncontrolled bleeding accounts for up to 90% of preventable combat deaths per military data. Disability screening uses the AVPU scale—Alert, responds to Voice, Pain, or Unresponsive—or Glasgow Coma Scale elements for mental status, pupil reactivity, and gross motor function. Exposure requires brief undressing to inspect for injuries, balanced against environmental risks, as hypothermia exacerbates shock in up to 20% of trauma cases. These techniques, validated in emergency protocols since the 1970s, enable categorization into immediate, delayed, minimal, or expectant priorities, with inter-rater reliability improved by standardized tools like START, where respiratory rate and perfusion metrics alone triage over 70% of cases accurately in simulations.[16][17]
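The following sketch distills these checks into a single screening function. It is a minimal illustration assuming the simplified thresholds quoted above (respiratory rate outside 10-30, capillary refill over 2 seconds, AVPU below "Voice"); the `PrimaryAssessment` fields are hypothetical rather than any standard schema.

```python
from dataclasses import dataclass

@dataclass
class PrimaryAssessment:
    airway_patent: bool
    respiratory_rate: int      # breaths per minute
    capillary_refill_s: float  # seconds
    avpu: str                  # "A", "V", "P", or "U"

def flag_immediate(p: PrimaryAssessment) -> bool:
    """Return True if any ABCDE-style check suggests an immediate priority,
    using the simplified field-triage thresholds cited in the text."""
    if not p.airway_patent:
        return True                                   # Airway: obstruction
    if p.respiratory_rate > 30 or p.respiratory_rate < 10:
        return True                                   # Breathing: rate outside thresholds
    if p.capillary_refill_s > 2.0:
        return True                                   # Circulation: delayed perfusion
    if p.avpu in ("P", "U"):
        return True                                   # Disability: pain-only or unresponsive
    return False

print(flag_immediate(PrimaryAssessment(True, 34, 1.5, "A")))  # True: tachypnea
```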
Triage Methodologies
Simple Triage Protocols
Simple triage protocols involve rapid, initial categorization of patients in mass casualty incidents to prioritize those requiring immediate intervention while conserving limited resources.[1] These methods, typically completed within 30 to 60 seconds per patient, rely on basic physiologic assessments rather than detailed diagnostics.[18] One widely adopted protocol is START (Simple Triage and Rapid Treatment), originally designed for field use by emergency responders arriving at scenes with multiple victims.[19]

In START, ambulatory patients are first directed to a designated area and tagged as minimal (green), representing walking wounded with minor injuries who can delay care without significant risk.[20] For non-ambulatory individuals, triage proceeds via the RPM assessment: respiration (assess rate and effort; absent respirations after airway repositioning indicate expectant/dead—black—while a rate exceeding 30 per minute tags immediate—red), perfusion (absent radial pulse or capillary refill over 2 seconds tags immediate—red), and mental status (inability to follow simple commands tags immediate—red; patients who pass all three checks are tagged delayed—yellow).[21] Patients tagged immediate (red) exhibit life-threatening conditions amenable to rapid stabilization, such as airway compromise or shock, demanding prompt evacuation.[1] Delayed (yellow) patients have serious but non-imminent threats, allowing deferred treatment.[20] The categories are summarized in the table below, followed by a minimal code sketch of the decision sequence.

| Category | Color Tag | Criteria Summary |
|---|---|---|
| Immediate | Red | Respirations >30/min, absent radial pulse or capillary refill >2 s, or inability to follow commands.[19] |
| Delayed | Yellow | Non-ambulatory but passes all RPM checks; injuries serious but not immediately life-threatening.[20] |
| Minimal | Green | Ambulatory (walking wounded) with minor injuries.[21] |
| Expectant/Dead | Black | No respirations after airway maneuver, or unsurvivable injuries for which diverting resources would be futile.[1] |
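A minimal sketch of the START decision sequence described above, assuming the thresholds in the table; parameter names are illustrative, and the function is not a substitute for a validated protocol implementation.

```python
def start_triage(ambulatory: bool,
                 breathing_after_airway_opened: bool,
                 respiratory_rate: int,
                 radial_pulse_present: bool,
                 capillary_refill_s: float,
                 obeys_commands: bool) -> str:
    """Classify one patient per the START sequence: ambulatory check,
    then Respiration, Perfusion, and Mental status (RPM)."""
    if ambulatory:
        return "green"   # minimal: walking wounded
    if not breathing_after_airway_opened:
        return "black"   # expectant/dead: apneic after airway repositioning
    if respiratory_rate > 30:
        return "red"     # immediate: tachypnea
    if not radial_pulse_present or capillary_refill_s > 2.0:
        return "red"     # immediate: poor perfusion
    if not obeys_commands:
        return "red"     # immediate: altered mental status
    return "yellow"      # delayed: non-ambulatory but passes all RPM checks

# Non-ambulatory, breathing 24/min, radial pulse present, follows commands -> "yellow"
print(start_triage(False, True, 24, True, 1.5, True))
```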
Advanced and Secondary Triage
Secondary triage refers to the reevaluation of patients following primary triage, typically occurring after initial stabilization, transport to a treatment area, or arrival at a secondary care facility such as a hospital emergency department.[26] This process refines initial categorizations by reassessing vital signs, response to early interventions, and evolving clinical needs, allowing for upgrades or downgrades in priority to optimize resource allocation.[27] In mass casualty incidents, secondary triage is applied in staging or treatment zones, where patients previously sorted via protocols like START undergo brief reexaminations to account for changes over time, such as deterioration in delayed cases or improvement after fluid resuscitation.[28] Evidence from disaster response analyses indicates that secondary triage reduces overtriage errors, with 10-20% of patients shifting categories in prolonged events exceeding 24 hours.[29]

Advanced triage extends beyond basic vital sign checks by incorporating protocol-driven initiation of diagnostics and interventions at the point of entry, often performed by nurses or physicians in emergency departments.[30] These protocols target intermediate-acuity patients (e.g., ESI levels 3-4), authorizing actions like laboratory tests, imaging, or medications without full physician evaluation, thereby shortening time to decision-making.[31] A 2022 study of an advanced triage protocol in a U.S. emergency department, applied to patients meeting criteria such as age over 18 and specific chief complaints, demonstrated a 25% reduction in door-to-provider time for eligible cases.[32] Implementation analyses from European settings report decreased overall length of stay by 15-30 minutes per patient and higher satisfaction scores, attributed to parallel processing of assessments rather than sequential waits.[33] However, efficacy depends on staff training and protocol adherence, with underutilization risks in high-volume surges leading to persistent bottlenecks.[34]

In practice, secondary and advanced triage overlap in hospital inflows from field operations, where secondary reassessment informs advanced interventions; for instance, trauma patients triaged primarily via RPM (respiration, perfusion, mental status) metrics are secondarily evaluated for surgical readiness using tools like the Revised Trauma Score.[4] Peer-reviewed evaluations emphasize that these methods enhance throughput without compromising outcomes, as measured by mortality rates remaining under 2% in triaged cohorts versus higher rates in untriaged overloads.[35] Limitations include dependency on accurate primary inputs, with errors propagating if initial field assessments overlook subtle deteriorations such as occult hemorrhage.[36]
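Secondary triage amounts to re-running categorization over the current cohort with fresh observations. A minimal sketch, assuming a START-style categorization function such as the one shown earlier; the patient tuple layout and shift-rate bookkeeping are illustrative.

```python
def retriage_cohort(patients, categorize):
    """Re-run a categorization function over (current_category, latest_vitals)
    pairs, returning updated categories and the fraction that shifted --
    the 10-20% re-triage rate cited above for prolonged events."""
    updated, shifted = [], 0
    for current, vitals in patients:
        new = categorize(**vitals)   # e.g., start_triage from the earlier sketch
        if new != current:
            shifted += 1
        updated.append(new)
    return updated, shifted / max(len(patients), 1)
```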
Reverse and Field Triage Variants

Reverse triage, also known as reverse patient flow or internal surge capacity management, involves systematically assessing and discharging stable hospitalized patients to free up beds and resources for incoming critically ill or injured individuals during periods of hospital overcrowding or mass casualty events. This approach contrasts with conventional forward triage by focusing on existing inpatients rather than new arrivals, aiming to create capacity within 24-96 hours without compromising patient safety for those identified as low-risk for deterioration. A 2023 systematic review of 14 studies, primarily simulations and retrospective analyses, found that reverse triage protocols typically categorize patients using criteria such as vital sign stability, minimal need for invasive interventions, and low acuity scores (e.g., via modified Early Warning Scores), enabling safe discharge or transfer rates of 10-30% in modeled scenarios. Implementation has been tested in contexts like emergency department crowding and pandemics, with one European study reporting reduced boarding times for new admissions after applying nurse-led reverse triage during surges. However, real-world adoption remains limited due to challenges in predicting post-discharge outcomes and legal-ethical concerns over potential readmissions, as evidenced by simulation data showing readmission risks under 5% for screened patients but higher risks among unmodeled comorbidities.[37][38][39]

In military settings, reverse triage adapts these principles to combat environments, prioritizing the return to duty of lightly wounded personnel over treating those with poor prognoses when resources are constrained, a concept formalized in U.S. military doctrine since the early 2000s to maximize operational readiness. For instance, during prolonged engagements, protocols may deprioritize patients requiring extensive long-term care, redirecting ventilators or surgical slots to those with higher survival and functional recovery potential, based on injury severity scores and resource utility calculations. Empirical data from the Iraq and Afghanistan conflicts indicate that such variants reduced treatment delays for salvageable cases by up to 40% at forward operating bases, though they raise ethical debates over utilitarian allocation absent civilian oversight.[40]

Field triage variants encompass prehospital protocols designed for rapid assessment and transport decisions at austere or high-volume incident sites, such as trauma scenes or disasters, where emergency medical services (EMS) personnel evaluate patients to direct them to appropriate facilities like trauma centers. The U.S. National Guidelines for the Field Triage of Injured Patients, updated in 2021 by a multidisciplinary panel including the CDC and ACS, structure this into four sequential steps: physiologic derangements (e.g., Glasgow Coma Scale <14 or systolic BP <90 mmHg), anatomic injuries (e.g., flail chest or penetrating torso wounds), injury mechanisms (e.g., falls >20 feet or ejection from vehicles), and special patient considerations (e.g., age >65 or anticoagulant use), recommending transport to Level I/II trauma centers for those meeting criteria to minimize mortality.
These guidelines, derived from evidence reviews of over 100 studies, aim to balance under-triage (missing severe cases, targeted at <5-10%) against over-triage (unnecessary transfers, with 25-35% accepted for the sake of sensitivity); field data from 2010-2020 show regional variations in compliance but overall reductions in trauma mortality of 15-25% in systems adhering strictly.[41][42][43]

Variants like the Simple Triage and Rapid Treatment (START) system, developed in the 1980s for mass casualties, simplify field decisions using 30-second assessments of respiration, perfusion, and mental status to classify patients as immediate, delayed, minimal, or expectant, proven effective in events like the 2010 Haiti earthquake, where it facilitated sorting of over 10,000 victims with overtriage rates under 20%. SALT (Sort, Assess, Lifesaving Interventions, Treatment/Transport), endorsed by the American College of Emergency Physicians since 2008, incorporates dynamic rescuer safety and integrates minimal interventions pre-transport, showing in drills a 10% improvement in categorization accuracy over static methods. Regional adaptations, such as Arizona's field triage protocol emphasizing mechanism-of-injury thresholds, report transport accuracies exceeding 90% in rural settings, underscoring the causal link between timely field prioritization and outcomes like reduced hemorrhagic shock deaths. Limitations include inter-rater variability (up to 15% in physiologic assessments) and challenges in non-trauma scenarios, prompting ongoing refinements via data from national EMS registries.[44][45]
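The four-step field triage sequence lends itself to a simple cascading check. A minimal sketch under the criteria quoted above; the flag names, the respiratory-rate bounds in step 1, and the destination strings are illustrative assumptions rather than the published guideline text.

```python
def field_triage_destination(gcs: int, sbp_mmhg: int, rr: int,
                             anatomic_flags: set, mechanism_flags: set,
                             special_flags: set) -> str:
    """Cascade through the four field-triage steps; the first match decides."""
    # Step 1: physiologic criteria (the RR bounds are an assumed addition)
    if gcs < 14 or sbp_mmhg < 90 or rr < 10 or rr > 29:
        return "highest-level trauma center"
    # Step 2: anatomic criteria, e.g., {"flail_chest", "penetrating_torso"}
    if anatomic_flags:
        return "highest-level trauma center"
    # Step 3: mechanism of injury, e.g., {"fall_gt_20ft", "vehicle_ejection"}
    if mechanism_flags:
        return "trauma center (need not be highest level)"
    # Step 4: special considerations, e.g., {"age_gt_65", "anticoagulant_use"}
    if special_flags:
        return "trauma center or hospital with specific capabilities"
    return "transport per local protocol"

print(field_triage_destination(15, 120, 18, set(), {"vehicle_ejection"}, set()))
```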
Specialized Applications

In burn mass casualty incidents, triage protocols emphasize rapid estimation of total body surface area (TBSA) burned, inhalation injury risks, and concurrent trauma to predict survivability and resource needs. The Fast Triage in Burns (FTB) algorithm, introduced in 2018 for civilian events, categorizes patients as minor (treatable with limited resources), moderate (requiring specialized burn care), or major (limited survival prospects despite intensive intervention) based on simplified TBSA assessments and vital signs.[46] Dynamic, multi-phased triage is recommended, with initial sorting followed by reassessments as injuries evolve, such as fluid resuscitation needs exceeding standard formulas in disasters.[47] The American Burn Association advocates regional surge plans to distribute patients, as single facilities can become overwhelmed; for instance, events like the 2015 Coahuila gas explosion highlighted failures in coordinated transfer, leading to high mortality.[48][49]

Pediatric triage in mass casualty incidents adapts adult systems to account for children's higher metabolic rates, larger head-to-body ratios, and challenges in behavioral assessment. The JumpSTART algorithm, widely used in the United States since the early 2000s, assigns immediate (red), delayed (yellow), minimal (green), and expectant (black) categories via checks of spontaneous respirations (rates below 15 or above 45 per minute are abnormal), perfusion (capillary refill >2 seconds or absent radial pulse), and mental status (failure to localize pain).[50][51] It outperforms adult tools like START in simulations, with studies showing under-triage risks in children due to subtle signs; secondary triage refines initial assignments during resource allocation.[52] In real-world applications, such as school shootings or blasts, JumpSTART facilitates rapid sorting of 25-100 victims, emphasizing airway repositioning and non-verbal cues for infants.[53]

For chemical, biological, radiological, and nuclear (CBRN) incidents, triage integrates decontamination sequencing to mitigate secondary exposure before medical prioritization, differing from conventional trauma triage by focusing on agent-specific toxidromes and latency periods. Protocols mandate gross decontamination for all exposed casualties prior to detailed assessment, using categories like immediate (life-threatening symptoms), delayed (stable but contaminated), minimal (ambulatory), and expectant (overwhelmed resources).[54][55] In military and civilian guidelines, such as those from CHEMM, triage includes surveying for sweating, convulsions, or blast trauma alongside chemical signs, with antidotes prioritized for nerve agents over supportive care alone.[56] The U.S. Army employs distinct sorting for treatment, decontamination, and evacuation, as standard methods like SALT may delay care in contaminated environments; exercises demonstrate that CBRN triage extends processing time by 1-2 minutes per patient due to protective gear.[57][58]
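A minimal sketch of the JumpSTART flow described above, including its signature departure from adult START: an apneic child with a palpable pulse receives a trial of five rescue breaths before being declared expectant. Parameter names and the boolean simplification of the ventilation trial are illustrative.

```python
def jumpstart_triage(ambulatory: bool, breathing: bool,
                     breathes_after_airway_opened: bool,
                     palpable_pulse: bool,
                     resumes_after_rescue_breaths: bool,
                     respiratory_rate: int,
                     avpu_appropriate: bool) -> str:
    """Pediatric JumpSTART sketch; thresholds per the description above."""
    if ambulatory:
        return "green"
    if not breathing:
        if breathes_after_airway_opened:
            return "red"
        if not palpable_pulse:
            return "black"
        # Apneic but perfusing: trial of five rescue breaths (the "jumpstart")
        return "red" if resumes_after_rescue_breaths else "black"
    if respiratory_rate < 15 or respiratory_rate > 45:
        return "red"     # pediatric rate bounds differ from adult START
    if not palpable_pulse:
        return "red"
    if not avpu_appropriate:
        return "red"     # inappropriate response to pain, or unresponsive
    return "yellow"
```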
Historical Development
Pre-Modern Practices
The earliest known precursor to triage practices is documented in the Edwin Smith Papyrus, an ancient Egyptian surgical treatise dating to circa 1600 BCE, though its content derives from older traditions possibly originating around 2500 BCE. This text details 48 cases of injuries and ailments, progressing systematically from head to foot, with each case structured around title, examination, diagnosis, prognosis, and treatment recommendations. Notably, cases are implicitly prioritized through prognostic verdicts: "an ailment I will treat" for favorable outcomes, "an ailment with which I will contend" for uncertain or difficult cases, and "an ailment not to be treated" for hopeless conditions. This categorization enabled physicians to allocate limited resources—such as time, herbs, and bandages—toward patients with viable prospects, reflecting a rudimentary form of outcome-based sorting amid scarce medical capabilities.[59][60]

Such assessments likely emerged in contexts of trauma from warfare, labor accidents, or daily hazards in ancient Egyptian society, where empirical observation of wound healing and vital signs informed decisions on intervention viability. The papyrus eschews supernatural explanations for these cases, emphasizing observable symptoms like pulse, wound appearance, and neurological deficits—e.g., paralysis or speech loss in head injuries—over the magical incantations found in contemporaneous texts. This prognostic framework minimized futile efforts on irrecoverable patients, conserving communal resources for those likely to contribute to society post-recovery, a causal logic aligning with efficient care distribution under constraints.[59][61]

Evidence of comparable systematic practices in other pre-modern civilizations remains scant, with ancient Greek and Roman medicine—drawing from Hippocratic principles of prognosis and humoral balance—focusing more on individual patient management than mass sorting. Ad hoc decisions to abandon severely wounded soldiers on battlefields occurred across ancient armies, but lacked the documented, case-based methodology of the Edwin Smith Papyrus. By the medieval period, plague responses in Europe involved isolating the infectious, yet formalized triage for mixed casualties did not materialize until military innovations in the 18th century. Thus, pre-modern triage manifested primarily as prognostic triage in isolated, elite medical documentation rather than scalable protocols for overwhelming demand.[8][10]

Military Foundations in the Modern Era
The foundations of modern military triage were established by French surgeon Dominique-Jean Larrey during the Napoleonic Wars (1803–1815), in which he served as chief surgeon to Napoleon's Grande Armée. Larrey introduced a systematic prioritization of wounded soldiers based on injury severity and likelihood of survival, rather than military rank or arrival order, to maximize overall battlefield effectiveness under resource constraints. This approach addressed the chaos of mass casualties: Larrey's units treated thousands of wounded over a career spanning roughly 25 campaigns and 400 engagements.[3][62]

Larrey's triage system categorized patients into those requiring immediate intervention for limb- or life-threatening wounds, those treatable after stabilization, and those deemed unsalvageable, enabling efficient allocation of limited surgical personnel and facilities. Complementing this, he pioneered "flying ambulances"—light, horse-drawn wagons designed for swift forward evacuation from the front lines to mobile surgical units—reducing mortality from shock and hemorrhage by minimizing transport delays. These innovations, implemented as early as the 1798 Egyptian campaign and refined through conflicts like Austerlitz (1805) and Waterloo (1815), marked a shift from ad hoc medieval practices to evidence-based, casualty-focused protocols grounded in observed outcomes.[63][64]

In the mid-19th century, these principles influenced triage in subsequent conflicts. In the Crimean War (1853–1856), Russian surgeon Nikolay Pirogov formalized graded categorization under fire, treating over 10,000 wounded with a system emphasizing anatomical injury assessment. Similarly, during the American Civil War (1861–1865), Union and Confederate surgeons adopted prioritization at aid stations, processing casualties from a total of over 600,000 wounded—though without the term "triage," which awaited World War I—relying on rapid field sorting to direct evacuees to regimental hospitals or rear facilities. These applications validated Larrey's causal framework—that timely intervention on viable cases preserved combat strength—despite persistent challenges like infection rates exceeding 50% before antiseptics.[11][9]

Evolution Through 20th-Century Conflicts
In World War I, triage practices advanced significantly amid the unprecedented scale of casualties on the Western Front. Belgian surgeon Antoine Depage formalized an orderly triage system in 1914 at the Hôpital de l'Océan in De Panne, Belgium, where wounded soldiers were sorted on arrival into categories based on injury severity and treatment urgency, prioritizing those with operable wounds while implementing early surgical intervention and antiseptic protocols to combat infection.[65] This approach contrasted with earlier ad hoc methods by emphasizing rapid assessment to allocate limited surgical resources effectively; French military medicine likewise applied triage sorting, categorizing casualties into urgent, emergent, and delayed groups.[9] British and Allied forces adopted similar protocols in casualty clearing stations, where triage decisions determined immediate evacuation or field treatment, reducing mortality from shock and sepsis through systematic prioritization.[66]

World War II saw further refinements in triage amid mechanized warfare and larger-scale operations, with U.S. and Allied forces integrating it into forward echelons for battlefield sorting. Triage at aid stations focused on stabilizing patients for evacuation, categorizing them by physiological criteria such as respiratory and circulatory status to direct resources toward salvageable cases while deferring minor wounds.[8] German and Axis forces employed comparable systems, but Allied advances in plasma transfusion and penicillin influenced triage by enabling more aggressive resuscitation of borderline cases, though ethical debates arose over "expectant" categories for the mortally wounded.[67] By war's end, triage had evolved to include scene-level assessments by initial responders, laying the groundwork for mass casualty protocols beyond purely military contexts.[8]

The Korean War (1950–1953) marked a pivotal shift with the deployment of Mobile Army Surgical Hospital (MASH) units, which operated within the "golden hour" doctrine facilitated by helicopter evacuations, allowing triage to prioritize rapid transport over extensive field treatment.[68] At collecting and clearing stations, medics performed initial triage using simple vital-sign checks to classify casualties as immediate, delayed, or minimal before forwarding them to MASH units for definitive care, helping reduce battle-injury mortality to 4.5% from 8–10% in prior wars.[69] MASH triage emphasized surgical readiness, with teams sorting arrivals to optimize operating-room throughput, though overloads strained categorization accuracy.[70]

In the Vietnam War (1955–1975), triage adapted to jungle warfare and high-velocity wounds, bolstered by Dustoff helicopter medevac systems that evacuated casualties within 1–2 hours, compressing the timeline for assessments and enabling forward-area triage focused on hemorrhage control and airway management.[71] Hospital ships and base triage areas used expanded criteria, including neurological status, to prioritize among surges, with nurses and physicians directing flows in emergency receiving areas to prevent bottlenecks.[72] This era's emphasis on speed reduced preventable deaths to under 2%, but highlighted challenges like overtriage of minor cases due to rapid evacuations, prompting post-war refinements in protocols.[73]

Civilian and Disaster Applications Pre-2000
In civilian emergency departments, triage emerged as a formalized process in the mid-20th century, adapting military sorting principles to manage patient influxes where resources were strained. Prior to the 1960s, hospital emergency rooms often operated on a first-come, first-served basis, leading to inefficiencies in treating high-acuity cases amid growing urban demand. The pivotal shift occurred in 1964, when Edward R. Weinerman and colleagues published the first systematic analysis of civilian emergency department triage based on observations in New Haven, Connecticut, hospitals; they advocated for initial assessments prioritizing patients by physiological stability, vital signs, and injury severity to optimize outcomes in overcrowded settings.[10][1] This approach typically involved nurses conducting brief evaluations—focusing on airway, breathing, circulation, and mental status—to classify patients into emergent, urgent, or non-urgent categories, thereby reducing delays for those at immediate risk of deterioration.[8]

By the 1970s and 1980s, triage became a standard role in U.S. and European hospitals, with dedicated triage stations at ED entrances to stratify care and allocate limited staff and beds efficiently. Empirical data from these decades indicated that acuity-based triage lowered overtriage rates (assigning low-acuity patients unnecessary high-priority resources) to around 10-20% in busy departments, while improving survival for critical cases through faster interventions like resuscitation.[1] Systems emphasized reproducibility, with tools such as vital sign checklists ensuring consistency across shifts, though challenges persisted in subjective assessments of pain or chronic conditions. International adoption followed, as seen in early implementations in Canadian and Australian EDs, where similar protocols addressed seasonal surges in trauma from accidents and illnesses.[8]

For disaster and mass casualty applications pre-2000, civilian responders adapted battlefield triage to non-military events like industrial accidents, natural calamities, and transportation crashes, focusing on rapid field sorting when conventional care capacity was overwhelmed. Core principles involved color-coded categorization—red for immediate (life-threatening but salvageable), yellow for delayed (serious but stable), green for minimal (walking wounded), and black for expectant (unsalvageable)—to direct limited personnel toward maximizing survivors.[74] The Simple Triage and Rapid Treatment (START) protocol, introduced in 1983 by the Newport Beach Fire Department in collaboration with Hoag Hospital, California, exemplified this evolution; it enabled lay and medical personnel to triage hundreds in minutes using simple criteria: inability to follow commands, respiratory distress (>30/min or <10/min), or poor radial pulse perfusion.[75] Applied in events such as U.S. chemical plant explosions and European train derailments, START demonstrated field accuracy rates of 70-90% in retrospective analyses, though undertriage risks arose in chaotic environments with incomplete assessments.[8] These methods prioritized causal factors like treatable shock over egalitarian distribution, reflecting resource realism in scenarios where victim numbers exceeded transport and treatment availability by factors of 10 or more.[10]

Contemporary Systems and Frameworks
Key Methodological Models
The Simple Triage and Rapid Treatment (START) protocol, developed in the 1980s by the Newport Beach Fire Department, serves as a foundational model for mass casualty incidents, enabling rapid categorization of patients into four groups—immediate (red, requiring urgent intervention for survivable injuries), delayed (yellow, non-life-threatening but needing care), minimal (green, walking wounded), and expectant (black, unlikely to survive given resource constraints)—based on respiration rate (>30 or <10 breaths per minute indicates immediate), perfusion (capillary refill >2 seconds or radial pulse absent), and mental status (inability to follow commands).[19][76] This 60-second assessment prioritizes physiologic stability over detailed diagnostics, with modifications in 1996 incorporating pediatric adaptations like JumpSTART.[19] Empirical evaluations indicate START's sensitivity for identifying immediate patients ranges from 72% to 92%, though specificity for delayed categories can vary, leading to potential overtriage in low-acuity scenarios.[77]

The Sort, Assess, Lifesaving Interventions, Treatment/Transport (SALT) model, endorsed by the Centers for Disease Control and Prevention since 2008, refines START by explicitly incorporating an initial "sort" phase to separate the walking wounded, followed by lifesaving interventions (e.g., hemorrhage control) before reassessment, addressing START's limitations in dynamic environments.[78] SALT categorizes similarly but includes a formalized expectant (gray) designation for those with profound irreversible conditions, aiming to reduce misclassification; a 2017 study of simulated incidents found SALT achieved higher overall accuracy (78% vs. 71% for START) in classifying immediate and delayed patients against expert reference standards.[79][80] However, inter-rater reliability remains challenged by subjective elements like mental status, with field trials showing consistency rates of 65-85% across responders.[81]

In hospital emergency departments, the Emergency Severity Index (ESI), a five-level algorithm introduced in 1999 and revised through version 5 in 2012, stratifies patients by acuity (level 1: resuscitation needed immediately; level 5: non-urgent, minimal resources) and expected resource consumption, integrating vital signs, chief complaint, and pain scale after initial stability checks.[82][83] Validation studies report ESI's predictive validity for hospitalization at 0.78-0.85 AUC, outperforming unstructured triage in reducing undertriage (defined as missing high-acuity cases) to under 5% in U.S. EDs, though it demands trained nursing staff and may inflate level 2 assignments due to broad "high-risk" criteria.[84][85]

Military and tactical contexts employ the MARCH algorithm within Tactical Combat Casualty Care (TCCC) guidelines, updated iteratively since 1996 by the U.S. Department of Defense's Joint Trauma System, sequencing interventions as Massive hemorrhage control (e.g., tourniquets), Airway management, Respiration (chest seals for tension pneumothorax), Circulation (fluid resuscitation), and Hypothermia/Head injury prevention to address preventable battlefield deaths, roughly 90% of which stem from extremity hemorrhage and airway compromise.[86] Field data from the Iraq and Afghanistan conflicts demonstrate MARCH's causal impact in lowering hemorrhage mortality from 7-10% pre-implementation to under 2%, emphasizing immediate bleeding arrest over traditional ABC sequencing.[87]

These models collectively underscore triage's reliance on observable physiologic markers for causal prioritization, yet prospective studies highlight persistent efficacy gaps, with aggregate undertriage rates of 10-20% across systems due to responder variability and incomplete vital-sign data.[76][79]
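Of the models above, the ESI maps most directly onto code because it is a published four-decision-point cascade. A minimal sketch assuming boolean simplifications of each decision point; the input names are illustrative, and real ESI assignment relies on trained nursing judgment rather than discrete flags.

```python
def esi_level(needs_immediate_lifesaving: bool,
              high_risk_or_severe_distress: bool,
              confused_lethargic_disoriented: bool,
              expected_resources: int,
              vitals_in_danger_zone: bool) -> int:
    """Assign an ESI level (1-5) via the published decision-point cascade."""
    # Decision point A: requires immediate life-saving intervention -> level 1
    if needs_immediate_lifesaving:
        return 1
    # Decision point B: should not wait (high risk, altered, severe distress) -> level 2
    if high_risk_or_severe_distress or confused_lethargic_disoriented:
        return 2
    # Decision point C: count of expected resources (labs, imaging, meds, ...)
    if expected_resources >= 2:
        # Decision point D: danger-zone vitals up-triage a level 3 to level 2
        return 2 if vitals_in_danger_zone else 3
    return 4 if expected_resources == 1 else 5

print(esi_level(False, False, False, expected_resources=2,
                vitals_in_danger_zone=False))  # 3
```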
Regional and National Variations

In the United States, emergency department triage predominantly employs the Emergency Severity Index (ESI), a five-level system introduced in 1999 that assesses acuity based on vital signs, chief complaints, and anticipated resource utilization, with Level 1 indicating immediate life-saving interventions and Level 5 representing minimal urgency.[88] For prehospital and mass casualty scenarios, the Simple Triage and Rapid Treatment (START) protocol, developed in the 1980s by the Newport Beach Fire Department and endorsed in national field triage guidelines most recently updated in 2021, prioritizes patients using simple physiologic criteria like respiration, perfusion, and mental status within 60 seconds per individual.[89] These tools emphasize rapid categorization to optimize transport to appropriate facilities, though regional implementation varies with state-specific EMS protocols.[90]

Canada utilizes the Canadian Triage and Acuity Scale (CTAS), a five-category framework implemented nationally since 1999, which incorporates clinical discriminators, vital signs, and expected physician intervention times—such as 0 minutes for Resuscitation (Level I) and up to 240 minutes for Non-urgent (Level V)—to guide resource allocation in overcrowded emergency settings.[88] This system, validated through inter-rater reliability studies showing kappa values around 0.75-0.85, differs from U.S. models by mandating acuity-linked wait-time targets enforceable across provinces.[91]

In Europe, the Manchester Triage System (MTS), originating in the United Kingdom in 1997 and adopted in countries including the Netherlands, Portugal, and parts of Germany, relies on 52 symptom-specific flowcharts to assign one of five priority categories, discriminating on presenting complaints rather than resource needs and achieving sensitivity rates of 77-95% for high-acuity cases in validation studies.[88] Sweden, however, favors the Rapid Emergency Triage and Treatment System (RETTS), a vital-signs-driven model with algorithmic pathways that integrates early warning scores, used in over 90% of emergency departments as of 2021 surveys, though national variations persist in tool customization and nurse training requirements.[92]

Australia and New Zealand implement the Australasian Triage Scale (ATS), a five-level acuity tool in use since 1993 that defines categories by potential adverse outcomes and treatment timelines—e.g., immediate assessment for Category 1 and 10 minutes for Category 2—and incorporates pediatric and geriatric modifiers, with empirical data indicating overtriage rates of 20-30% in high-volume settings to err toward patient safety.[88] In Asia, adoption often involves adaptations of Western systems like ESI or CTAS, but validation studies across cohorts in Japan, South Korea, and Southeast Asia reveal lower predictive accuracy (e.g., AUC values of 0.70-0.80 versus 0.85+ in origin populations) due to demographic and infrastructural differences, prompting localized refinements for urban versus rural applications.[93] These national frameworks reflect causal influences like healthcare funding models, litigation risks, and disaster frequency, with less formalized triage in resource-limited regions of South Asia and sub-Saharan Africa relying on basic ABC assessments absent standardized scales.[94]

Integration of Technology and AI
Technology has augmented traditional triage processes by enabling electronic data capture, real-time vital sign monitoring, and algorithmic decision support, reducing reliance on manual assessments prone to human error. Electronic triage systems, such as mobile apps and wearable integrations, allow for rapid input of patient data including heart rate, blood pressure, and oxygen saturation, facilitating START or ESI scoring in field or emergency department (ED) settings. For instance, e-triage platforms deployed in mass casualty incidents (MCIs) provide continuous vital sign tracking via Bluetooth-enabled devices, improving prioritization over static paper tags.[95]

Artificial intelligence, particularly machine learning (ML) models, has emerged to predict patient acuity and disposition at triage, often outperforming conventional nurse assessments. ML algorithms trained on structured data like age, vital signs, and comorbidities, supplemented by natural language processing (NLP) of free-text notes, achieve triage accuracies of 75.7% compared to 59.8% for manual methods, with models like XGBoost and random forests excelling in forecasting hospital length of stay or critical illness. In EDs, these systems integrate with electronic health records to flag high-risk patients, enabling proactive resource allocation and reducing overtriage rates. Peer-reviewed validations, however, emphasize the need for multi-center testing to mitigate overfitting to specific datasets.[96][97][98]

In MCIs, AI-driven tools leverage computer vision and unmanned aerial vehicles (UAVs) for remote triage, using algorithms like OpenPose for posture analysis and YOLO for object detection to categorize casualties without direct contact, enhancing efficiency in hazardous environments. Mobile triage apps with GPS and injury pattern logging further support coordinated responses, as demonstrated in simulations where AI dashboards visualized real-time data to minimize undertriage. Despite these advances, challenges persist, including the "black-box" opacity of ML decisions, which can undermine clinician trust, and the requirement for ethical benchmarking to ensure utilitarian prioritization in resource-scarce scenarios. Ongoing developments focus on explainable AI to provide rationales for predictions, aligning with empirical validation standards.[99][100][101]
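As an illustration of the structured-data approach, the sketch below trains a random-forest classifier on synthetic vital-sign features. It assumes scikit-learn and NumPy are available; the features, labels, and the threshold rule generating them are invented stand-ins, not a validated triage model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Feature columns: age, heart rate, systolic BP, SpO2, respiratory rate
X = np.column_stack([
    rng.integers(18, 95, n),
    rng.normal(88, 20, n),
    rng.normal(125, 25, n),
    rng.normal(96, 3, n),
    rng.normal(18, 5, n),
])
# Synthetic "high acuity" label loosely tied to abnormal vitals (illustrative only)
y = ((X[:, 1] > 110) | (X[:, 2] < 95) | (X[:, 3] < 92)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, model.predict(X_te)))
```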
Applications in Crises
Military and Combat Scenarios
In military and combat scenarios, triage prioritizes casualties based on injury severity, resource availability, and operational demands, often under active threat to providers and with extended evacuation times. This process aims to allocate limited medical assets to those with the highest likelihood of survival from treatable conditions, while sustaining unit combat effectiveness. Protocols emphasize rapid, repeatable assessments to sort multiple casualties efficiently.[102]

The U.S. Department of Defense's Tactical Combat Casualty Care (TCCC) framework integrates triage into phased care: care under fire (minimal interventions while suppressing threats), tactical field care (detailed assessments away from immediate danger), and tactical evacuation care. For mass casualties, TCCC recommends a simple triage algorithm evaluating ambulatory status, respiration (rate and effort), perfusion (radial pulse presence), and mental status (ability to follow commands). Casualties unable to walk undergo further checks; those with respiratory compromise, absent radial pulse, or altered mentation receive immediate priority.[103][104]

Standard categories include immediate (life-threatening but potentially survivable injuries like massive hemorrhage or tension pneumothorax), delayed (serious wounds stable enough for deferred treatment), minimal (minor injuries requiring self-aid or buddy-aid), and expectant (injuries unlikely to benefit from available resources, such as decapitation or multiple gunshot wounds to vital areas). Expectant casualties receive palliative measures and periodic re-triage in case conditions improve. This system, validated through conflicts like Iraq and Afghanistan, supports low killed-in-action rates by focusing interventions on reversible causes of death.[104]

Military triage adapts civilian models like START for austere environments, incorporating "reverse triage" to evacuate fitter casualties first, preserving combat force multipliers. A 2025 narrative review of military literature highlighted reverse triage's role in aligning medical decisions with command intent, though empirical implementation data remain limited. NATO's AMedP-1.10 standard promotes compatible triage tools across allies, recommending simple, non-technology-dependent methods.[40][105]

Empirical studies indicate challenges, including undertriage risks in dynamic combat; one analysis of battlefield casualties using a field triage score found mortality rates rising from 0.2% for high scores to over 6% for low scores, underscoring the need for accurate initial sorting. Resource constraints and provider fatigue contribute to discrepancies, with triage decisions sometimes overridden by on-table reassessments during surgery.[106]
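A minimal sketch of the mass-casualty sorting sequence described above (ambulatory status, respiration, radial pulse, command-following, plus the expectant judgment); the `survivable_with_resources` flag stands in for a provider's resource-dependent assessment and is an illustrative simplification.

```python
def tccc_mascal_sort(can_walk: bool, breathing: bool, radial_pulse: bool,
                     follows_commands: bool,
                     survivable_with_resources: bool) -> str:
    """Sort one casualty per the TCCC-style mass-casualty checks above."""
    if can_walk:
        return "minimal"       # self-aid / buddy-aid
    if not survivable_with_resources:
        return "expectant"     # palliative measures; re-triage periodically
    if not breathing or not radial_pulse or not follows_commands:
        return "immediate"     # respiratory compromise, poor perfusion, or altered mentation
    return "delayed"           # serious but stable enough for deferred treatment
```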
Pandemics and Infectious Outbreaks

Triage during pandemics and infectious outbreaks prioritizes patients for limited critical resources such as ventilators and ICU beds when healthcare systems exceed surge capacity, aiming to maximize overall survival rather than treat all equally.[107] Protocols typically incorporate prognostic scoring systems like the Sequential Organ Failure Assessment (SOFA) to estimate likelihood of benefit from intensive interventions, with sequential reassessments to adjust allocations dynamically.[108] In contrast to trauma triage, infectious disease scenarios emphasize rapid screening for contagion risk to protect staff and facilities, often using exclusion criteria based on symptom onset and epidemiological exposure before deeper clinical evaluation.[109]

During the 2014-2016 West Africa Ebola outbreak, triage algorithms focused on swift identification of high-probability cases to isolate them and minimize nosocomial transmission, incorporating variables such as time from symptom onset to presentation, fever duration, and contact history with confirmed cases.[110] A retrospective analysis of over 24,000 suspected cases in the Democratic Republic of Congo emphasized four priority variables and 13 scoring factors to prioritize admissions to Ebola treatment centers, reducing unnecessary exposure risks.[111] These methods achieved high specificity in ruling out non-cases while directing resources to those with confirmed or probable infection, though challenges persisted in resource-poor settings with delayed presentations.[112]

The COVID-19 pandemic, beginning in late 2019, prompted widespread adoption of crisis standards of care, with protocols like Yale New Haven Health's using SOFA scores and exclusion criteria for irreversible conditions to allocate scarce ventilators, prioritizing those with the highest expected probability of survival.[113] Empirical evaluations in U.S. cohorts showed these guidelines feasible but highlighted variability; for instance, SOFA-based triage in critically ill patients yielded survival predictions aligning with historical benchmarks, though overtriage risks increased under high-volume surges.[114][115] Early triage emphasized separating suspected cases via four-level processes in emergency departments to curb transmission, with contingency plans for pre-hospital screening.[116] Studies reported undertriage rates for severe cases potentially exceeding 5% in overwhelmed systems, correlating with worse outcomes due to delayed intensive care access.[117]

In both Ebola and COVID-19 contexts, triage integrated infection control, such as personal protective equipment mandates during assessments, to mitigate secondary outbreaks among providers.[118] Prognostic tools faced scrutiny for implicit biases, including age thresholds in some guidelines that limited elderly access, though utilitarian frameworks justified them by empirical survival data showing diminished returns on resources for low-prognosis groups.[119] Post-outbreak reviews underscored the need for validated, adaptable algorithms to balance equity and efficacy, with simulations indicating that training reduces decision errors in high-stakes scenarios.[120]
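A minimal sketch of SOFA-guided allocation under crisis standards as outlined above: exclusion criteria are applied first, then remaining patients are ranked by score, with lower SOFA indicating greater predicted benefit. The tuple layout, tie handling, and single-pass allocation are illustrative assumptions, not any specific protocol; real protocols also mandate sequential reassessment, which would re-run such an allocation at set intervals.

```python
def allocate_ventilators(patients, ventilators: int):
    """Rank non-excluded patients by SOFA score (ascending) and allocate
    scarce ventilators to those with the highest predicted benefit.
    `patients` is a list of (patient_id, sofa_score, excluded) tuples."""
    eligible = [(pid, sofa) for pid, sofa, excluded in patients if not excluded]
    eligible.sort(key=lambda p: p[1])            # lowest organ failure first
    return [pid for pid, _ in eligible[:ventilators]]

cohort = [("A", 4, False), ("B", 12, False), ("C", 7, True), ("D", 9, False)]
print(allocate_ventilators(cohort, ventilators=2))  # ['A', 'D']
```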
Mass Casualty and Disaster Response

In mass casualty incidents (MCIs) and disasters, triage systems prioritize victims to allocate limited resources toward those with the highest likelihood of survival, categorizing them into immediate (red), delayed (yellow), minimal (green), and expectant/dead (black) groups to optimize overall outcomes.[76] The Simple Triage and Rapid Treatment (START) system, developed in the 1980s by the Newport Beach Fire Department and Hoag Hospital, remains widely used in the United States for initial field sorting, assessing respiration rate, radial pulse, and mental status in under 60 seconds per patient.[76] Patients able to walk are designated minimal; those with respirations over 30 per minute or an absent radial pulse are immediate, while those not breathing despite airway support are expectant.[76]

The SALT (Sort, Assess, Lifesaving Interventions, Treatment/Transport) protocol, endorsed by the Centers for Disease Control and Prevention in 2010 as a revision to START, begins with a "move to safety" directive followed by immediate lifesaving interventions like opening airways or controlling hemorrhage before categorization, aiming to reduce undertriage.[78] Simulations indicate SALT achieves higher accuracy, with undertriage rates 9% lower than START and better classification of delayed and immediate categories, though both systems exhibit sensitivity and specificity below 90% against reference standards.[24][121] In pediatric cases during disasters, JumpSTART modifies START by incorporating age-specific criteria like capillary refill and respiratory effort.[76]

Disaster response integrates triage with the Incident Command System (ICS), enabling coordinated multi-agency efforts across field, evacuation, and hospital phases, as seen in events like the 2010 Haiti earthquake, where over 200,000 deaths overwhelmed systems despite protocol application.[122] Empirical evidence from real-world MCIs remains sparse, with most validation derived from simulations showing variable accuracy—START overtriage around 20-30% and undertriage up to 40%—highlighting limitations in dynamic environments with incomplete data or secondary hazards like aftershocks.[123][121] Studies of 300 simulated Pakistani MCIs from 2010-2024 underscore that training reduces errors but does not eliminate logistical barriers such as provider fatigue or resource mismatches.[124] No triage system has demonstrated consistent superiority in large-scale disasters, prompting calls for hybrid models tailored to incident scale and etiology.[125]
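A minimal sketch of the SALT assess step described above, applied after the global sort and lifesaving interventions (LSI) such as hemorrhage control or airway opening; the parameter names and the boolean salvageability flag are illustrative simplifications of the published flow.

```python
def salt_assess(breathing_after_lsi: bool, obeys_commands: bool,
                peripheral_pulse: bool, not_in_respiratory_distress: bool,
                hemorrhage_controlled: bool, minor_injuries_only: bool,
                likely_to_survive_given_resources: bool) -> str:
    """Categorize one casualty after SALT's sort and LSI steps."""
    if not breathing_after_lsi:
        return "dead"
    if (obeys_commands and peripheral_pulse and
            not_in_respiratory_distress and hemorrhage_controlled):
        return "minimal" if minor_injuries_only else "delayed"
    # Fails one or more checks: salvageable -> immediate; otherwise expectant (gray)
    return "immediate" if likely_to_survive_given_resources else "expectant"
```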
Limitations and Empirical Challenges
Rates of Undertriage and Overtriage
Undertriage occurs when patients requiring higher-priority care are assigned to lower acuity levels, potentially delaying life-saving interventions, while overtriage assigns lower-acuity patients to higher-priority categories, straining limited resources.[121] Empirical studies across emergency departments (EDs) and trauma settings reveal variable rates, with undertriage generally less common but more consequential due to risks of increased mortality. The American College of Surgeons Committee on Trauma (ACS-COT) recommends undertriage rates below 5% and overtriage below 25-35% in trauma systems to balance safety and efficiency.[126][127]

In U.S. EDs using the Emergency Severity Index (ESI) version 4, a 2023 analysis of over 5 million encounters found overall mistriage in 32.2% of cases, comprising 3.3% undertriage and 28.9% overtriage, with undertriage linked to higher hospitalization risks for affected patients.[84] Trauma-specific evaluations show higher undertriage, such as 20.3% in a national U.S. analysis of over 140,000 patients, where undertriaged cases were associated with demographic factors like Black race or Medicaid insurance, and 24% undertriage in a Level I trauma center validation using an overtriage/undertriage matrix.[128][129] Geriatric trauma cohorts report elevated undertriage at 53.8%, contributing to delayed care and excess mortality due to atypical presentations in older adults.[130] Representative rates are summarized in the table below; a sketch of how such rates are computed from a severity-by-activation matrix follows the table.

| Context/System | Undertriage Rate | Overtriage Rate | Key Notes | Source |
|---|---|---|---|---|
| ESI v4 in U.S. EDs (2023) | 3.3% | 28.9% | Analyzed >5M encounters; undertriage raised hospitalization odds | [84] |
| National U.S. trauma (2023) | 20.3% | 22.2% | >140K patients; demographic disparities in overtriage | [128] |
| Level I trauma matrix (2017) | 24% | 45% | 2,282 high-ISS patients assessed via an overtriage/undertriage matrix; undertriaged cases received only partial activation | [129] |
| Geriatric trauma (2025) | 53.8% | Not specified | Multicenter; linked to increased mortality | [130] |
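Such rates derive from a 2x2 cross-tabulation of injury severity against activation level. A minimal sketch using the severity-normalized formulation (undertriage as 1 minus sensitivity, overtriage as 1 minus specificity); the Cribari matrix method instead normalizes by activation level, and the counts below are invented for illustration.

```python
def triage_rates(major_full: int, major_partial: int,
                 minor_full: int, minor_partial: int):
    """Compute under/overtriage from counts of severely injured (e.g., ISS > 15)
    vs. minor patients, split by full vs. partial/no trauma-team activation."""
    undertriage = major_partial / (major_full + major_partial)  # severe, missed
    overtriage = minor_full / (minor_full + minor_partial)      # minor, escalated
    return undertriage, overtriage

# 120 severe patients (90 fully activated); 400 minor (110 fully activated)
u, o = triage_rates(major_full=90, major_partial=30,
                    minor_full=110, minor_partial=290)
print(f"undertriage {u:.1%}, overtriage {o:.1%}")  # undertriage 25.0%, overtriage 27.5%
```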