
Task analysis

Task analysis is a systematic method employed in fields such as human factors engineering, ergonomics, human-computer interaction (HCI), and instructional design to break down complex tasks into smaller, hierarchical subtasks, identifying the observable actions, cognitive processes, and environmental factors required for users to achieve their goals. This approach enables designers, engineers, and trainers to understand user behaviors, anticipate challenges, and optimize systems for usability, safety, and efficiency.

Originating in the early 20th century from industrial psychology and scientific management principles, task analysis evolved as a core tool in human factors to address the interplay between humans and technology, with early influences from figures like Frederick W. Taylor's time-and-motion studies and the psychotechnics movement. By the mid-20th century, it expanded to incorporate cognitive elements amid growing system complexity, as seen in post-World War II applications to aviation and military training, and was further refined in the 1980s–1990s through methods emphasizing knowledge elicitation from experts. Today, it remains foundational for analyzing both physical and mental demands in diverse contexts, from workplace ergonomics to digital interfaces.

Key methods include hierarchical task analysis (HTA), which structures tasks as a hierarchy of goals, plans, and operations to map sequences and decision points; cognitive task analysis (CTA), focusing on unobservable mental processes like decision-making and problem-solving; and observational techniques such as think-aloud protocols or critical incident analysis. Applications span UX design, where it informs intuitive interfaces by streamlining user flows and reducing errors; ergonomics, for enhancing workplace productivity and preventing injuries; and training and education, for developing targeted training programs that align with skill acquisition needs. Overall, task analysis supports evidence-based improvements, ensuring systems accommodate human capabilities and limitations across industries like healthcare, aviation, and nuclear power.

Fundamentals

Definition and Purpose

Task analysis is a systematic process of identifying, describing, and decomposing tasks into subtasks, steps, and elements to understand user goals, actions, and interactions within a given context. It involves examining what a person is required to do, in terms of both observable behaviors and underlying cognitive processes, to achieve a specific goal or outcome. This foundational approach originates from human factors engineering and human-computer interaction, enabling a detailed representation of task structures to support effective design and evaluation.

The primary purposes of task analysis are to improve system design by aligning interfaces and workflows with user needs, enhance usability through streamlined interactions, optimize performance by reducing inefficiencies, identify potential errors in task execution, and inform training requirements to build necessary skills. By breaking down complex activities, it helps allocate functions appropriately between humans and machines, assess workload demands, and mitigate risks in operational environments. These objectives ensure that interventions—such as product redesigns or procedural updates—directly address real-world task demands rather than assumptions about user behavior.

Key components of task analysis include observable actions, such as physical manipulations or sequential operations; mental processes, like decision-making or problem-solving; environmental factors, including contextual constraints like spatial layouts or concurrent demands; and tools or resources, such as equipment or interfaces that facilitate task completion. A core distinction exists between goal-directed tasks, which emphasize user intentions and higher-level objectives, and procedural tasks, which focus on the sequential steps required to fulfill those intentions. This differentiation allows analysts to capture both the strategic intent behind a task and the tactical execution needed to realize it.

Historical Development

Task analysis developed significantly during World War II in the context of military training and early human factors engineering, building on earlier foundations in industrial psychology and scientific management from the early 20th century, including time-and-motion studies by Frederick W. Taylor and the Gilbreths. As the U.S. military faced the need to rapidly train large numbers of personnel for aircraft operation, equipment maintenance, and other technical roles, systematic methods for breaking down tasks into component skills emerged to improve efficiency and reduce errors. This work was influenced by early approaches, including time-motion studies from the scientific management era, but adapted to wartime demands for human-machine interfaces. For instance, psychologists in the Army Air Forces' aviation psychology program conducted analyses to identify cognitive and perceptual requirements, laying foundational practices for modern task analysis.

In the 1960s and 1970s, task analysis expanded within ergonomics, particularly through hierarchical methods developed for training design and simulation. John Annett and Keith Duncan formalized hierarchical task analysis (HTA) in 1967 as a structured approach to decompose tasks into goals, sub-goals, and operations, initially aimed at assessing training needs in industrial and process-control contexts. This period saw broader integration into ergonomic practices, with applications in simulations to model operator behaviors and predict performance, building on post-war advancements in human factors research. Annett's subsequent works, such as Annett et al. (1971), emphasized goal-directed hierarchies, influencing standardized techniques across human factors disciplines.

The 1980s and 1990s marked the integration of task analysis into human-computer interaction (HCI), with seminal contributions like the GOMS model introduced by Stuart Card, Thomas Moran, and Allen Newell in their 1983 book The Psychology of Human-Computer Interaction.
GOMS provided a predictive framework for analyzing skilled user performance by specifying goals, operators, methods, and selection rules, enabling quantitative evaluations of interface efficiency. This era also saw the publication of A Guide to Task Analysis by Barry Kirwan and Les K. Ainsworth in 1992, which standardized diverse task analysis techniques for system design and risk assessment, becoming a key reference in human factors and safety engineering.

From the 2000s onward, task analysis gained prominence in user experience (UX) design and international safety standards, reflecting its evolution into a versatile tool for digital interfaces and human-centered systems. Frameworks from the Nielsen Norman Group, such as those outlined in their usability guidelines, adapted task analysis for web and mobile applications to prioritize user goals and workflows. Concurrently, standards like ISO 9241, first published in 1992 (with key parts like ISO 9241-11 in 1998) and revised through the 2010s, incorporated task analysis principles for ergonomic requirements in human-system interaction, emphasizing usability and efficiency in office and control environments. These developments broadened task analysis beyond military and industrial roots to contemporary applications in safety-critical domains.

Types of Task Analysis

Hierarchical Task Analysis

Hierarchical Task Analysis (HTA) is a structured method in human factors and ergonomics for decomposing complex tasks into a hierarchy of goals, subgoals, operations, and plans, enabling a clear representation of task structure for purposes such as training, design, and system evaluation. Developed in the late 1960s by John Annett and colleagues, HTA originated from efforts to analyze operator tasks in industrial and process-control settings, emphasizing a theory of performance based on goal-directed behavior.

The core process of HTA starts with defining the main task goal at level 0, then recursively breaking it down into subordinate subtasks across multiple levels until reaching atomic operations—the indivisible actions that cannot be further decomposed. These operations form the lowest level, while plans articulate the rules for sequencing and coordinating subtasks, incorporating conditions such as "do subtask 1 then subtask 2," "do 1 and 2 concurrently," or "do 1 or 2 depending on outcome." This hierarchical decomposition ensures the analysis captures both the "what" (goals and operations) and the "how" (plans) of task execution, with stopping criteria guided by the P × C rule—decomposition stops once the product of the probability of failure and the cost of its consequences is acceptably low, avoiding unnecessary detail.

Notation in HTA commonly uses hierarchical numbering (e.g., 0 for the main goal, 1.1 for a subtask under goal 1, 2.1.3 for an operation), paired with textual plans that specify execution order using symbols like → (then), + (and), or / (or). Representations can take the form of diagrams for visual overview, numbered lists for sequential detail, or tables to integrate plans with descriptions, facilitating communication among analysts, experts, and stakeholders.
Conducting an HTA involves several iterative steps: first, observe or perform the task to gather data on activities and constraints; second, define the main goal and initial subtasks; third, refine the hierarchy by identifying operations and drafting plans; fourth, validate the structure through review with subject matter experts; and finally, iterate for completeness and fit to purpose, potentially using multiple data sources like interviews or verbal protocols. HTA offers advantages such as visual clarity in mapping intricate task flows, making it adaptable for applications in system design, error prediction, and function allocation across domains like healthcare and aviation. Limitations include its potential to overlook unobservable cognitive processes—though extensions integrate such elements as in Cognitive Task Analysis—and challenges in handling highly dynamic or context-dependent tasks where plans may require frequent revision. A representative example is decomposing the task of booking a flight online:
  • 0. Book a flight
    • Plan 0: Do 1 → 2 → 3 (if valid selection in 2).
  • 1. Search for flights
    • 1.1. Enter search criteria (e.g., dates, destinations).
    • 1.2. Submit query and review results.
    • Plan 1: Do 1.1 → 1.2.
  • 2. Select flight
    • 2.1. Compare options.
    • 2.2. Choose and confirm.
    • Plan 2: Do 2.1 → 2.2 (or repeat 1 if no suitable option).
  • 3. Complete payment
    • 3.1. Enter details.
    • 3.2. Confirm and receive booking.
    • Plan 3: Do 3.1 → 3.2.
This hierarchy highlights conditional branching, such as re-searching if no flight matches criteria.
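A decomposition like the one above can also be captured programmatically, which makes it easy to render outlines or check which nodes are atomic operations. A minimal sketch (class and field names are our own, not a standard HTA library), encoding the flight-booking example as nested dataclasses:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A node in an HTA hierarchy: a goal/operation plus an optional plan."""
    number: str            # hierarchical numbering, e.g. "1.1"
    name: str
    plan: str = ""         # sequencing rule for the subtasks, if any
    subtasks: list = field(default_factory=list)

    def is_operation(self):
        # Atomic operations are the leaves of the hierarchy.
        return not self.subtasks

    def outline(self, indent=0):
        """Render the hierarchy as an indented numbered list of lines."""
        lines = ["  " * indent + f"{self.number}. {self.name}"]
        if self.plan:
            lines.append("  " * (indent + 1) + f"Plan {self.number}: {self.plan}")
        for sub in self.subtasks:
            lines.extend(sub.outline(indent + 1))
        return lines

# The flight-booking example from the text.
hta = Task("0", "Book a flight", "Do 1 -> 2 -> 3", [
    Task("1", "Search for flights", "Do 1.1 -> 1.2", [
        Task("1.1", "Enter search criteria"),
        Task("1.2", "Submit query and review results"),
    ]),
    Task("2", "Select flight", "Do 2.1 -> 2.2 (or repeat 1)", [
        Task("2.1", "Compare options"),
        Task("2.2", "Choose and confirm"),
    ]),
    Task("3", "Complete payment", "Do 3.1 -> 3.2", [
        Task("3.1", "Enter details"),
        Task("3.2", "Confirm and receive booking"),
    ]),
])

print("\n".join(hta.outline()))
```

Representing the hierarchy as data rather than a static diagram lets analysts regenerate tables, diagrams, or numbered lists from a single source as the analysis is iterated.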

Cognitive Task Analysis

Cognitive task analysis (CTA) encompasses a suite of methods designed to elicit and model the unobservable cognitive elements of expert performance, including knowledge, decision-making processes, goal structures, and mental strategies that underlie complex tasks. Unlike analyses focused solely on observable behaviors, CTA targets the internal mental activities that enable skilled practitioners to perform effectively in dynamic environments. This approach draws from cognitive psychology and human factors engineering to uncover how experts perceive cues, form mental models, and adapt strategies, often revealing expertise that is difficult to articulate explicitly.

Key methods in CTA include knowledge audits, which involve structured interviews to systematically probe the declarative, procedural, and strategic knowledge required for a task; the Critical Decision Method (CDM), a technique that reconstructs decision-making during high-stakes incidents by applying targeted probes to explore options, cues, and goals; and concept mapping, which visually represents experts' mental models by diagramming relationships between concepts, goals, and knowledge structures. These methods are particularly effective for capturing nonlinear, context-dependent cognition in domains where expertise relies on pattern recognition and rapid judgment rather than rote procedures. For instance, CDM has been widely adopted to dissect decision-making under time pressure.

Conducting a CTA typically involves several steps: first, identifying cognitive demands such as perception of environmental cues, memory loads, problem-solving heuristics, and metacognitive monitoring through initial observations or familiarization; second, employing probes like concurrent think-aloud protocols, where experts verbalize their thoughts in real time during task performance to reveal ongoing mental processes; and third, analyzing the collected data for recurring patterns, such as common decision cues or knowledge gaps, often using thematic coding or representational tools to build integrated models.
Think-aloud protocols, in particular, provide direct access to cognitive streams but require careful prompting to minimize disruption to natural performance. This iterative process ensures that models reflect authentic expert cognition while complementing hierarchical task analysis by incorporating mental subprocesses into broader action sequences.

In high-stakes domains, CTA has proven invaluable for modeling expert performance, such as in aviation, where it has informed decision-support tools by elucidating how air traffic controllers integrate radar data, weather variables, and traffic flows to manage conflicts. Similarly, in healthcare, CTA techniques like CDM have been applied to diagnostic reasoning, revealing how clinicians weigh symptoms, test results, and patient history to form hypotheses under uncertainty, thereby supporting the design of decision aids and training simulations. These applications highlight CTA's role in enhancing system reliability and expertise transfer in environments where errors can have severe consequences.

Despite its strengths, CTA faces challenges including the inherent subjectivity in eliciting tacit knowledge, as experts may struggle to verbalize intuitive processes or retrospectively reconstruct events accurately, potentially leading to incomplete or biased representations. To mitigate this, researchers emphasize triangulation—combining multiple methods like interviews with behavioral observations or eye-tracking data—to validate findings and reduce reliance on self-reports alone. These limitations underscore the need for skilled facilitators and rigorous validation to ensure the reliability of CTA-derived models.
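The pattern-analysis step can be illustrated with a toy coding exercise: once transcript segments have been assigned thematic codes, tallying how many interviews mention each code surfaces the decision cues shared across experts. A minimal sketch (the code labels and interview data are invented for illustration):

```python
from collections import Counter

# Hypothetical codes assigned to think-aloud/CDM transcript segments,
# one list per expert interview (labels are invented for illustration).
coded_interviews = [
    ["cue:alarm_pattern", "goal:stabilize", "cue:trend_display", "strategy:rule_out"],
    ["cue:alarm_pattern", "cue:trend_display", "strategy:rule_out", "knowledge_gap:procedure"],
    ["cue:trend_display", "goal:stabilize", "cue:alarm_pattern"],
]

# Count in how many interviews each code appears (document frequency),
# a simple way to spot cues shared across experts.
doc_freq = Counter()
for segments in coded_interviews:
    doc_freq.update(set(segments))

recurring = [code for code, n in doc_freq.items() if n == len(coded_interviews)]
print(sorted(recurring))  # codes mentioned by every expert
```

In practice this tallying would sit on top of a qualitative coding pass; the point is only that recurring cues and gaps become visible once verbal data are coded consistently.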

Specialized Methods

GOMS Model

The GOMS model (Goals, Operators, Methods, and Selection rules), developed by Stuart K. Card, Thomas P. Moran, and Allen Newell, serves as a predictive framework for evaluating the efficiency of user interactions with computer interfaces by modeling skilled performance on routine tasks. It decomposes tasks into cognitive and motor components to estimate execution times without requiring empirical user testing, enabling designers to compare interface alternatives quantitatively. The model assumes users are experts performing error-free actions in familiar environments, focusing on skilled execution rather than learning or problem-solving.

At its core, GOMS consists of four interrelated elements: goals, which represent the user's high-level intentions (e.g., "delete a file"); operators, the basic perceptual, motor, and cognitive actions that execute steps (e.g., a keystroke or mouse movement); methods, hierarchical procedures or sequences of subgoals and operators to achieve a goal; and selection rules, decision heuristics that guide the choice among alternative methods based on interface constraints or user preferences. These components form a structured representation of procedural knowledge, allowing analysts to simulate task performance as a series of goal expansions and executions.

Several variants of the model exist, tailored to different levels of analysis and task types. The Keystroke-Level Model (KLM-GOMS) simplifies predictions for low-level motor actions in text-based or pointing interfaces, focusing solely on operator sequences without explicit goal hierarchies. In contrast, the Natural GOMS Language (NGOMSL) provides a more detailed, programming-like notation for modeling cognitive procedures, incorporating operators for perception, cognition, and motor action to predict both execution and learning times. KLM-GOMS estimates total task time by summing the durations of individual operators, with mental preparation times added according to specific rules (e.g., between major physical acts unless the action follows immediately from the prior one).
The basic equation is: Total time = Σ(operator times) + Σ(mental preparation times), where mental preparation (M, typically 1.35 seconds) is inserted judiciously before decision points and major physical acts. Standard empirical times for key operators, derived from laboratory studies of skilled typists and users, include K = 0.6 seconds for a home-row keystroke and P = 1.1 seconds for pointing to a small target with a mouse (adjusted via Fitts' law for distance and size). Other common operators are B = 0.1 seconds for button press/release and H = 0.4 seconds for homing hands to the keyboard. These values enable rapid predictions, often accurate within 20-30% of observed expert performance.

To apply the model, analysts first decompose the interface task into goals and methods, identifying applicable operators and selection rules; they then simulate the execution sequence, inserting mental operators as needed, and compute the total time to assess design efficiency. For instance, in evaluating text editing interfaces, KLM-GOMS might predict that menu-based navigation to delete a word takes approximately 4.15 seconds (sequence: P to menu + B to open + K for command + M + H + K to confirm), while a direct command-line entry requires only 2.65 seconds (K for command + M + K for word + B to execute), highlighting the command-line's advantage for expert users. This quantitative comparison informs improvements, such as streamlining menus or adding keyboard shortcuts.
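Because a KLM prediction is just a sum over an operator sequence, it is straightforward to script. A minimal sketch using the operator durations quoted above (the two sequences are the text-editing designs from the example):

```python
# Operator durations in seconds, as quoted in the text.
OPERATOR_TIMES = {
    "K": 0.6,   # home-row keystroke
    "P": 1.1,   # point to a small target (Fitts' law would refine this)
    "B": 0.1,   # mouse button press/release
    "H": 0.4,   # home hands to the keyboard
    "M": 1.35,  # mental preparation
}

def klm_time(sequence):
    """Predict execution time for a KLM operator sequence like 'P B K M H K'."""
    return round(sum(OPERATOR_TIMES[op] for op in sequence.split()), 2)

# Menu-based deletion: point to menu, click, key, think, home, key.
menu_based = klm_time("P B K M H K")
# Command-line deletion: key, think, key, execute.
command_line = klm_time("K M K B")
print(menu_based, command_line)
```

Swapping operator sequences in and out like this is how KLM-GOMS supports rapid, test-free comparison of design alternatives.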

Safety-Critical Task Analysis

Safety-critical task analysis (SCTA) adapts traditional task analysis methods to high-risk domains such as aviation, nuclear power, and chemical processing, focusing on identifying human actions that could initiate, exacerbate, or fail to mitigate major accidents. It emphasizes proactive risk management by dissecting tasks to uncover potential error modes and their consequences, ensuring that human performance aligns with safety objectives in environments where failures can lead to catastrophic outcomes.

A key aspect of SCTA involves integrating task analysis with human error models to map operational sequences to potential failure modes. For instance, the Human Error Template (HET) provides a checklist-based framework to predict errors during flight-deck procedures by linking task elements to psychological error mechanisms like attentional failures or memory lapses, thereby informing redesigns of standard operational procedures. In nuclear contexts, the Technique for Human Error Rate Prediction (THERP) combines task decomposition with probabilistic modeling to quantify error likelihoods, incorporating dependencies between operators to assess system-level risks.

Core methods in SCTA include hierarchical task analysis (HTA) extended to incorporate error modes and probabilities. HTA breaks down tasks into hierarchical goals and sub-operations, after which error analysis identifies slips (unintentional execution errors, e.g., selecting the wrong control), lapses (omissions due to memory failures, e.g., skipping a step), and mistakes (planning or diagnostic errors, e.g., misinterpreting alarms). These are quantified using techniques like THERP, which assigns nominal human error probabilities (e.g., 0.001 for well-labeled controls under low stress) adjusted by performance shaping factors such as time pressure or training levels, often modeled with lognormal distributions for uncertainty (error factor of 3). This extension allows for quantitative risk assessment, distinguishing SCTA from non-quantitative task analyses by enabling prediction of failure rates in dynamic scenarios.
The process typically follows structured steps: first, critical paths are identified by reviewing site hazards and prioritizing tasks based on their potential to affect major accident prevention or mitigation; second, cognitive and physical demands are assessed through observation, interviews, and HTA, incorporating general insights from cognitive task analysis on mental workload; third, barriers are recommended, such as procedural checklists or automated interlocks, to reduce error probabilities and enhance recovery opportunities. These steps ensure comprehensive coverage of human actions in safety systems.

SCTA aligns with regulatory standards to promote standardized application. In aviation, the Federal Aviation Administration (FAA) incorporates task analysis within human factors guidelines for flight deck design, requiring decomposition of critical tasks to evaluate operator interfaces and error risks in flight operations. For nuclear facilities, the International Atomic Energy Agency (IAEA) integrates SCTA-like methods into human reliability analysis for probabilistic safety assessments, emphasizing procedural adherence. A notable example is the post-accident analysis of the 1986 Chernobyl disaster, where IAEA investigations reported in INSAG-7 revealed procedural gaps—such as inadequate procedures for reactor tests and violations of safety protocols—that amplified human errors, leading to recommendations for enhanced error-mode mapping in operational guidelines.

Despite its strengths, SCTA has limitations, including an overreliance on hindsight from past incidents, which may overlook novel risks, and the challenge of incorporating data on dynamic human performance, leading to uncertainties in probability estimates. These issues underscore the need for ongoing validation through simulations and field studies to refine its predictive accuracy.
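The THERP-style quantification described above can be sketched as a nominal error probability scaled by performance shaping factors, with a lognormal error factor giving uncertainty bounds. The multiplier values below are illustrative placeholders, not values from the published THERP tables:

```python
def adjusted_hep(nominal_hep, psf_multipliers, error_factor=3.0):
    """Scale a nominal human error probability (HEP) by performance
    shaping factors and return (median, lower bound, upper bound) under
    a lognormal uncertainty model with the given error factor."""
    median = nominal_hep
    for m in psf_multipliers:
        median *= m
    median = min(median, 1.0)  # a probability cannot exceed 1
    return median, median / error_factor, min(median * error_factor, 1.0)

# Example: well-labeled control (nominal HEP 0.001, as quoted in the text),
# degraded by illustrative multipliers for high time pressure (x5) and
# limited training (x2), with the error factor of 3 from the text.
median, lo, hi = adjusted_hep(0.001, [5, 2], error_factor=3.0)
print(f"HEP median={median:.4f}, bounds=({lo:.5f}, {hi:.4f})")
```

A real assessment would draw the nominal probabilities and multipliers from THERP's data tables and model dependencies between successive operator actions; the sketch shows only the arithmetic skeleton of the adjustment.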

Applications

In Human-Computer Interaction and UX Design

In human-computer interaction (HCI) and user experience (UX) design, task analysis serves as a foundational method for understanding and optimizing user interactions with digital interfaces, enabling designers to map out user workflows and pinpoint inefficiencies that disrupt goal achievement. By decomposing complex user goals—such as booking a flight or completing an online purchase—into sequential tasks and subtasks, it reveals pain points like redundant steps or unclear navigation, which can lead to user frustration and abandonment. This approach is often integrated with Jakob Nielsen's heuristic evaluations, where evaluators assess task flows against established usability principles to identify violations, such as inconsistent interface elements that hinder progress in a user journey.

Key techniques in this domain include scenario-based analysis, which constructs realistic narratives to simulate user contexts and break down tasks into observable actions, facilitating early identification of usability barriers during prototyping. Complementing this, affinity diagramming organizes subtasks derived from task analysis into thematic clusters, such as grouping navigation challenges or input errors, to inform the creation of intuitive wireframes and iterative designs. These methods support user-centered development by aligning interface elements with natural user workflows, often drawing on predictive models like GOMS to forecast task efficiency without extensive testing.

The primary benefits of task analysis in HCI and UX include reducing cognitive load by streamlining task sequences and eliminating unnecessary mental effort, which in turn boosts task completion rates and overall user satisfaction. For instance, in checkout optimization, Baymard Institute's extensive research across 325 sites identified 32 common UX improvements per checkout flow, such as simplifying form fields and clarifying progress indicators, leading to an average 35% uplift in conversion rates by addressing task abandonment triggers.
This demonstrates how targeted task analysis transforms high-friction digital experiences into efficient ones, minimizing user errors and enhancing engagement. To quantify these impacts, UX practitioners measure outcomes through prototypes using metrics like task success rate—the percentage of users who fully complete a predefined task, often benchmarked with confidence intervals (e.g., 20% complete success among tested participants)—time-on-task, which tracks duration to assess efficiency, and error frequency, counting deviations such as incorrect inputs or navigation missteps. These indicators, derived from moderated usability sessions, provide actionable data to refine designs, ensuring interfaces support seamless task execution while prioritizing conceptual usability over exhaustive benchmarks.
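The success-rate metric and its confidence interval can be computed directly from session counts. A minimal sketch using the Wilson score interval, which behaves well at the small sample sizes typical of moderated studies (the participant numbers are hypothetical):

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval (default 95%) for a task success rate."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return max(0.0, center - margin), min(1.0, center + margin)

# Hypothetical moderated study: 4 of 20 participants fully completed the task.
successes, n = 4, 20
rate = successes / n
lo, hi = wilson_interval(successes, n)
print(f"success rate {rate:.0%}, 95% CI ({lo:.1%}, {hi:.1%})")
```

The wide interval at n = 20 is the point: a 20% observed success rate in a small study is compatible with a fairly broad range of true rates, which is why practitioners report intervals rather than point estimates alone.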

In Ergonomics and Safety Engineering

In ergonomics, task analysis is employed to dissect work activities into sequential steps, enabling the evaluation of physical demands such as postures and repetitive motions to optimize workstation designs and mitigate musculoskeletal disorders (MSDs). By breaking down tasks, ergonomists identify risk factors like awkward trunk bending or prolonged arm extension, which can lead to strain injuries in industrial settings. This approach facilitates targeted interventions, such as adjustable work surfaces or tool modifications, to align tasks with human capabilities and reduce fatigue.

A key integration involves combining task breakdowns with observational tools like the Rapid Upper Limb Assessment (RULA) and Rapid Entire Body Assessment (REBA), which score postural loads during specific task elements to inform workstation redesigns. For instance, in manufacturing environments, REBA is applied after task analysis to quantify whole-body risks from repetitive motions, such as lifting components, revealing high-risk scores that prompt ergonomic adjustments like height-variable platforms. These methods have been validated in various industries, where they highlight the need for balanced task sequencing to prevent cumulative trauma.

In safety engineering, task analysis underpins job safety analysis (JSA), a systematic process that decomposes jobs into steps to identify potential hazards, including environmental risks like falls from height. For example, analyzing a task involving elevated platform access might uncover slip hazards from debris accumulation, leading to controls such as non-slip flooring or guardrails. JSA emphasizes the interplay between worker actions, tools, and surroundings, ensuring hazards are addressed before incidents occur, particularly in high-risk sectors like construction and manufacturing.

A notable application appears in automotive assembly lines, where task analysis-driven ergonomic interventions have significantly lowered MSD incidence.
In a pre-post study of one plant, workstation redesigns and tool modifications—identified through detailed task evaluations—reduced self-reported symptoms in shoulders and arms, alongside decreasing high-risk postural factors across operations. Such outcomes underscore the role of task analysis in reallocating physical loads to enhance worker health and productivity.

Regulatory frameworks, such as guidelines from the Occupational Safety and Health Administration (OSHA), recommend task-based risk assessments to help comply with ergonomic and safety standards, particularly under the General Duty Clause requiring hazard-free workplaces. OSHA guidelines recommend ergonomic job hazard analyses for jobs with repetitive or forceful tasks, integrating them into broader safety programs to prevent MSDs through proactive evaluations and controls. This ties into safety-critical task analysis in that error-prone steps in physical tasks, like improper lifting sequences, amplify injury risks if unaddressed.

In Training and Education

Task analysis serves a pivotal role in training and education by decomposing complex tasks into discrete, learnable units that underpin curriculum design and instructional sequencing. This process ensures that learning objectives are structured progressively, aligning subtasks with the cognitive domains of Bloom's taxonomy—from foundational recall and comprehension to advanced application, analysis, synthesis, and evaluation—to foster comprehensive skill acquisition. For instance, educators use task analysis to classify learning outcomes and design instruction that matches these levels, such as breaking down problem-solving tasks to build from basic knowledge to evaluative judgment, thereby enhancing instructional efficacy across diverse educational contexts.

In simulation-based training, hierarchical task analysis (HTA) structures realistic scenarios to promote deliberate practice and competency building, particularly in high-stakes fields like healthcare. HTA deconstructs procedures into hierarchical steps and conditions, enabling the creation of validated simulations for tasks such as anaesthesia preparation or endotracheal suctioning, which support skill fluency and assessment. Complementing this, competency mapping employs task analysis to identify essential knowledge, skills, and behaviors required for proficient performance, linking job duties to proficiency levels (e.g., basic communication skills for safety applications) and guiding targeted development plans to bridge gaps in trainee capabilities. Cognitive task analysis further informs skill acquisition by revealing underlying mental processes, aiding the design of instruction that addresses perceptual and decision-making elements. Evaluation of instructional effectiveness relies on pre- and post-training metrics assessing performance across task steps, providing quantifiable insights into learning gains.
In medical procedure training, for example, HTA-derived decomposition of patient transfers into measurable steps has reduced errors through targeted interventions, with team success rates improving from 34% to 86% (a 52-percentage-point mean gain) and sustained clinical outcomes showing over 50% fewer injuries. These metrics, often tracked via expert-rated checklists, validate training impacts while informing iterative refinements.

Adaptations of task analysis accommodate diverse learners, including those with disabilities, by tailoring step sequences and supports to individual needs, thereby promoting inclusion and independence in educational settings. Methods include formatting analyses with visual aids, electronic interfaces, or self-monitoring prompts; for instance, students with intellectual disabilities or autism use visually supported task lists for science inquiries, enabling independent participation aligned with grade-level standards and individualized education plans. This approach facilitates progress monitoring at the subtask level, enhancing generalization of skills across contexts.
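Pre- and post-training checklist scores reduce to a mean gain in the same way as the figures quoted above. A minimal sketch (the per-team scores are invented for illustration, chosen so the mean gain matches the 52-percentage-point figure in the text):

```python
def mean_gain(pre_scores, post_scores):
    """Mean pre-to-post change in task success rates, as a fraction."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical expert-rated checklist success rates (fraction of task
# steps completed correctly) for four trainee teams, before and after.
pre  = [0.30, 0.35, 0.40, 0.31]
post = [0.85, 0.88, 0.84, 0.87]
print(f"mean gain: {mean_gain(pre, post) * 100:.0f} percentage points")
```

Tracking the same HTA-derived steps before and after instruction is what makes the gain attributable to the training rather than to a change in how the task was scored.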

Comparisons

Versus Work Domain Analysis

Work Domain Analysis (WDA) is a foundational phase of Cognitive Work Analysis that employs Rasmussen's abstraction hierarchy to model the structure of a work domain at multiple levels of abstraction, including functional purpose (why the system exists), abstract function (underlying laws and principles), generalized function (what the system does), physical function (how components operate), and physical form (devices and their configurations). This approach identifies the inherent constraints, affordances, and trade-offs within the domain without reference to specific tasks, actors, or events, providing a domain-general representation of the system's possibilities. In contrast, task analysis focuses on specific, goal-oriented sequences of actions required to achieve particular objectives, emphasizing procedural steps, decision points, and event-dependent control tasks.

Key differences lie in their levels of abstraction and scope: task analysis is concrete and time-bound, detailing how operators navigate from current to desired states through targeted behaviors, whereas WDA operates at a higher, structural level, exploring the full range of potential functions and interactions independent of any prescribed goals or scenarios. This makes task analysis inherently task-specific and prescriptive, while WDA is descriptive and generative, highlighting systemic affordances that enable or constrain work.

Task analysis is particularly suited for optimizing well-defined procedures, such as streamlining user interactions in software interfaces or refining operator workflows in routine operations. WDA, however, excels in scenarios requiring ecological interface design, where interfaces must support adaptive performance across varied conditions, as seen in nuclear control rooms that display multi-level system states to reveal trade-offs in safety and efficiency.
The two methods complement each other effectively in cognitive engineering, with WDA providing the overarching contextual frame for the work domain and task analysis specifying actionable paths within that frame, leading to more robust system designs that balance structure and flexibility.

Versus Cognitive Work Analysis

Cognitive Work Analysis (CWA) is a framework for modeling complex sociotechnical systems, emphasizing the analysis of work constraints, variability in task performance, and coordination among multiple actors in ill-defined or dynamic environments. It comprises several interconnected phases, including Work Domain Analysis (WDA), which maps the functional purposes, physical objects, and constraints of the work domain; Control Task Analysis, which examines the control tasks across different abstraction levels; and Strategies Analysis, which explores the range of strategies actors might employ to achieve goals under varying conditions. These phases collectively address uncertainty and emergent behaviors, making CWA particularly suited for domains where tasks are not rigidly prescribed but adapt to situational demands.

In contrast to task analysis, which decomposes observable, goal-directed activities into hierarchical steps assuming structured and predictable objectives, CWA prioritizes the inherent uncertainties and multiple possible paths in complex systems. Traditional task analysis, such as hierarchical task analysis, focuses on a single, normative way to perform routine tasks, often overlooking variability and interactions among team members. CWA, however, accommodates ill-structured problems involving multiple actors, emergent strategies, and environmental uncertainties, as exemplified in air traffic control, where controllers must coordinate dynamically to manage unpredictable conflicts.

Task analysis excels in optimizing efficiency for well-defined, routine procedures, such as user interface design for repetitive interactions, by providing clear, prescriptive models that minimize errors in stable contexts. Conversely, CWA's strength lies in fostering resilient system designs for dynamic, high-stakes environments, where it supports the identification of bounded flexibility to enhance adaptability and safety amid variability.
This complementary nature allows both approaches to inform design, with task analysis handling predictable elements and CWA addressing the broader sociotechnical complexities. CWA emerged in the 1990s as an extension of traditional task analysis methods, developed by Kim Vicente and Jens Rasmussen to better handle the cognitive demands of advanced technology in complex systems. Building on Rasmussen's earlier work in cognitive systems engineering, Vicente's 1999 synthesis formalized CWA as a prescriptive tool for designing supportive technologies in uncertain domains.
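The structural contrast between the two approaches can be sketched in a few lines of Python. This is an illustrative sketch under stated assumptions, not an established formalism or tool: the `HTANode` class models HTA's goal–plan–subgoal hierarchy, which flattens to a single prescribed operation sequence, while `ControlTask` holds several admissible strategies whose availability depends on situational constraints, echoing CWA's strategies analysis. All class names and the tea-making and aircraft-separation examples are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class HTANode:
    """One node in an HTA hierarchy: a goal, a plan for its subgoals,
    and the subgoals themselves (leaves are operations)."""
    goal: str
    plan: str = ""
    subgoals: list = field(default_factory=list)

    def operations(self):
        """Flatten the hierarchy into its leaf operations — the single
        normative sequence that HTA prescribes for the task."""
        if not self.subgoals:
            return [self.goal]
        ops = []
        for sub in self.subgoals:
            ops.extend(sub.operations())
        return ops

@dataclass
class Strategy:
    """One possible path to a goal, usable only when its situational
    requirements are met."""
    name: str
    requires: set
    steps: list

@dataclass
class ControlTask:
    """A CWA-style control task: many admissible strategies rather than
    one prescribed procedure."""
    goal: str
    strategies: list

    def admissible(self, available):
        # Keep every strategy the current situation permits, instead of
        # committing to a single normative sequence.
        return [s.name for s in self.strategies if s.requires <= available]

# HTA: one normative decomposition of a routine task.
brew = HTANode("brew tea", plan="do 1-3 in order", subgoals=[
    HTANode("boil water"), HTANode("add tea leaves"), HTANode("pour water")])
print(brew.operations())

# CWA: the set of admissible strategies varies with the situation.
separate = ControlTask("separate two aircraft", strategies=[
    Strategy("vector", requires={"radar"}, steps=["turn aircraft A"]),
    Strategy("altitude change", requires={"radar", "free flight level"},
             steps=["climb aircraft A"]),
])
print(separate.admissible({"radar"}))
```

The design choice mirrors the text: `operations()` always yields one sequence, whereas `admissible()` returns a situation-dependent set, which is the "bounded flexibility" a CWA-informed design would preserve.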
