
Eye tracking

Eye tracking is a technology that measures and records the position and movement of the eyes to determine the point of gaze or the motion of the eyes relative to the head, providing objective insights into visual attention, cognitive processes, and behavioral patterns. It primarily captures key metrics, such as fixations (brief pauses lasting 100–600 ms where visual information is processed) and saccades (rapid, ballistic movements between fixations that last 20–80 ms), which together form scanpaths representing how individuals explore their visual environment. Modern systems often employ infrared or near-infrared light sources and high-resolution cameras to detect the corneal reflection and pupil center, enabling precise computation of gaze direction through geometric modeling. The foundations of eye tracking trace back to early 20th-century research on oculomotor control, with significant advancements in the 1970s through concepts like scanpaths proposed by Noton and Stark, which linked eye movements to cognitive representations of visual perception. Technological evolution has shifted from invasive methods, such as electro-oculography (measuring electrical potentials via electrodes) and scleral search coils (inductive coils embedded in contact lenses), to non-invasive video-based trackers introduced in the late 20th century. As of 2025, systems vary from stationary screen-based setups (with accuracies around 0.5° and sampling rates up to 2000 Hz) to mobile wearable glasses (typically 50–100 Hz), allowing real-time data collection in naturalistic settings while prioritizing user comfort and ethical considerations like informed consent; recent advances include 3D deflectometry for enhanced gaze accuracy and applications in diagnosing neurodegenerative diseases such as Parkinson's. Eye tracking finds broad applications across disciplines, including psychology and neuroscience for studying attention and perception, human-computer interaction for usability and interface design, and marketing for analyzing visual preferences. In healthcare, it aids diagnostic training by identifying search errors in radiology—where up to 30% of errors stem from overlooked abnormalities—and supports competency assessment through eye-movement modeling examples. Emerging uses extend to virtual reality, driver monitoring, and assistive technologies for individuals with disabilities, with metrics like precision (variability in repeated measurements) and accuracy (deviation from the true gaze point) ensuring reliable interpretations.

Overview

Definition and Principles

Eye tracking is the process of measuring the motion of an eye relative to the head, capturing either the point of gaze—where an individual is looking—or the motion of the eye itself. This technique relies on detecting anatomical features of the eye to estimate its position and orientation with high precision. The physiological basis of eye movements stems from the coordinated action of six extraocular muscles per eye, which control rotation and position within the orbit. These muscles—superior, inferior, medial, and lateral rectus, plus superior and inferior oblique—are innervated by three cranial nerves (oculomotor, trochlear, and abducens), enabling precise movements under neural control from brainstem nuclei, the cerebellum, and cortical areas like the frontal eye fields. Eye movements serve to direct the high-acuity fovea, the central region of the retina packed with cones for detailed vision, toward points of interest, while peripheral vision, mediated by rods in the outer retina, provides broader detection of motion and low-light stimuli but with reduced resolution. In optical eye tracking, the primary principles involve illuminating the eye with infrared light and analyzing reflections to determine gaze direction. The corneo-scleral limbus, the visible boundary between the transparent cornea and the white sclera, serves as a stable reference for estimating eye rotation relative to the head. Similarly, the center of the dark pupil is tracked, as its position shifts with eye rotation; corneal reflections (glints) from infrared sources are used to calibrate and compensate for head movements, leveraging the physics of specular reflection off the curved corneal surface to compute the gaze vector. These methods exploit the eye's geometry to achieve sub-degree accuracy in gaze estimation. Key metrics derived from eye tracking data include fixations, saccades, and scanpaths, which quantify how attention is allocated during visual tasks. Fixations are stable periods of gaze, typically lasting 100–350 milliseconds, during which the eyes remain relatively still to allow detailed visual processing. Saccades are rapid, ballistic eye movements, reaching speeds up to 900 degrees per second, that reposition the fovea from one fixation point to another. A scanpath represents the sequential pattern of fixations connected by intervening saccades, forming a trajectory that reveals the strategy of visual exploration.
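To make the pupil-glint principle concrete, the sketch below shows a minimal, illustrative gaze estimator: the difference vector between the detected pupil center and a corneal glint is mapped to screen coordinates through an affine fit obtained during calibration. The function names and the linear form of the mapping are simplifying assumptions for illustration, not a specific commercial implementation.

```python
import numpy as np

# Minimal sketch of pupil-centre/corneal-reflection (PCCR) gaze estimation.
# The pupil-glint difference vector is approximately invariant to small head
# translations; an affine map fitted during calibration converts it to
# screen coordinates.

def pupil_glint_vector(pupil_xy, glint_xy):
    """Difference vector between pupil centre and corneal glint (pixels)."""
    return np.asarray(pupil_xy, float) - np.asarray(glint_xy, float)

def fit_linear_map(vectors, screen_points):
    """Least-squares fit of a 2D affine map from (n, 2) pupil-glint vectors
    to (n, 2) known calibration target positions on the screen."""
    V = np.hstack([np.asarray(vectors, float),
                   np.ones((len(vectors), 1))])        # add bias column
    coeffs, *_ = np.linalg.lstsq(V, np.asarray(screen_points, float),
                                 rcond=None)
    return coeffs  # shape (3, 2)

def estimate_gaze(coeffs, pupil_xy, glint_xy):
    v = pupil_glint_vector(pupil_xy, glint_xy)
    return np.append(v, 1.0) @ coeffs  # predicted (x, y) on screen
```

In practice the linear map would be replaced by the higher-order polynomial or model-based mappings discussed later, but the calibration-then-predict structure is the same.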

Types of Eye Movements

Eye movements encompass several distinct categories that enable visual exploration, stabilization, and processing. These movements are essential for directing the fovea toward objects of interest and maintaining stable vision during dynamic conditions. The primary types include smooth pursuit, saccades, fixations, vergence, the vestibulo-ocular reflex (VOR), and blink-related movements, each characterized by unique kinematics and functional roles. Smooth pursuit involves slow, continuous eye rotations that track a smoothly moving visual target, allowing its image to remain stabilized on the fovea without interruption by rapid shifts. These movements typically achieve velocities of 30 to 100 degrees per second, with gain (the ratio of eye velocity to target velocity) approaching 1.0 for targets moving at moderate speeds but declining at higher velocities as the system saturates. Smooth pursuit is initiated after a brief latency of about 100-150 ms following target motion onset and relies on predictive mechanisms to anticipate target trajectory, ensuring accurate tracking over time. Saccades are rapid, ballistic eye movements that abruptly redirect gaze from one point to another, facilitating shifts between visual targets or scenes. They last 20 to 200 ms and reach peak velocities up to 900 degrees per second for larger amplitudes, following a main sequence relationship where velocity increases with saccade size up to a plateau. Subtypes include microsaccades, which are involuntary miniature saccades (amplitudes of 0.1-1 degree) occurring during attempted fixation to counteract neural drift, and postsaccadic overshoots, where the eyes briefly exceed the target before corrective adjustments, often seen in dynamic viewing tasks. Fixations represent stable pauses in eye position where the eyes remain relatively stationary, enabling detailed visual processing of the attended stimulus. These periods typically endure 100 to 600 ms, with average durations around 200-300 ms depending on task demands such as scene complexity or cognitive load. During fixations, high-acuity foveal vision extracts critical information, and subtle drifts or tremors may occur, but the overall stability supports perceptual analysis. Vergence movements coordinate the inward (convergence) or outward (divergence) rotation of both eyes to align them on objects at varying depths, crucial for binocular vision and depth perception. These disjunctive movements adjust the vergence angle based on retinal disparity cues, with peak velocities reaching 10-20 degrees per second and latencies of 150-200 ms for near targets. Vergence enhances stereopsis by fusing slightly disparate retinal images, maintaining single vision across distances from near (e.g., 30 cm) to far (e.g., optical infinity). The vestibulo-ocular reflex (VOR) generates compensatory eye movements in the direction opposite to head rotation, stabilizing the visual world on the retina during passive or active head motions. This reflexive response operates with latencies under 15 ms and gains near 1.0 for head velocities up to 100-200 degrees per second, integrating vestibular signals from the semicircular canals and otoliths. The VOR ensures gaze stability in everyday activities like walking or turning, with adaptations to maintain efficacy across frequencies from 0.1 to 10 Hz. Blink-related movements involve rapid eyelid closures that interrupt the visual stream, typically lasting 200 to 400 ms and occurring 10-20 times per minute under normal conditions. Blinks cause temporary data loss in eye tracking by occluding the pupil and iris, leading to artifacts in gaze position recordings that must be compensated through interpolation algorithms or event detection to reconstruct gaze paths accurately.
Compensation methods, such as velocity-threshold filtering or machine learning-based gap filling, preserve data continuity without introducing significant errors in subsequent analyses. These movement types collectively underpin applications like reading studies, where saccades and fixations reveal cognitive processing patterns during text comprehension.
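As an illustration of blink compensation, the following sketch linearly interpolates gaze samples lost during short occlusions. It assumes blinks are marked as NaN in the signal and that gaps longer than roughly 300 ms reflect genuine data loss rather than blinks; both the cutoff and the marking convention are assumptions of this example.

```python
import numpy as np

# Illustrative blink-gap compensation: NaN runs shorter than max_gap_s are
# linearly interpolated between the surrounding valid samples; longer gaps
# are left as NaN and treated as data loss downstream.

def fill_blink_gaps(t, x, max_gap_s=0.3):
    t = np.asarray(t, float)
    x = np.asarray(x, float).copy()
    valid = np.flatnonzero(~np.isnan(x))       # indices of valid samples
    for start, end in zip(valid[:-1], valid[1:]):
        if end - start > 1 and (t[end] - t[start]) <= max_gap_s:
            gap = np.arange(start + 1, end)
            x[gap] = np.interp(t[gap], [t[start], t[end]],
                               [x[start], x[end]])
    return x
```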

History

Early Developments

The earliest observations of eye movements trace back to antiquity, where philosophers like Aristotle (circa 384–322 BCE) described binocular coordination, distinguishing between conjugate version movements and vergence for viewing near objects. These insights, drawn from direct anatomical and perceptual studies, laid a conceptual foundation for understanding oculomotor behavior without technological aids. In the 19th century, systematic research on eye movements during reading emerged as a key focus. French ophthalmologist Louis Émile Javal conducted pioneering experiments in 1878–1879, observing that eyes do not glide smoothly across text but instead make rapid jumps, which he termed "saccades," occurring roughly once every 15–18 letters. Building on this, American psychologist Edward B. Delabarre developed one of the first mechanical eye trackers in 1898 at Brown University, employing a small plaster-of-Paris cup attached to the eyeball connected to a mirror and a rotating smoked drum to record horizontal movements during reading tasks. These innovations shifted studies from mere observation to quantifiable recording, though still reliant on invasive attachments. Early 20th-century advancements refined these techniques for psychological experimentation. Edmund Burke Huey, around 1898–1900, improved Delabarre's design with plaster-of-Paris eye-cups fitted to the cornea, allowing precise mapping of fixation points and confirming the discontinuous nature of reading eye movements in his seminal book The Psychology and Pedagogy of Reading. This apparatus, often called the Huey eye tracker, was instrumental in experiments by researchers like George Malcolm Stratton, who in 1902 used early photographic methods to capture "darting" eye patterns during picture viewing, highlighting aesthetic and attentional influences on gaze. Huey's device enabled detailed analysis of fixation durations and saccade lengths, influencing reading instruction and research. During World War II, eye tracking saw its first military applications, particularly in aviation for pilot training and fatigue assessment. Psychologists Joseph Tiffin and John Bromer, in studies from 1941 to 1943, employed motion-picture photography at 16 frames per second to record eye movements of 33 pilots during 177 Piper Cub J-3 landings, revealing differences in scan patterns between novices and experts to refine training protocols. Such efforts addressed wartime demands for improved pilot performance amid high accident rates. Despite these advances, early eye tracking methods faced critical limitations, including mechanical inaccuracies that led to recording errors like overshoots and the invasiveness of eye-attached devices, which caused subject discomfort and restricted natural behavior. Photographic and mechanical approaches also demanded controlled lab settings, limiting ecological validity. These challenges underscored the need for less intrusive technologies in subsequent decades.

Modern Advancements

In the mid-20th century, significant advancements in eye tracking emerged with the invention of scleral search coils by David A. Robinson in 1963, which enabled precise measurement of eye movements using a small coil embedded in a contact lens placed in an alternating magnetic field to detect horizontal, vertical, and torsional rotations with sub-minute accuracy. Concurrently, in the 1950s and 1960s, Alfred Yarbus developed early photographic recording systems that captured eye movements through close-up recordings of the eye, allowing manual frame-by-frame analysis to study stimulus-driven and task-dependent gaze patterns in controlled experiments. The 1980s and 1990s marked the commercialization of optical trackers, with Applied Science Laboratories (ASL) pioneering video-based systems starting in the 1970s and expanding into widely adopted models like the Model 501 head-mounted tracker by the 1980s, which illuminated the eye with infrared light to track pupil position non-invasively. These systems integrated with personal computers during this period, facilitating real-time data analysis and enabling broader applications in psychology and human-computer interaction research. From the 2000s onward, machine learning techniques revolutionized pupil detection, with convolutional neural networks (CNNs) introduced post-2010 for robust, automated identification of pupil centers in challenging lighting conditions, as demonstrated in frameworks like PupilNet, which achieved high accuracy on diverse datasets without manual calibration. Mobile and webcam-based tracking proliferated, exemplified by Apple's 2017 integration of eye tracking in Face ID technology, which uses infrared cameras and neural processing to map facial patterns for secure authentication while monitoring gaze direction. Similarly, Tobii's eye control systems of the 2010s and 2020s, such as the Eye Tracker 5, enabled hands-free computer navigation through gaze interaction with Windows interfaces, supporting accessibility for users with motor impairments. Recent trends through 2025 have emphasized AI-enhanced accuracy in virtual and augmented reality, with Meta's 2023 updates to the Quest Pro headset improving eye tracking resolution and field-of-view coverage to better support foveated rendering and social realism. Low-cost hardware has democratized access, as seen in projects like OpenGaze, which provide smartphone-based gaze estimation using off-the-shelf cameras and deep learning models for real-time tracking at minimal expense. Standardization efforts, such as ISO 15007:2020, have established metrics for measuring driver visual behavior in transport systems, including glance duration and total eyes-off-road time, to ensure consistent evaluation across devices. The shift from analog to digital processing, driven by advances in computing power and algorithms, has dramatically reduced costs, transforming eye tracking from specialized equipment priced in thousands of dollars to consumer devices and software solutions available for under $100, thereby expanding its use in everyday applications like gaming and usability testing.

Tracking Methods

Eye-Attached Tracking

Eye-attached tracking methods involve the physical attachment of devices directly to the eye, enabling exceptionally precise measurements of eye position and movement in laboratory settings. These invasive techniques prioritize sub-degree accuracy and high sampling rates, making them valuable for detailed neuroscientific investigations, though their use is constrained by participant comfort and ethical considerations. One prominent example is the scleral search coil system, which embeds small induction coils within a contact lens placed on the sclera of the eye. The coils interact with alternating electromagnetic fields generated by surrounding field coils, inducing voltages proportional to the eye's rotational position in three dimensions. This method, pioneered by David A. Robinson in 1963, achieves angular accuracies better than 0.1 degrees and supports sampling rates up to 1 kHz, allowing capture of rapid eye movements like saccades and microsaccades. These techniques offer sub-minute angular accuracy and are largely immune to head movements when properly calibrated—search coils due to direct attachment—facilitating their application in oculomotor research for studying phenomena such as binocular vergence (eye alignment for near objects) and the vestibulo-ocular reflex (VOR, stabilizing gaze during head motion). For instance, scleral coils have been employed to quantify dynamic cyclovergence during head translations, revealing torsional eye adjustments on the order of 1-2 degrees. However, eye-attached methods like scleral search coils cause significant discomfort from lens insertion, often requiring topical anesthesia, and carry risks of corneal irritation or infection, restricting sessions to under an hour. They are not suitable for field or prolonged use, and compared to non-invasive alternatives like video-based systems, they impose greater setup complexity. Today, these methods remain confined to specialized laboratories, with scleral coils facing heightened ethical scrutiny and restrictions since the early 2000s due to their invasiveness, particularly in human studies; applications have also declined with advances in non-contact technologies.
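The coil principle can be caricatured in a few lines: if the demodulated voltage induced by each orthogonal field is taken as roughly proportional to the sine of the eye's rotation about the corresponding axis, angles follow from an arcsine. This is a deliberately simplified sketch under that assumption; torsion, coil slippage, and the demodulation electronics are omitted.

```python
import math

# Toy illustration of the search-coil principle. v_demod is the demodulated
# voltage for one field axis; v_max is the voltage at 90 degrees, obtained
# during calibration. Values below are illustrative placeholders.

def coil_angle_deg(v_demod, v_max):
    ratio = max(-1.0, min(1.0, v_demod / v_max))  # guard against noise
    return math.degrees(math.asin(ratio))

# Horizontal and vertical channels demodulated from the two field
# frequencies:
h = coil_angle_deg(v_demod=0.26, v_max=1.0)   # ~15.1 degrees
v = coil_angle_deg(v_demod=-0.09, v_max=1.0)  # ~-5.2 degrees
```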

Optical Tracking

Optical tracking refers to non-invasive eye tracking methods that employ light, primarily in the near-infrared spectrum, to illuminate and capture key ocular features such as the pupil and corneal reflections for gaze estimation. These techniques avoid physical contact with the eye, making them suitable for a wide range of research and applied settings, from laboratory experiments to real-world interactions. Infrared illumination forms the basis of most optical systems, utilizing near-infrared light-emitting diodes (LEDs) to project light onto the eye, generating the first Purkinje image—a bright reflection on the corneal surface—while rendering the pupil dark against the illuminated iris. This setup facilitates the detection of eye position through algorithms that either track the center of the pupil for its positional changes or monitor the limbus, the junction between the iris and sclera, to infer rotational movements. Multiple IR LEDs can produce several corneal reflections (glints), with configurations using up to 12 glints enhancing robustness against occlusions like eyelids. A high-precision variant is the dual-Purkinje-image (DPI) tracker, which utilizes reflections from the eye's refracting surfaces—the first Purkinje image from the anterior cornea and the fourth from the posterior lens surface—to compute eye rotation relative to the head. Developed by Tom N. Cornsweet and Howard D. Crane in 1973, this optical method delivers resolutions around 1 arcminute (approximately 0.017 degrees) and is designed for fixed-head laboratory use to minimize errors from head motion. DPI trackers offer sub-minute angular accuracy through differential reflection analysis and are largely immune to minor head movements when properly calibrated, facilitating their application in vision research for studying micro-movements. However, they demand head immobilization via chin rests or bite bars, adding to participant fatigue, and impose greater setup complexity compared to standard video-based systems. Video oculography (VOG) is the predominant implementation of optical tracking, relying on high-speed cameras to record eye images for real-time analysis. Systems such as the SR Research EyeLink operate at frame rates from 60 Hz to 2000 Hz, capturing subtle movements such as saccades and fixations. Gaze estimation in VOG typically involves polynomial mapping, a regression-based approach that correlates detected features (e.g., pupil center and glint positions) to screen coordinates or gaze angles, often using second- or third-order polynomials for accuracy. Optical trackers differ in form factor between remote and head-mounted designs. Remote systems, positioned below or near a display (e.g., desk-mounted Tobii or EyeLink units), provide sub-degree precision in stationary setups ideal for controlled studies, though they limit user mobility. In contrast, head-mounted trackers, such as the Pupil Labs glasses introduced in the 2010s, integrate lightweight cameras into wearable frames for unobtrusive tracking during natural behaviors like walking or driving, albeit with slightly reduced precision due to motion artifacts. Achieving reliable performance requires calibration, often via a 9-point procedure where participants fixate on targets across the display to establish a personalized mapping between eye features and gaze points. Factors affecting accuracy include pupil dilation, which displaces the pupil center and can introduce errors up to 0.2 degrees, and eyeglasses or eyelashes, which scatter IR light or block glints, reducing precision by similar margins; overall, well-calibrated systems yield typical angular accuracies of 0.5 to 1 degree.
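A generic version of the polynomial mapping described above can be sketched as follows: a second-order polynomial in the pupil-glint vector is fitted by least squares over the nine calibration targets. The exact term set and any regularization vary between systems, so this is an illustrative reconstruction rather than any vendor's algorithm.

```python
import numpy as np

# Second-order polynomial gaze calibration for video oculography.
# vectors: (9, 2) pupil-glint vectors; targets: (9, 2) screen points.

def design_matrix(v):
    vx, vy = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy,
                            vx * vy, vx**2, vy**2])

def calibrate(vectors, targets):
    A = design_matrix(np.asarray(vectors, float))
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(targets, float), rcond=None)
    return coeffs  # shape (6, 2): one polynomial per screen axis

def gaze_point(coeffs, vector):
    return design_matrix(np.asarray([vector], float)) @ coeffs  # (1, 2)
```

The same structure extends to third-order terms by enlarging the design matrix, trading a better fit against the risk of overfitting nine calibration points.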
Post-2015 advancements have integrated deep learning for feature detection, enabling robust performance in challenging conditions like low light or head motion by training convolutional neural networks on diverse datasets. These models achieve pupil detection rates over 95% and improve gaze estimation precision, as seen in smartphone-based systems rivaling lab-grade accuracy without additional sensors.

Electrical Potential Measurement

Electrical potential measurement in eye tracking primarily relies on electrooculography (EOG), a bioelectric technique that detects eye movements by recording voltage changes associated with the rotation of the eyeball. EOG measures the corneo-retinal standing potential, a natural bioelectric field generated by the eye, where the cornea is positively charged relative to the negatively charged retina, forming a dipole-like structure. As the eye rotates, this dipole shifts, producing detectable voltage variations on the skin surface proportional to the rotation angle, typically linear up to 15-30 degrees of horizontal or vertical movement. The principle exploits the steady-state potential difference of approximately 10-30 μV per degree of eye rotation, which requires amplification to usable levels for accurate tracking. In practice, silver-silver chloride (Ag-AgCl) electrodes are placed in a bipolar configuration: for horizontal movements, pairs are positioned at the outer canthi of each eye; for vertical movements, one electrode is placed above and another below the eye, with a reference electrode often on the forehead or mastoid process to minimize noise. Sampling rates typically range from 50 to 500 Hz to capture saccade dynamics, and calibration involves having the subject perform known gaze shifts (e.g., to targets at fixed angles) to map voltage outputs to gaze directions. Impedance is kept below 25 kΩ to ensure signal quality. EOG offers several advantages, including low cost due to simple electrode-based hardware, functionality in complete darkness since it does not rely on light reflection, and no requirement for a direct line-of-sight to the eyes, making it suitable for unconstrained or head-mounted setups. These features have led to its adoption in applications such as sleep studies for detecting rapid eye movements during sleep staging and in assistive technologies, like EOG-controlled wheelchairs or communication devices for individuals with motor impairments. However, EOG has notable limitations, including a relatively low spatial resolution of approximately 0.5-1 degree, which is coarser than optical methods, and susceptibility to baseline drift over extended recording periods due to changes in the skin-electrode interface or physiological factors. It is also sensitive to artifacts from blinks, eye muscle activity, and electrical interference, such as 50/60 Hz mains noise, necessitating filtering techniques like notch filters for mains noise, independent component analysis (ICA), or wavelet transforms to isolate the eye movement signal. Despite these challenges, EOG remains valuable for scenarios prioritizing robustness over precision.
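A minimal EOG processing chain along these lines might look like the following sketch, which notch-filters 50 Hz mains interference and scales microvolts to degrees using a calibration-derived sensitivity; the specific sensitivity value, sampling rate, and filter settings are assumptions within the ranges cited above.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 250.0          # sampling rate in Hz (assumed, within 50-500 Hz range)
UV_PER_DEG = 15.0   # per-subject sensitivity from calibration (assumed,
                    # within the 10-30 uV/deg range cited above)

def eog_to_degrees(eog_uv, mains_hz=50.0, q=30.0):
    """Convert a raw horizontal EOG trace (microvolts) to degrees."""
    b, a = iirnotch(mains_hz, q, fs=FS)            # notch at mains frequency
    clean = filtfilt(b, a, np.asarray(eog_uv, float))
    clean -= np.median(clean)                      # crude baseline removal
    return clean / UV_PER_DEG                      # gaze angle in degrees
```

More serious drift handling (high-pass filtering, ICA, or wavelet denoising) would replace the median subtraction in extended recordings.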

Data Acquisition and Analysis

Data Types and Collection

Eye trackers generate raw data primarily in the form of timestamped samples capturing the position of the pupil (x, y coordinates) and estimated gaze points on a reference surface, typically at rates ranging from 250 to 2000 Hz depending on the system. Event markers are included to denote detected eye movements such as blinks, which represent temporary occlusions of the pupil, and fixations, periods of relatively stable gaze. Additional channels, including pupil size (often measured in pixels or normalized units) and head pose estimates (e.g., translation and rotation), provide context for interpreting gaze direction and accounting for environmental variations. From these raw samples, derived metrics are computed to quantify oculomotor behavior. Common examples include fixation duration, the time spent maintaining gaze within a defined spatial threshold (typically 0.5–1 degree of visual angle), and saccade characteristics such as amplitude (the angular distance covered) and peak velocity. Saccade velocity and amplitude follow a stereotypical nonlinear relationship known as the main sequence, where peak velocity increases with amplitude up to a plateau around 500–600 degrees per second for amplitudes exceeding 20 degrees. Areas of interest (AOIs) on stimuli are analyzed for metrics like visit counts, representing the number of times gaze enters a predefined region. Data collection begins with calibration protocols, where participants fixate on a series of known targets (e.g., 5–13 points across the visual field) to map pupil or corneal reflections to screen coordinates, ensuring gaze estimation accuracy below 1 degree of visual angle. Validation tests, such as re-presenting targets post-calibration, confirm precision (typically <0.5 degrees standard deviation) and handle artifacts like signal loss from occlusions (e.g., eyelids or eyelashes) or low infrared reflectance by interpolating missing samples or flagging invalid data. Sampling considerations influence data fidelity: temporal resolution determines the ability to capture fast saccades (requiring at least 250 Hz), while spatial resolution, often sub-pixel (e.g., 0.01 degrees), supports precise gaze mapping. Common output formats include binary files like EDF (EyeLink Data Format) for high-efficiency storage of samples, events, and messages, which can be exported to ASCII or CSV for analysis. Quality assurance involves preprocessing to mitigate noise from vibrations, lighting changes, or minor head movements. Techniques such as Savitzky-Golay filtering apply polynomial least-squares smoothing to raw gaze coordinates, preserving signal features while reducing high-frequency noise without excessive distortion. Outlier detection identifies invalid samples (e.g., via velocity thresholds exceeding physiological limits) for removal or interpolation, ensuring datasets meet criteria like >95% valid samples per trial.
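The smoothing and outlier steps described above can be combined in a short sketch like the following, which applies a Savitzky-Golay filter to gaze coordinates and flags samples whose implied angular velocity exceeds a physiological ceiling; the window length, polynomial order, and 1000°/s limit are illustrative choices rather than standards.

```python
import numpy as np
from scipy.signal import savgol_filter

# Preprocessing sketch: smooth gaze coordinates (assumed already converted
# to degrees of visual angle), then flag physiologically implausible
# samples by point-to-point angular velocity.

def preprocess(t, gx, gy, window=11, poly=3, vmax_deg_s=1000.0):
    gx_s = savgol_filter(np.asarray(gx, float), window, poly)
    gy_s = savgol_filter(np.asarray(gy, float), window, poly)
    vel = np.hypot(np.diff(gx_s), np.diff(gy_s)) / np.diff(t)
    bad = np.concatenate([[False], vel > vmax_deg_s])  # outlier flags
    return gx_s, gy_s, bad
```

Flagged samples would then be removed or interpolated before event detection, and the proportion of valid samples checked against the quality criterion mentioned above.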

Visualization and Presentation

Eye tracking data visualization transforms raw gaze metrics, such as fixations and saccades, into interpretable graphical representations that reveal patterns of attention on stimuli. These methods facilitate qualitative analysis by overlaying summaries onto images, videos, or 3D scenes, aiding researchers in understanding cognitive processes without delving into statistical computations. Common techniques prioritize clarity and scalability, often aggregating data from single or multiple participants to highlight areas of interest. Heatmaps are density-based visualizations that depict the frequency and duration of fixations across a stimulus using color gradients, where warmer colors like red indicate higher fixation density and cooler tones like blue show lower activity. They are generated by convolving fixation points with a Gaussian kernel to create smooth, continuous surfaces that account for the natural spread of foveal vision, often with a standard deviation tuned to approximate foveal extent (e.g., 1-2 degrees). This approach, introduced in early eye tracking research, excels in identifying salient regions for aggregate analysis, such as in usability testing where hotspots reveal focal points on buttons or text. Scanpath diagrams illustrate the sequential nature of eye movements by connecting fixation points with lines representing saccades, often using numbered circles or dots sized proportionally to fixation duration to convey temporal order and path efficiency. These diagrams emphasize individual or grouped trajectories, helping to trace exploratory behaviors like reading patterns or search strategies on complex visuals. For instance, in webpage analysis, lines might link fixations from headline to body text, numbered 1 to 5, revealing nonlinear scanning. Gaze plots overlay raw or simplified trajectories directly on stimulus images, using dots for fixations and lines or arrows for movements, with options for dynamic replays to animate the temporal progression of gaze over time. This format supports detailed inspection of single trials, such as replaying a video stimulus to show how attention shifts frame-by-frame during a task. Unlike aggregated views, gaze plots preserve idiosyncrasies in individual paths, making them suitable for qualitative comparisons in cognitive studies. Specialized software streamlines these visualizations, with commercial tools like Tobii Pro Lab providing built-in heatmap and scanpath generation integrated with data export to formats such as CSV for further customization. Open-source alternatives, including OGAMA for slideshow-based experiments and PyGaze's Analyser module, enable free creation of gaze plots and heatmaps via scripting, supporting head-mounted or remote trackers. These platforms often include replay functions and stimulus mapping, allowing users to iterate visualizations without proprietary hardware dependencies. Best practices for effective presentation include normalizing fixation densities relative to stimulus area or total gaze time to enable cross-subject comparisons, preventing biases from varying screen sizes or viewing distances. For multiple viewers, aggregate views like overlaid heatmaps should use transparency or clustering to mitigate visual clutter, ensuring patterns emerge without obscuring underlying stimuli. Consistent color scales and resolution matching between data and visuals further enhance interpretability, as recommended in visualization guidelines for eye tracking research.
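A bare-bones heatmap generator following this recipe might look like the sketch below: fixation durations are deposited at fixation coordinates and smoothed with a Gaussian kernel. The kernel width in pixels depends on viewing distance and display resolution and is an assumed value here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Duration-weighted fixation heatmap: deposit each fixation's duration at
# its pixel location, then smooth with a Gaussian approximating foveal
# spread; the result is normalised to [0, 1] for colour mapping.

def fixation_heatmap(fixations, width, height, sigma_px=40):
    """fixations: iterable of (x_px, y_px, duration_s) tuples."""
    grid = np.zeros((height, width), float)
    for x, y, dur in fixations:
        if 0 <= int(y) < height and 0 <= int(x) < width:
            grid[int(y), int(x)] += dur
    heat = gaussian_filter(grid, sigma=sigma_px)
    peak = heat.max()
    return heat / peak if peak > 0 else heat
```

The normalized surface can then be rendered with any colormap and alpha-blended over the stimulus image, per the normalization and transparency practices noted above.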

Analytical Techniques

Analytical techniques in eye tracking involve computational methods to process raw gaze data into quantifiable insights, enabling inferences about cognitive processes such as attention and cognitive load. These techniques typically begin with event detection to parse continuous gaze trajectories into discrete events like fixations and saccades, followed by the derivation of metrics and application of statistical models to interpret patterns. Advanced approaches incorporate dynamic modeling and multimodal integration, while validation ensures robustness across sessions and devices. Event detection algorithms classify eye movements by distinguishing stable fixations—periods of relatively stationary gaze—from rapid saccades that redirect it. A foundational method is velocity-threshold identification (I-VT), which identifies fixations as gaze samples where velocity falls below a predefined threshold, typically around 30°/s, and saccades as the intervening high-velocity segments; this approach relies on point-to-point velocities computed from the temporal sequence of gaze points and is widely used due to its simplicity and effectiveness in controlled settings. Another seminal technique is identification by dispersion-threshold (I-DT), which defines fixations based on spatial clustering of gaze points within a minimum radius, such as 1-2° of visual angle, over a minimum duration of 100 ms, offering robustness to noise in low-sampling-rate data. Hybrid algorithms combining velocity and dispersion thresholds, like I-VDT, further improve accuracy by addressing limitations in noisy environments, achieving up to 95% agreement with manual labeling in benchmark evaluations. Key metrics quantify the efficiency and distribution of gaze patterns to infer attentional priorities. Scanpath efficiency measures how directly participants navigate visual space, often computed as the ratio of total scanpath length (sum of distances between consecutive fixations) to the ideal straight-line path to a target, with lower ratios indicating more efficient search strategies in tasks like visual foraging. Interest maps derive from aggregating fixation locations to generate heatmaps of attentional hotspots, which are then compared against computational saliency models such as the Itti-Koch algorithm; this bottom-up model computes saliency via center-surround contrasts in color, intensity, and orientation features, predicting fixation probabilities with correlations up to 0.7 in natural scene viewing. These metrics prioritize conceptual insights, like deviations from saliency-driven paths signaling top-down influences, over exhaustive listings. Statistical models enable hypothesis testing and prediction from eye tracking data. Analysis of variance (ANOVA) assesses group differences in fixation durations, revealing, for instance, longer fixations (averaging 250-300 ms) in high-cognitive-load conditions compared to low-load baselines (F-values often exceeding 10 in experimental designs). Regression models predict attentional allocation from features like pupil dilation and fixation count, with linear models explaining up to 40% of variance in task performance via coefficients linking gaze entropy to engagement levels. Machine learning approaches, such as support vector machines (SVMs), classify cognitive states from scanpath features (e.g., saccade amplitude and curvature), achieving accuracies of 70-85% in multiclass settings by optimizing hyperplanes on high-dimensional gaze vectors. As of 2025, recent advancements have integrated deep learning techniques, such as convolutional neural networks (CNNs) for automated event detection and improved saliency prediction, enhancing accuracy in complex, real-world scenarios by analyzing spatiotemporal gaze patterns.
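For concreteness, a minimal I-VT implementation along the lines described might look like this sketch; the 30°/s threshold and 60 ms minimum fixation duration are common defaults rather than universal constants, and the input arrays are assumed to be numpy arrays in degrees of visual angle.

```python
import numpy as np

# Velocity-threshold identification (I-VT): samples below the velocity
# threshold are fixation samples; contiguous runs are merged into fixation
# events and runs shorter than min_fix_dur are discarded.

def ivt(t, x_deg, y_deg, v_thresh=30.0, min_fix_dur=0.06):
    vel = np.hypot(np.diff(x_deg), np.diff(y_deg)) / np.diff(t)
    is_fix = np.concatenate([[True], vel < v_thresh])
    fixations, start = [], None
    for i, f in enumerate(is_fix):
        if f and start is None:
            start = i                              # fixation run begins
        elif not f and start is not None:
            if t[i - 1] - t[start] >= min_fix_dur:
                fixations.append((t[start], t[i - 1],
                                  x_deg[start:i].mean(),
                                  y_deg[start:i].mean()))
            start = None                           # saccade sample
    if start is not None and t[-1] - t[start] >= min_fix_dur:
        fixations.append((t[start], t[-1],
                          x_deg[start:].mean(), y_deg[start:].mean()))
    return fixations  # (onset, offset, centroid x, centroid y) tuples
```

An I-DT variant would replace the velocity test with a spatial dispersion check over a sliding window, which is why the two methods behave differently on noisy, low-rate data.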
Advanced dynamic models capture sequential dependencies in gaze data for predictive inference. Hidden Markov models (HMMs), particularly the eye movement analysis with HMMs (EMHMM) framework, represent scanpaths as state transitions between fixation clusters, quantifying pattern consistency via model entropy; lower entropy scores (e.g., <2 bits) indicate habitual strategies in repeated tasks. For multimodal analysis, coupled HMMs integrate eye tracking with EEG signals, synchronizing fixation events with neural oscillations to model attentional dynamics, improving classification of mental states by 15-20% over unimodal baselines. Validation of these techniques emphasizes reliability to handle device variability and session effects. Intraclass correlation coefficients (ICCs) measure inter-session consistency, with values of 0.70-0.80 indicating good reproducibility for fixation-based metrics across repeated measures, though lower ICCs (0.50-0.60) occur for some parameters due to state-dependent influences. These assessments ensure analytical outputs remain stable, often visualized as overlaid scanpaths for qualitative corroboration.

Eye Tracking vs. Gaze Tracking

Eye tracking refers to the process of measuring the angular position and movement of the eyes relative to the head, typically by detecting features such as the pupil or iris center using image-based techniques. In contrast, gaze tracking estimates the absolute direction of visual focus, or point of regard (POR), in world coordinates by computing the three-dimensional gaze vector from eye position data combined with head pose. This distinction arises because eye tracking outputs data relative to the head's reference frame, such as rotations around the eye's center, while gaze tracking requires fusion with head-tracking inputs to determine where in the external environment the eyes are directed. Technically, eye trackers primarily localize and track eye features in two-dimensional images, often through shape-based, feature-based, or appearance-based methods without needing spatial information beyond the face. Gaze tracking, however, employs model-based approaches—such as 3D geometric modeling of the eye or regression techniques—or interpolation from calibrated eye positions, integrating head pose via inertial measurement units (IMUs) in wearable devices like smart glasses. For instance, remote optical systems may output raw eye-relative vectors, but achieving gaze estimation demands additional processing to account for head movements, which can introduce cumulative errors of approximately 1-2 degrees in single-camera setups. These differences have significant accuracy implications: eye tracking alone provides precise relative measurements (often within 1 degree for controlled setups) but is insufficient for estimating gaze on remote scenes or objects, as it ignores head orientation shifts. Gaze tracking enables applications like augmented reality overlays by projecting the gaze point onto world coordinates, though it typically yields lower precision (0.5-3 degrees post-calibration) due to compounded errors from head tracking and eye modeling assumptions, such as corneal refractive variations or head-tracker drift. Overall gaze accuracy can degrade further in dynamic environments, with offsets exceeding 1 degree at screen peripheries or under natural head motion. Despite these distinctions, substantial overlaps exist in terminology and implementation, with many commercial systems labeled as "eye trackers" incorporating gaze estimation capabilities through integrated head tracking. Historically, early eye tracking focused on relative eye movements in laboratory settings from the mid-20th century, but a shift toward integrated gaze systems occurred post-2000s, driven by advances in computer vision and non-intrusive hardware, leading to interchangeable usage in interactive applications. Eye tracking is best suited for controlled, head-fixed studies analyzing relative eye motions, such as saccades or fixations in reading tasks, whereas gaze tracking is essential for real-world interactions requiring absolute points of regard, like driver monitoring or virtual reality interfaces.
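The distinction can be expressed compactly in code: an eye tracker supplies a gaze direction in the head's frame, and gaze tracking additionally applies the head pose to obtain a world-frame ray. In this sketch, the head rotation matrix and eye position are assumed to come from an external head-tracking source (e.g., an IMU or SLAM system).

```python
import numpy as np

# Eye tracking output: gaze_dir_head, a unit vector in the head frame.
# Gaze tracking: rotate it into the world frame and anchor it at the
# eye's world position to form a gaze ray.

def world_gaze_ray(gaze_dir_head, head_rotation, eye_pos_world):
    """head_rotation: 3x3 rotation matrix, head frame -> world frame.
    Returns (origin, unit direction) of the gaze ray in world coordinates."""
    d = head_rotation @ np.asarray(gaze_dir_head, float)
    return np.asarray(eye_pos_world, float), d / np.linalg.norm(d)
```

Intersecting the returned ray with a known surface (for example a screen plane) yields the point of regard, which is why head-pose errors propagate directly into gaze-tracking accuracy.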

Gaze-Contingent Techniques

Gaze-contingent techniques involve real-time modification of visual stimuli or system responses based on the observer's eye movements, enabling dynamic interaction between gaze data and the environment. Unlike non-contingent eye tracking, which passively records movements for post-hoc analysis, these methods use immediate feedback loops to alter displays or tasks, facilitating investigations into perceptual and cognitive processes. A prominent application is gaze-contingent display changes, such as foveated rendering in virtual reality, where high-resolution rendering is prioritized at the fixation point while peripheral areas receive lower detail to optimize computational efficiency. This mimics the human visual system's natural acuity gradient, reducing rendering load by up to 50% without perceptible quality loss in central vision. Seminal work demonstrated this through layered eccentricity-based mipmapping, achieving real-time performance on graphics hardware. In reading research, the gaze-contingent moving window paradigm limits visible text to a region around fixation, revealing that the perceptual span extends asymmetrically—about 14 characters to the right and 4 to the left in English readers—informing models of word recognition and attention allocation. Key paradigms include the double-step saccade task, where a target shifts location mid-movement, testing parallel programming of eye movements and revealing that direction is specified early while amplitude adjusts later, with latencies around 200-250 ms for the second step. Another is the invisible boundary technique, which triggers a preview-to-target change upon crossing an unseen line during reading, isolating parafoveal processing effects; for instance, valid previews reduce fixation times by 30-50 ms compared to invalid ones, supporting models of lexical processing. These paradigms originated in foundational studies from the 1970s and 1980s and are now standard for probing oculomotor control and linguistic processing. Implementation requires low-latency systems to ensure changes occur within 10 ms of detection, preventing artifacts such as display changes becoming visible after a saccade lands. Commercial trackers like the EyeLink series achieve this via hardware triggers for synchronized display updates, with end-to-end delays under 2 ms, enabling precise experiments in controlled lab settings. In attention research, these techniques study processing limits, such as through multiple object tracking where gaze-contingent masks reveal capacity constraints at 3-4 items, and via retro-cues that retroactively highlight probed locations post-encoding, boosting recall accuracy by 20-30% by prioritizing relevant representations in visual working memory. They also support cognitive modeling, simulating fixation durations and trajectories to test theories of reading and scene perception. Challenges include minimizing lag from tracking and rendering pipelines, addressed by saccade prediction algorithms that forecast landing positions using velocity profiles and main sequence relations, improving accuracy to within 1 degree and enabling proactive display shifts. Ethical concerns arise with deceptive stimuli, such as mid-saccade changes that manipulate perceptions without awareness; experiments have shown these can bias decisions by altering option visibility at critical moments, raising issues of consent and potential psychological influence in interactive systems.
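A schematic moving-window trial loop is sketched below using a hypothetical `tracker`/`display` interface standing in for a real tracker or stimulus-presentation API; the asymmetric 4-left/14-right character window follows the perceptual-span findings above, while the interface itself is invented for illustration.

```python
# Hypothetical gaze-contingent moving-window sketch: on each display frame,
# text outside a window around the currently fixated character is replaced
# with x's. `tracker` and `display` are placeholder objects, not a real API.

WINDOW_CHARS_LEFT, WINDOW_CHARS_RIGHT = 4, 14  # asymmetric span (English)

def masked_line(text, gaze_char_index):
    lo = max(0, gaze_char_index - WINDOW_CHARS_LEFT)
    hi = min(len(text), gaze_char_index + WINDOW_CHARS_RIGHT + 1)
    return ("x" * lo) + text[lo:hi] + ("x" * (len(text) - hi))

def run_trial(tracker, display, text, char_width_px, x0_px):
    while not display.response_received():
        gx, _ = tracker.latest_sample()             # newest gaze sample
        idx = int((gx - x0_px) / char_width_px)     # fixated character index
        display.draw_text(masked_line(text, idx))   # must finish < ~10 ms
        display.flip()
```

The entire sample-to-redraw path must complete within the latency budget discussed above, which is why such experiments typically run on high-rate trackers with hardware-synchronized displays.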

Applications

Commercial and Marketing Uses

Eye tracking plays a pivotal role in commercial and marketing applications by providing objective data on visual attention, enabling businesses to refine strategies for advertising, product placement, and user interfaces. In retail, it is widely used for shelf testing in physical and simulated store environments to analyze how shoppers allocate their attention among products. Heatmaps generated from eye tracking reveal areas of high visual interest on store shelves, guiding optimal product placement to maximize visibility and purchase likelihood. For instance, studies show that eye-level shelves attract more fixations, informing premium positioning decisions for packaged goods. In advertising, eye tracking facilitates comparative testing to evaluate which creative elements capture and sustain attention. By comparing fixation patterns between ad variants, marketers can identify designs that draw quicker initial glances and longer dwell times, improving campaign effectiveness. This approach has been integrated into online platforms for remote testing, allowing scalable assessment of ad performance across diverse audiences. For web and app usability, eye tracking informs navigation analysis by mapping user gaze paths, often revealing an F-shaped scanning pattern where attention prioritizes the top and left regions of content-heavy pages. Research from the Nielsen Norman Group demonstrates that users skim web content in this manner, with horizontal fixations across the top followed by a vertical scan down the left, influencing layout decisions to place key elements in high-attention zones. Key metrics in these commercial contexts include time to first fixation, which measures the latency until a specific element receives initial attention, indicating its salience, and engagement ratios that quantify the proportion of total viewing time devoted to areas of interest. These metrics integrate with web analytics tools to correlate visual data with behavioral outcomes, such as click-through rates. Notable case examples illustrate practical impacts. One major advertiser employed eye tracking in the 2010s for digital ad optimization, partnering with firms like Sticky to refine campaigns and reduce wasted spend on low-attention creatives by up to 25%. In automotive UX, Tobii's wearable eye trackers have been used to evaluate in-car interface designs, measuring driver glance patterns in simulators to enhance ergonomics and minimize distraction. The eye tracking industry supporting these applications was valued at approximately $1.194 billion in 2023 and is projected to reach $7.253 billion by 2030, driven by demand in consumer insights. Leading providers include Tobii and SR Research, which supply hardware and software tailored for marketing research.
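Both headline metrics can be computed directly from a fixation list, as in this sketch; the rectangular AOI representation and the fixation tuple layout are assumptions of the example, not a vendor format.

```python
# Time to first fixation (TTFF) and engagement ratio for one AOI, given a
# fixation list of (onset_s, offset_s, x_px, y_px) tuples. The AOI is an
# axis-aligned rectangle (x, y, width, height) for simplicity.

def in_aoi(fx, fy, aoi):
    x, y, w, h = aoi
    return x <= fx <= x + w and y <= fy <= y + h

def aoi_metrics(fixations, aoi, trial_onset):
    hits = [f for f in fixations if in_aoi(f[2], f[3], aoi)]
    ttff = hits[0][0] - trial_onset if hits else None  # first-hit latency
    total = sum(off - on for on, off, *_ in fixations) # total viewing time
    dwell = sum(off - on for on, off, *_ in hits)      # time inside the AOI
    engagement = dwell / total if total > 0 else 0.0
    return ttff, engagement
```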

Healthcare and Assistive Technology

Eye tracking plays a crucial role in medical diagnostics by identifying abnormalities in oculomotor behavior associated with neurological disorders. In Parkinson's disease, patients often exhibit hypometric saccades, characterized by reduced amplitude and velocity, which can serve as an early biomarker for the condition. These saccadic impairments, along with prolonged fixation durations and fewer fixations during visual scanning tasks, distinguish Parkinson's patients from healthy controls and aid in differentiating the disease from atypical parkinsonian syndromes. Similarly, in autism spectrum disorder (ASD), eye tracking reveals atypical gaze patterns, such as reduced attention to social stimuli like eyes and faces in dynamic scenes, which correlates with core social communication deficits. Studies using eye tracking during social interaction paradigms have shown that children with ASD allocate less gaze to socially relevant regions compared to typically developing peers, supporting its use in early screening and characterization of the disorder. In assistive technology, eye tracking enables communication for individuals with severe motor impairments, such as those with amyotrophic lateral sclerosis (ALS). Devices like the Tobii Dynavox TD I-Series use eye gaze to control speech-generating interfaces, allowing users to select letters or symbols on a screen for text-to-speech output. These systems facilitate functional communication, with users achieving typing speeds of up to 10 words per minute in optimized setups, though real-world performance varies based on calibration and environmental factors. For patients progressing to complete locked-in syndrome, where voluntary eye movements diminish, hybrid systems integrating eye tracking with brain-computer interfaces (BCIs) provide fallback communication by combining gaze data with neural signals to detect intent. Eye tracking also supports rehabilitation efforts by monitoring and guiding therapeutic interventions. In vision therapy for strabismus, eye tracking assesses eye alignment and coordination during exercises, helping clinicians track improvements in binocular fusion and reduce misalignment over sessions. For post-stroke recovery, it evaluates oculomotor deficits and cognitive-motor integration through dual-task paradigms, where gaze metrics like fixation stability and saccade accuracy predict motor function restoration and guide personalized therapy. Prototypes from the 2010s, such as gaze-driven power wheelchairs, demonstrated feasibility for mobility assistance by mapping eye movements to directional commands, enabling independent navigation for paralyzed users in controlled environments. Empirical evidence underscores the efficacy of eye tracking in these applications, with meta-analyses indicating high sensitivity (over 80%) in detecting cognitive impairments via oculomotor patterns, enhancing diagnostic accuracy when combined with clinical assessments. The U.S. Food and Drug Administration has cleared systems like RightEye for identifying visual tracking impairments and EarliPoint for aiding autism diagnosis in toddlers through gaze-based assessments.

Transportation and Safety

In automotive applications, eye tracking plays a crucial role in drowsiness detection by monitoring metrics such as blink rate and PERCLOS, which measures the percentage of time the eyes are closed by more than 80% over a specified period. PERCLOS has been validated as a reliable physiological indicator of driver fatigue, with thresholds above 25% signaling increased impairment and risk of accidents. These systems use infrared cameras to track eyelid closure in real-time, enabling non-intrusive alerts to prevent drowsy driving, which contributes to approximately 20% of fatal crashes. Integration of driver monitoring into advanced driver assistance systems (ADAS) has advanced in the 2020s, as seen in Tesla's Autopilot and Full Self-Driving features, which employ the vehicle's interior cabin camera to detect inattentiveness by analyzing eye position and head orientation. This vision-based system issues escalating warnings if the driver's eyes deviate from the road for extended periods, enhancing safety during semi-autonomous operation without relying solely on steering-wheel torque sensing. Safety metrics derived from eye tracking underscore its impact on crash prevention; for instance, off-road glances exceeding 2 seconds nearly quadruple the risk of a crash or near-crash event compared to shorter durations. Real-time systems like Volvo's Driver Alert Control, introduced in the late 2000s and refined through the 2010s, use camera-based eye and head movement analysis to detect lane deviations indicative of drowsiness, prompting visual and haptic alerts to restore attention. In driving scenarios involving hazard detection, eye movement patterns reveal vulnerabilities, particularly among elderly drivers who exhibit delayed fixations on pedestrians at crossings, often taking 200-500 milliseconds longer to shift gaze compared to younger adults. Studies using eye trackers in simulated urban environments show that older drivers allocate fewer fixations to potential hazards like crossing pedestrians, correlating with slower response times and heightened collision risks. Field trials of eye tracking-based drowsiness detection systems have demonstrated high reliability, with some achieving over 90% accuracy in identifying impaired states during on-road testing, supporting their integration into production vehicles. In aviation, eye tracking assesses pilot workload by analyzing scan patterns, such as during landing approaches where studies have identified optimal gaze distributions—typically 60-70% on the instrument panel and the remainder out the windscreen—to maintain situational awareness under high workload. These metrics reveal increased fixation durations and reduced saccade velocities as workload rises, informing training protocols to mitigate errors in critical phases like approach and landing. Head-up displays (HUDs) in modern cockpits incorporate gaze guidance by overlaying critical data in the pilot's forward field of view, with eye tracking ensuring attention remains forward; integrated systems adjust symbology based on real-time gaze to reduce head-down time and enhance hazard detection. Regulatory frameworks, such as those from the National Highway Traffic Safety Administration (NHTSA), increasingly incorporate eye tracking data into ADAS guidelines, recommending driver monitoring systems that track gaze to mitigate distraction and drowsiness, as outlined in reports on advanced impaired driving prevention technologies. These guidelines emphasize multimodal alerts triggered by gaze deviation to support safer deployment of Level 2+ automation.
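A PERCLOS computation following this definition might be sketched as below, assuming a per-frame eye-openness estimate between 0 and 1 from the camera pipeline; the 60-second window is an assumed choice, and the 0.2 openness cutoff (i.e., more than 80% closed) mirrors the convention described above.

```python
import numpy as np

# PERCLOS: fraction of time within a sliding window during which the eyes
# are more than 80% closed. eye_openness is a per-frame estimate in [0, 1].

def perclos(eye_openness, fps, window_s=60.0, closed_thresh=0.2):
    closed = np.asarray(eye_openness, float) < closed_thresh
    n = int(window_s * fps)
    if len(closed) < n:
        return closed.mean()                          # clip shorter than window
    kernel = np.ones(n) / n
    return np.convolve(closed, kernel, mode="valid")  # PERCLOS per window

# Values above ~0.25 (25%) in a window would trigger a fatigue alert under
# the threshold cited above.
```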

Entertainment and Research

In entertainment, eye tracking facilitates immersive interactions in gaming through gaze-based controls, allowing users to navigate menus and interfaces without traditional inputs. For instance, the Eye Tribe SDK, used in the 2010s, enabled developers to integrate eye gaze for menu selection and object interaction in early VR and PC games, enhancing accessibility and reducing reliance on hand controllers. Similarly, eye-tracked foveated rendering in Oculus headsets dynamically adjusts rendering resolution based on the user's gaze, concentrating high detail in the foveal region while lowering peripheral quality, which can reduce GPU load by approximately 50% in pixel-intensive applications. Eye tracking also informs media production by analyzing viewer attention patterns in films and television. Studies have shown that gaze data reveals how auditory cues, such as anxiety-inducing sounds, direct focus toward specific on-screen elements, aiding directors in optimizing pacing and visual composition. In commercial extensions, streaming platforms employ testing informed by eye tracking metrics to refine trailer designs, ensuring key plot hooks capture sustained attention and boost engagement rates. Additionally, in user research for interactive media like video games or educational apps, eye tracking evaluates map usability by measuring fixation durations on landmarks and routes, guiding improvements in color schemes and layout to minimize cognitive load. In research, eye tracking supports investigations across psychology, neuroscience, and behavioral economics. In psychological studies of reading comprehension, regressions—backward eye movements to revisit text—correlate with deeper processing and better retention, as evidenced by analyses showing increased regression rates during complex narrative parsing. Neuroscience applications leverage eye tracking to model visual attention, integrating gaze data with EEG for regression-based frameworks that quantify attentional shifts in dynamic scenes. In game theory, eye-tracking experiments reveal decision-making under risk, where gaze patterns on payoff matrices predict strategic sophistication, with longer fixations on high-risk options indicating deliberative choices. Emerging integrations in virtual and augmented reality headsets further blend entertainment and research. The HTC Vive Pro Eye, released in 2019, incorporates built-in eye tracking for gaze-contingent displays, enabling applications from foveated rendering to empirical attention studies. Research on VR cybersickness mitigation uses eye tracking to monitor pupil dilation and gaze patterns, informing adaptive algorithms that adjust field-of-view or motion cues to reduce symptoms by up to 30% in susceptible users. One example involves gaze pattern analysis in elderly participants during simulated walking navigation, where increased fixations on obstacles highlight attentional deficits, supporting the development of assistive overlays for real-world mobility. As of 2025, AI integration with eye tracking in VR is advancing adaptive experiences, such as personalizing content based on gaze patterns in immersive environments.

Ethical and Privacy Concerns

Privacy Issues

Eye tracking data is highly sensitive because gaze patterns can reveal intimate details about an individual's cognitive processes, such as mental workload, attention allocation, and decision-making strategies, with high accuracy in controlled studies. For instance, pupil dilation serves as a physiological indicator of emotional arousal or stress levels, while saccade trajectories and fixation durations can infer specific emotions like fear or happiness, as well as interests and intentions toward stimuli. Additionally, these metrics enable inferences about health conditions, including neurological disorders like Alzheimer's or mental health issues such as depression, through atypical gaze behaviors like prolonged fixations or reduced exploratory movements. Such revelations pose significant privacy risks, as non-transparent collection could expose users' psychological states without their awareness. Surveillance concerns arise prominently in public settings where webcam-based eye tracking occurs without explicit consent, such as in retail environments using overhead cameras to monitor shopper attention to products. This covert monitoring can profile consumer behaviors and preferences, potentially leading to discriminatory targeting or unauthorized tracking across stores. A notable example is the 2013 backlash against Google Glass, which sparked widespread fears of surreptitious recording in social spaces like restaurants or bars, where bystanders could be filmed and their images uploaded to cloud servers without consent. Critics highlighted the device's potential for constant, unnoticeable recording, exacerbating distrust due to Google's prior controversies and prompting bans in certain venues. Under regulations like the EU's General Data Protection Regulation (GDPR), eye tracking data qualifies as biometric information when used for unique identification, such as through distinctive iris patterns or gaze trajectories, falling under Article 9's special category of personal data that prohibits processing without explicit consent. Article 9(2)(a) mandates freely given, specific, informed, and unambiguous consent for such data handling, with additional requirements for data protection impact assessments due to the high risks involved. Additionally, the EU AI Act (as of 2025) regulates eye tracking in AI systems, prohibiting real-time remote biometric identification in public spaces except for law enforcement under strict conditions, and requiring risk assessments for high-risk applications. Anonymization presents unique challenges, as individual scanpath patterns—similar to fingerprints—remain identifiable even after aggregation, with high re-identification rates possible using machine learning on features like saccade dynamics. Data breaches underscore these vulnerabilities; for example, in 2024, researchers demonstrated a "GAZEploit" attack on Apple Vision Pro's eye tracking system, allowing hackers to infer typed passwords and PINs from gaze data with over 80% accuracy by analyzing eye movements during input. In contexts like automotive systems, gaze logs collected for driver monitoring can expose location-tied behavioral patterns, amplifying risks if devices are compromised, as continuous tracking in vehicles could reveal routines, distractions, or health indicators shared via connected networks. To mitigate these issues, on-device processing techniques perform gaze analysis locally on the user's hardware, preventing transmission to cloud servers and reducing interception risks.
Federated learning further enhances privacy by training models across distributed devices, sharing only aggregated parameter updates rather than individual gaze datasets, achieving comparable accuracy to centralized methods (e.g., angular errors below 8°) while thwarting re-identification attacks. These approaches align with GDPR principles by minimizing data exposure and enabling consent-based, privacy-by-design implementations in sensitive applications.
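The aggregation step at the heart of this approach can be sketched in a few lines of federated averaging; model internals are abstracted into weight arrays, and the scheme shown is a generic illustration rather than a specific published gaze-estimation system.

```python
import numpy as np

# Toy federated-averaging round for a gaze-estimation model: each device
# trains locally on its own gaze data and shares only a weight delta; the
# server aggregates deltas weighted by sample counts, so raw gaze data
# never leaves the devices.

def federated_round(global_weights, local_updates):
    """local_updates: list of (weight_delta_array, n_samples) per device."""
    total = sum(n for _, n in local_updates)
    avg_delta = sum(delta * (n / total) for delta, n in local_updates)
    return np.asarray(global_weights, float) + avg_delta
```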

Ethical Considerations in Use

Informed consent poses significant challenges in eye tracking applications, particularly in dynamic environments where tracking occurs unobtrusively, such as in mobile apps or vehicle systems. Unlike traditional research settings that mandate explicit, written consent, commercial deployments often rely on implied consent through terms of service, which may not adequately inform users about the collection of sensitive gaze data revealing preferences and cognitive states. For instance, passengers in shared vehicles equipped with eye trackers may remain unaware of monitoring, complicating efforts to obtain meaningful consent and raising ethical questions about autonomy. In research involving deceptive paradigms, such as simulated natural viewing to study attention without alerting participants to biases, post-experiment debriefing is essential to restore trust and mitigate potential psychological harm. Bias and equity issues further complicate ethical deployment, as many eye tracking datasets suffer from underrepresentation of diverse ethnic groups, leading to reduced accuracy for non-dominant populations. Optical eye trackers, which rely on infrared illumination to detect pupils and corneal reflections, often exhibit lower trackability and precision for individuals with darker irises or narrower eye apertures, with accuracy errors up to 0.91° for Asian participants compared to 0.57° for African and 0.61° for Caucasian participants. This measurement bias persists in biometric applications, where models like DeepEye show significantly higher equal error rates for non-Caucasian users relative to Caucasian users, underscoring the need for ethnically balanced training data to prevent discriminatory outcomes. Accessibility for disabled users is also ethically fraught, as calibration difficulties for those with motor impairments or visual conditions can exclude them from benefits in assistive technologies, perpetuating inequities unless inclusive design principles are prioritized. Societal impacts of widespread eye tracking include risks of manipulation through gaze-based profiling in advertising and politics, akin to earlier data-driven targeting scandals. In commercial settings, companies like Meta integrate eye tracking into augmented reality devices to monetize attention via hyper-personalized ads, potentially influencing consumer behavior at subconscious levels without users' full awareness. Similarly, in politics, eye tracking enables microtargeting of messages based on gaze patterns, heightening concerns over voter manipulation as seen in past controversies. Dual-use applications in the military, such as monitoring drone operators' workload via gaze tracking, amplify these risks by blending health monitoring with surveillance, where breaches could expose operational vulnerabilities. Research ethics demand stringent oversight, particularly for vulnerable populations like children in educational or developmental studies, where institutional review board (IRB) guidelines emphasize tailored consent processes involving parents and assent from the child to minimize distress from head-mounted devices. In clinical trials, protocols must avoid harm by ensuring non-invasive setups and monitoring for fatigue, with IRBs requiring justification for any exposure to potentially revealing stimuli that could stigmatize participants. These safeguards align with broader principles to protect autonomy and beneficence in eye tracking research. As of 2025, ongoing debates advocate for international standards on gaze data rights, drawing parallels to moratoriums on unregulated facial recognition, to address function creep and ensure equitable governance across borders. Scholars call for global frameworks mandating transparency in use and opt-outs, similar to GDPR's special category protections for biometric data, to prevent societal harms from unchecked proliferation.
