
Pointing device

A pointing device is a non-keyboard input device that enables a user to control the position of a pointer or cursor on a computer display, facilitating the selection, navigation, and manipulation of on-screen elements in graphical user interfaces. These devices translate physical movements or gestures into digital coordinates, supporting the precise spatial input essential for human-computer interaction. The evolution of pointing devices began in the mid-20th century, with early innovations like the light gun developed by Robert Everett in 1950 at MIT's Lincoln Laboratory for diagnostic purposes on the Whirlwind computer, marking one of the first position-sensing inputs. The trackball followed in 1952, invented by Tom Cranston, Fred Longstaff, and Kenyon Taylor at Ferranti Canada for the DATAR system, providing an inverted mouse-like control for radar displays. A pivotal advancement came in 1964 when Douglas Engelbart and William English at Stanford Research Institute created the first mouse, a wooden prototype with wheels that tracked movement on a desk surface, demonstrated publicly in 1968 and patented in 1970. By the 1980s, commercial versions proliferated, including the optical mouse introduced in 1982 by Steven Kirsch at Mouse Systems, which used LED light for tracking without mechanical parts. Common types of pointing devices include the mouse, a handheld device with buttons for clicking and dragging; the trackball, where a ball is rotated by fingers while the body remains stationary; the touchpad, a flat surface sensitive to finger gestures common in laptops; the joystick, used for directional control in games and simulations; the graphics tablet, which pairs a stylus with a drawing surface for precise input in design applications; and touchscreens, which detect direct finger or stylus contact on the display itself. Pointing devices can be classified as relative (tracking movement deltas, like mice) or absolute (mapping position directly to screen coordinates, like touchscreens).
Modern developments incorporate multi-touch capabilities, as pioneered by Nimish Mehta's 1982 system at the University of Toronto, enabling gestures like pinching and swiping on devices such as smartphones and tablets. These devices have become integral to modern computing, enhancing productivity in office, gaming, and creative tasks, while standards like ISO 9241-9 guide their ergonomic evaluation for performance metrics such as speed and accuracy. Ongoing research focuses on wearable and gesture-based alternatives, such as finger-mounted or head-tracking systems, to address accessibility and mobility needs.

Introduction

Definition and Purpose

A pointing device is an input interface, distinct from keyboards, that enables users to control the position of a cursor or pointer on a graphical user interface (GUI) for tasks such as navigation, selection, and object manipulation. These devices translate physical user movements into corresponding digital coordinates, supporting both continuous spatial control in two or three dimensions and discrete actions via integrated buttons, sensors, or pressure detection. Key characteristics include high precision for pointer stability (typically within 0.25–1.3 mm), compatibility with drag-and-drop operations, and adaptability to various interaction paradigms like absolute or relative positioning. The primary purpose of pointing devices is to facilitate intuitive and efficient human-computer interaction (HCI) by allowing precise spatial input that mimics human gestures, such as pointing or dragging, thereby reducing cognitive load compared to text-based commands. In HCI, they serve as essential bridges between physical actions and virtual environments, enabling rapid, reversible operations with immediate visual feedback to support tasks like clicking on icons, scrolling through content, or gesturing in immersive systems. This translation of analog movements to digital signals enhances usability across novice and expert users, promoting learnability and error recovery in diverse applications from desktop computing to mobile interfaces. Pointing devices emerged as critical components in the shift from command-line interfaces to GUI-based systems, a transition accelerated by innovations at Xerox PARC in the late 1970s and 1980s that popularized point-and-click paradigms with icons, windows, and menus. This evolution made computers more approachable by replacing memorized syntax with visual, spatial controls, fundamentally relying on pointing devices to democratize computing beyond specialized users.

Historical Development

The origins of pointing devices trace back to the mid-20th century, with early concepts emerging from military applications. In 1946, British engineer Ralph Benjamin developed the first trackball at the Royal Navy Scientific Service as part of a post-World War II radar plotting system, allowing operators to control cursor movement on displays without direct physical contact. This stationary device marked an initial step toward indirect input mechanisms, though it remained classified and did not influence later civilian innovations. A pivotal advancement came in 1964 when Douglas Engelbart and his team at Stanford Research Institute invented the computer mouse, a handheld device with a wooden shell and two perpendicular wheels for tracking X-Y movement on a surface. Engelbart's prototype used potentiometers to translate motion into electrical signals, enabling precise cursor control on early computer interfaces. This invention was publicly demonstrated in 1968 during the "Mother of All Demos," showcasing its potential for human-computer interaction and inspiring future graphical user interfaces. Commercialization accelerated in the 1970s and 1980s, integrating pointing devices with emerging GUIs. The Xerox Alto computer, released in 1973 by Xerox PARC, was the first to pair a three-button ball mouse with a display and windows-based interface, influencing modern desktop paradigms. Concurrently, touchscreen technology advanced: E.A. Johnson described the first finger-driven capacitive touchscreen in 1965 at the UK's Royal Radar Establishment, using a grid of capacitors to detect touch positions. By 1977, CERN had implemented capacitive touchscreens for control room interfaces, equipping the Super Proton Synchrotron control room with panels that responded to finger proximity without physical pressure. Apple's Lisa (1983) and Macintosh (1984) then popularized the mouse for consumer use, bundling it with intuitive GUIs that drove widespread adoption in personal computing. The 1990s saw expansion into portable and specialized devices.
IBM introduced the TrackPoint pointing stick in 1992 on its ThinkPad 700 series laptops, a pressure-sensitive joystick embedded in the keyboard for thumb-operated cursor control. In 1994, Cirque Corporation's GlidePoint touchpad, licensed to Apple, debuted as the first widely available capacitive trackpad for notebooks, enabling finger gestures on a flat surface. Graphics tablets also matured, with Wacom releasing its first cordless, battery-free stylus tablet (WT-460M) in 1984, evolving through the 1980s into pressure-sensitive digitizers for professional design and illustration. Advancements in the 1990s and 2000s emphasized connectivity, wireless operation, and motion sensing. Logitech launched its first RF wireless mouse, the MouseMan Cordless, in 1991, and the industry moved to optical models by 1999, eliminating mechanical balls in favor of LED-based surface tracking. Nintendo's Wii Remote in 2006 introduced motion controllers with accelerometers and infrared sensors for gesture-based pointing in gaming. Apple's iPhone in 2007 popularized capacitive multi-touch screens, supporting pinch-to-zoom and multi-finger gestures on mobile devices. Throughout the 2010s, accelerometers and gyroscopes were integrated into pointing devices, enhancing tilt detection in mice and enabling spatial tracking in wearables and controllers. By the 2020s, trends have focused on ergonomics, performance, and integration. Wireless designs dominate, with ergonomic mice like the Logitech MX Master 3S (2022) reducing wrist strain through sculpted, natural hand positioning. High-DPI gaming mice, such as Razer's DeathAdder V3 (2022) offering up to 30,000 DPI, provide ultra-precise tracking for competitive gaming.

Classification Frameworks

Buxton's Taxonomy

Buxton's taxonomy provides a foundational framework for classifying computer input devices, particularly those used for pointing and selection tasks, by considering their physical properties and how they map onto human capabilities. Developed in the early 1980s, it categorizes continuous devices along two primary axes: the number of spatial dimensions they sense (ranging from 1D to 3D) and the control paradigm they employ (such as position, velocity, rate, or force/pressure), while distinguishing these from devices for non-spatial inputs like buttons. This approach emphasizes the transduction of human motor actions into digital signals, enabling designers to evaluate device suitability for specific interactions like cursor positioning or object selection. In terms of spatial dimensions for continuous devices, 1D devices manage linear controls, such as sliders or rotary potentiometers, for adjusting values along one axis. 2D devices, common for pointing, support planar movements, exemplified by tablets, mice, and touchscreens. 3D devices extend to volumetric tracking, like wands or motion trackers, for full spatial navigation. The control paradigms further refine this: position control uses absolute mapping from the device's surface to screen coordinates (e.g., a graphics tablet); velocity control interprets relative speed and direction (e.g., a mouse); rate control integrates deflection into a proportional response (e.g., a joystick in rate mode); and force/pressure control relies on inputs like applied force without displacement (e.g., pressure-sensitive isometric joysticks). The taxonomy is inherently human-centered, drawing on the motor and sensory systems of the body (such as fine hand movements for precise pointing versus whole-body gestures for immersion) and incorporating feedback modalities like tactile confirmation or visual cursor display. For instance, the mouse exemplifies 2D velocity control, where hand motion translates to relative cursor movement on screen; a graphics tablet represents 2D position control via direct absolute mapping; and a joystick often functions as 2D rate control, where deflection sets a sustained rate.
This grounding in human physiology helps predict device usability and task efficiency. A key strength of Buxton's framework lies in its ability to facilitate device comparisons and metaphors, such as equating a tablet to a mouse in terms of 2D continuous input, thereby guiding predictions and choices based on input dimensionality. However, it has limitations, particularly in addressing post-2000s developments like multimodal gesture-based inputs or hybrid discrete-continuous interactions, which extend beyond its original focus on manual, continuous controls. Originating from Buxton's 1983 paper "Lexical and Pragmatic Considerations of Input Structures," the taxonomy has profoundly influenced human-computer interaction (HCI) standards, including device-independent graphics systems like GKS and ongoing prototyping practices in interface design. This classification schema applies directly to pointing tasks by delineating how devices support target acquisition and manipulation phases, complementing Buxton's later three-state model of graphical input.

Buxton's Three-State Model

Buxton's Three-State Model provides a framework for understanding graphical input tasks by dividing them into three distinct semantic states that describe the interaction between users and pointing devices. Introduced in a collaboration with Hill and Rowley, and later refined in Buxton's 1990 paper, the model builds on his earlier taxonomy of input devices to emphasize the syntax of user-device interactions, focusing on how devices support task phases rather than just hardware attributes. The model categorizes input into State 0 (Out of Range), State 1 (Tracking), and State 2 (Dragging). In State 0, the device has no effect on the system, such as when a stylus is lifted off a tablet surface, allowing repositioning without unintended actions. State 1 involves acquiring or adjusting a position, where the cursor or pointer tracks the device's movement continuously, as in moving a mouse to hover over a target without activating it. State 2 enables ongoing manipulation or "dragging," where an action persists based on movement, typically initiated by a button press, such as selecting and relocating an icon on screen. Task execution flows through transitions between these states to complete pointing actions. For instance, a drag-and-drop selection begins in State 0 (device inactive), shifts to State 1 upon contact or movement (position acquisition via pointer tracking), and enters State 2 with a button-down event (initiating dragging or continuous adjustment); the task concludes by returning to State 1 or 0 with a button-up. This state-based progression highlights how pointing devices must handle both discrete events (like button presses) and continuous control to support fluid interactions. The model has significant implications for the design of pointing devices, underscoring the need for mechanisms that facilitate seamless state switches, such as buttons on a mouse to toggle between tracking and dragging states.
By mapping device capabilities to these states, designers can evaluate suitability for tasks like dragging and selection, ensuring devices provide the necessary semantic levels without ambiguity. For example, devices lacking a clear State 0 (e.g., always-active trackers) may introduce errors in repositioning. A key limitation of the model is its assumption of button-based or discrete input for state transitions, which aligns well with traditional devices like mice but applies less effectively to modern touch or gesture interfaces lacking physical buttons, where contact or proximity might substitute for state changes. Additionally, it struggles to fully accommodate pressure-sensitive inputs, such as varying stylus force for line thickness, which introduce additional semantic dimensions beyond the three states.
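The three states and their legal transitions can be sketched as a small state machine. The event names ("touch", "lift", "button_down", "button_up") and the transition table below are illustrative, modeling a generic stylus- or mouse-style device rather than any particular driver API:

```python
from enum import Enum

class State(Enum):
    OUT_OF_RANGE = 0   # State 0: device input has no effect
    TRACKING = 1       # State 1: cursor follows device movement
    DRAGGING = 2       # State 2: manipulation persists with movement

# Legal transitions, keyed by (current state, event).
TRANSITIONS = {
    (State.OUT_OF_RANGE, "touch"): State.TRACKING,
    (State.TRACKING, "lift"): State.OUT_OF_RANGE,
    (State.TRACKING, "button_down"): State.DRAGGING,
    (State.DRAGGING, "button_up"): State.TRACKING,
}

def step(state, event):
    """Return the next state; irrelevant events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

A complete drag-and-drop pass walks OUT_OF_RANGE, then TRACKING on contact, DRAGGING on button-down, and back to TRACKING on button-up; an event with no defined transition (such as a button press while out of range) leaves the state unchanged, mirroring the model's claim that State 0 input has no effect.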

Design Principles and Performance Metrics

Fitts' Law

Fitts' Law is a predictive model in human motor control that describes the time required to move to and select a target area, such as with a cursor on a graphical interface. The core concept posits that the movement time (MT) to acquire a target increases logarithmically with the distance (D) to the target and decreases with the target's width (W), capturing the inherent trade-off between speed and accuracy in aimed movements. This relationship is formalized through the Index of Difficulty (ID), which quantifies the task's complexity based on these spatial factors, enabling comparisons across different pointing scenarios. The law originated from psychological research on human motor behavior, developed by Paul Fitts in 1954 through experiments examining rapid aimed movements, such as reciprocal tapping between two plates. Fitts drew on information theory to frame motor tasks as communication channels, where the precision required for smaller or more distant targets demands greater processing capacity. In the context of human-computer interaction (HCI), the model was adapted by Stuart Card, William English, and Betty Burr in 1978, who applied it to evaluate pointing performance with devices like the mouse during text selection tasks on early displays. Their work established Fitts' Law as a cornerstone for assessing pointing efficiency in graphical user interfaces. Empirically, the law is supported by a robust body of experiments demonstrating a linear relationship between the Index of Difficulty and movement time, with consistent results across various motor tasks. In Fitts' original studies, participants performed repeated tapping actions under controlled conditions, revealing that MT rises predictably as ID increases due to heightened demands on visuomotor coordination. Subsequent HCI research has validated this linearity in pointing paradigms, confirming the model's applicability to interactive systems where users acquire on-screen targets.
Variants of the law account for differences in task structure and dimensionality. The original formulation focused on reciprocal tapping tasks, involving back-and-forth movements between targets to simulate continuous aiming. In contrast, one-shot pointing, common in HCI, involves a single acquisition movement, often yielding slightly different performance parameters but maintaining the core ID-MT relationship. Extensions to two-dimensional (2D) angular targets adjust the model to incorporate directional variability in planar pointing, such as cursor movements on displays, while preserving the law's predictive power. Key implications of Fitts' Law extend to interface design, where it informs strategies to optimize pointing efficiency by modulating target properties to balance rapid acquisition with error minimization. For instance, in dense user interfaces, enlarging effective target widths can reduce movement times and error rates without expanding physical space. The model also highlights how input mappings, such as control-display gain, influence perceived target width and overall performance. By prioritizing such principles, designers can enhance usability in pointing-based interactions.

Mathematical Formulation of Fitts' Law

Fitts' Law is mathematically expressed in its standard form as the linear relationship between movement time (MT) and the index of difficulty (ID) of a pointing task: MT = a + b \cdot ID, where MT is the average time to acquire the target in seconds, a represents the intercept or fixed time cost associated with initiating and completing the movement (typically around 100–200 ms), and b is the slope indicating the change in time per unit of difficulty (in seconds per bit). The index of difficulty is defined as ID = \log_2 \left( \frac{D}{W} + 1 \right) in bits, with D denoting the distance from the starting position to the center of the target and W the width of the target along the axis of approach. The additive term +1 ensures that ID remains non-negative even when D < W, addressing cases where the target is closer than its width, such as in homing behaviors. This formulation, known as the Shannon variant, aligns the model with information theory by incorporating the +1 term analogous to Shannon's channel capacity equation, C = B \log_2 (S/N + 1), where movement amplitude substitutes for signal power and target width for noise. An alternative variation omits the +1, yielding ID = \log_2 (D/W), which simplifies alignment with pure entropy measures but can produce negative values for small D/W ratios and is less commonly used in practice. The original formulation by Fitts used ID = \log_2 (2D/W), incorporating a factor of 2 to represent the full extent of discriminable alternatives across the movement amplitude, equivalent to the number of target-width units spanning twice the distance (forth and back in reciprocal tasks). For two-dimensional tasks involving angular targets, such as circular layouts on screens, the index can retain the form ID = \log_2 (2D/W) while substituting for W the effective width measured perpendicular to the radial approach, to better predict performance in non-linear pointing.
Parameters a and b are empirically estimated via linear regression on experimental data across varying D and W. The throughput, defined as the information processing rate TP = 1/b in bits per second, quantifies device performance; for computer mice, typical values range from 4 to 5 bits/s, reflecting efficient motor control in skilled users. To account for inaccuracy in trials, the nominal target width W is replaced by the effective width W_e = 4.133 \cdot SD, where SD is the standard deviation of the endpoint distribution along the approach axis, ensuring the model fits observed error rates around 4% without biasing predictions toward perfect accuracy. The derivation of Fitts' Law stems from information theory applied to the human motor system, extending the Hick-Hyman law for choice reaction time, RT = a + b \log_2 (N + 1), where N is the number of alternatives, to continuous aiming by treating movement as selection among discriminable positions. The logarithmic form arises from Weber's law, which posits that the just-noticeable difference in stimulus intensity is proportional to the stimulus itself (\Delta I / I = k), implying a logarithmic scale for perceptual-motor resolution; Fitts analogized this to the number of bits required to specify a point within a tolerance W over distance D, yielding approximately \log_2 (D/W).
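As a minimal sketch, the Shannon formulation above can be computed directly. The intercept a = 0.1 s and slope b = 0.2 s/bit below are illustrative placeholders, not empirically fitted values:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def movement_time(distance, width, a=0.1, b=0.2):
    """Predicted MT = a + b * ID, with a in seconds and b in s/bit.
    a and b are illustrative; real values come from linear regression."""
    return a + b * index_of_difficulty(distance, width)

def effective_width(endpoint_sd):
    """W_e = 4.133 * SD of endpoints along the approach axis."""
    return 4.133 * endpoint_sd
```

For D = 512 px and W = 32 px, ID = log2(17) ≈ 4.09 bits, so the predicted MT is roughly 0.92 s under these placeholder parameters; note that ID stays non-negative even when D < W because of the +1 term.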

Applications of Fitts' Law in UI Design

Fitts' Law guides user interface designers in optimizing target sizes to reduce the index of difficulty (ID) for pointing tasks, thereby decreasing movement times and error rates. The ISO 9241-9 standard, which specifies ergonomic requirements for non-keyboard input devices through Fitts' Law-based evaluations, uses multi-directional pointing tasks to assess performance metrics such as throughput. For touch-based interfaces, the Windows User Experience Interaction Guidelines advocate a minimum touch target size of 9 mm (or 40 effective pixels), as smaller dimensions elevate ID, prolonging acquisition times and increasing inaccuracies according to Fitts' Law principles. Apple's Human Interface Guidelines for iOS extend this to 44 x 44 points on mobile screens, a size empirically derived to minimize errors while fitting grid layouts, effectively lowering ID for thumb-based interactions. Similarly, touchscreen ergonomics research recommends at least 10 mm x 10 mm for interactive elements to accommodate average adult finger pads (9-10 mm wide), preventing "fat-finger" errors and supporting faster pointing as predicted by Fitts' Law. Layout strategies informed by Fitts' Law emphasize positioning frequently accessed elements to exploit screen boundaries, treating edges and corners as having "infinite width" to drastically reduce effective distance (D) and ID. For instance, placing menus or buttons along screen edges allows users to "pin" the cursor against the boundary, enabling quicker acquisition without overshooting, a technique validated in human-computer interaction studies. macOS implements this through "hot corners," where moving the cursor to a screen corner triggers actions like showing the desktop; this leverages the infinite extent of corners to minimize D to near zero, optimizing for rapid access in line with Fitts' Law.
Hierarchical menus further apply the law by structuring navigation to keep cumulative distances short (positioning submenus close to parent items and using progressive disclosure to avoid long traversals), thus reducing overall task time across multiple selections. Evaluation methods using Fitts' Law enable designers to predict and validate efficiency by calculating expected task completion times from ID values, allowing pre-prototype assessments of pointing performance. Throughput metrics, derived from ISO 9241-9 multi-directional tasks, quantify bits per second to compare interface variants objectively. Usability testing incorporates Fitts' Law by measuring actual movement times and error rates for design alternatives, such as varying pointer shapes (e.g., larger cursors for precision) or acceleration curves, to iteratively refine layouts for higher throughput. Device-specific adaptations of Fitts' Law account for input modality differences to tailor target widths (W) and layouts accordingly. On touchscreens, finger occlusion (where the selecting finger obscures the target) necessitates larger W (e.g., 10-14 mm) and offset feedback like target expansion to mitigate visibility issues and maintain low ID, unlike mice, which support sub-pixel precision without occlusion. Mice enable finer control for smaller targets due to higher accuracy, but touch interfaces require amplified W to compensate for finger imprecision, as shown in comparative studies where touch throughput lags mice by 20-30% on fine tasks. For accessibility, particularly for users with motor impairments, enlarging W beyond standard minima (e.g., to 16-20 mm) and reducing D through edge placement significantly improves pointing success rates, reducing errors by up to threefold compared to standard sizes. Case studies illustrate Fitts' Law's impact on real-world redesigns.
The Windows Start menu evolution, from Windows 95's bottom-left placement (high ID due to the fixed distance from the taskbar) to Windows 8's full-screen Start screen, incorporated edge-aligned tiles to exploit infinite widths, cutting average selection times by optimizing for common paths. Mobile app icon grids, as in iOS home screens, use uniform 60 x 60 point targets spaced to minimize D within thumb reach zones, reducing ID for frequent app launches and improving throughput by 15-20% over denser layouts per usability tests. In virtual reality, ray-casting pointing techniques apply Fitts' Law by scaling virtual target W relative to distance in 3D space; a study on gaze-assisted ray-casting showed 25% faster selections than unassisted methods by effectively lowering ID through hybrid input.
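The design trade-offs above can be quantified with a small comparison. The a and b coefficients are illustrative, and the "edge" case simply models a boundary-pinned target as having a very large effective width:

```python
import math

def predicted_mt(distance_mm, width_mm, a=0.23, b=0.166):
    """Predicted movement time (s) from the Shannon form of Fitts' Law.
    The a and b coefficients are illustrative, not fitted values."""
    return a + b * math.log2(distance_mm / width_mm + 1)

# Enlarging a touch target from 7 mm to 10 mm at a 60 mm reach
# lowers ID and thus the predicted acquisition time.
mt_small = predicted_mt(60, 7)
mt_large = predicted_mt(60, 10)

# An edge-pinned target behaves as if its width were effectively
# unbounded, driving ID (and thus MT) toward the intercept a.
mt_edge = predicted_mt(60, 1e9)
```

Under these placeholder coefficients the 10 mm target is predicted to be acquired roughly 10% faster than the 7 mm one, while the edge-pinned target's predicted time collapses to nearly the fixed intercept, which is why edge and corner placement is so effective.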

Control-Display Gain

Control-display gain (CD gain) is defined as the ratio of cursor displacement on the screen to the physical movement of the pointing device, often expressed as a unitless multiplier where a value of 1.0 represents a 1:1 mapping. High CD gain values amplify cursor movement relative to device input, enabling faster navigation across large screens with minimal physical effort, while low values provide finer control for precise tasks. CD gain is typically measured in units such as pixels per millimeter of device movement or degrees of cursor rotation per degree of device rotation, and it is adjustable through software settings like the pointer speed slider in Windows, which scales the multiplier from approximately 0.25 to 20. This adjustment allows users to tune the gain dynamically without hardware changes, though extreme settings can introduce quantization errors if the device's sensor resolution is insufficient. In terms of performance, high CD gain reduces the physical distance required for cursor travel, speeding up acquisition for coarse movements, but it amplifies hand tremors and small errors, effectively narrowing the usable target width in pointing models like Fitts' law. Conversely, low CD gain enhances precision for fine adjustments by minimizing overshoot, though it increases the need for clutching (repositioning the device mid-movement) and raises maximum limb speeds, leading to 10-14% slower overall performance in pointing tasks. Empirical studies show that performance degrades markedly below a gain of 4, with error rates rising due to these biomechanical limits. Optimization of CD gain often involves adaptive techniques, such as velocity-based pointer acceleration (also called pointer ballistics), where gain increases with device speed to balance speed and precision: slow movements use low gain for accuracy, while fast movements employ higher gain for rapid traversal. In macOS, this is implemented through a non-linear curve that slows the cursor for deliberate motions and accelerates it for rapid ones, improving acquisition times by up to 5.6% for small targets compared to constant gain.
Empirical tuning follows standards like ISO 9241-9, which evaluates devices through controlled pointing tasks to assess gain's impact on throughput and comfort, recommending configurations that minimize fatigue and error without specifying fixed values. The concept of CD gain emerged in human-computer interaction research in the mid-20th century but gained prominence in the 1980s with studies on input devices, such as Buck's 1980 experiment using joysticks, which demonstrated how varying gain affects motor performance in one-dimensional pointing relative to target width and amplitude. Early HCI work in the 1980s and 1990s, including the pointer ballistics implementations in Windows, further refined adjustable gain for graphical interfaces to optimize desktop productivity. In modern contexts, gaming mice exemplify high CD tunability, with 2025 models offering DPI settings up to 44,000, allowing users to select gains from 400 DPI for precision aiming to over 40,000 for rapid sweeps in competitive play.
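Velocity-based pointer acceleration can be sketched as a transfer function from device speed to gain. The piecewise-linear curve and all thresholds below are illustrative choices, not the actual Windows or macOS ballistics curves:

```python
def cd_gain(speed_mm_s, g_min=1.0, g_max=8.0, v_low=20.0, v_high=300.0):
    """Speed-dependent CD gain: low gain for slow, precise motion,
    ramping linearly to high gain for fast motion. All parameters
    are illustrative, not values from any real driver."""
    if speed_mm_s <= v_low:
        return g_min
    if speed_mm_s >= v_high:
        return g_max
    t = (speed_mm_s - v_low) / (v_high - v_low)
    return g_min + t * (g_max - g_min)

def cursor_delta_px(device_delta_mm, dt_s, px_per_mm=4.0):
    """Scale one device displacement sample by the speed-dependent gain."""
    speed = abs(device_delta_mm) / dt_s
    return device_delta_mm * cd_gain(speed) * px_per_mm
```

The shape of the curve embodies the trade-off described above: below v_low the mapping stays at the precision-friendly minimum gain, while fast sweeps saturate at the maximum gain for rapid traversal.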

Categories of Pointing Devices

Motion-Tracking Pointing Devices

Motion-tracking pointing devices determine cursor position through relative positioning, integrating device velocity over time to compute incremental changes in position. This approach contrasts with absolute positioning by focusing on velocity-based control, where the rate and direction of movement dictate cursor displacement rather than direct spatial mapping. To enhance usability, these devices often incorporate acceleration curves that introduce non-linear responses, allowing slower movements for precision and faster ones for rapid traversal. The mouse exemplifies this category, initially developed by Douglas Engelbart in 1964 as a mechanical device that used two perpendicular wheels rolling on a surface to detect X-Y motion. Subsequent innovations include optical mice, commercialized prominently in 1999 with models like Microsoft's IntelliMouse Explorer using an LED and camera for surface imaging, and laser mice that employ a laser diode for superior tracking on diverse surfaces with higher DPI ratings up to 16,000 or more. These advancements enable greater precision in cursor control, though traditional mice require sufficient desk space for operation, limiting portability in constrained environments. By 2025, trends emphasize wireless connectivity and ultra-high polling rates, such as 8 kHz in models like the Razer Viper V3 Pro, minimizing latency for competitive applications. Trackballs function as inverted mice, with the user manipulating a ball using thumb or fingers to generate relative motion signals. Invented by Ralph Benjamin in 1946 for radar plotting, trackballs remain stationary on the desk, reducing the need for arm movement and offering ergonomic benefits by alleviating repetitive strain injury (RSI) through minimized wrist extension. Joysticks provide analog velocity control, where deflection from center translates to cursor speed and direction, originating in aircraft simulators. By the 1970s, they became staples in arcades, supporting planar or tilt inputs for immersive control in simulations and gaming environments.
The Wii Remote, released by Nintendo in 2006, uses embedded accelerometers and infrared sensing (with gyroscopes added by the later MotionPlus attachment) to track device orientation and motion for gesture-based pointing, enabling intuitive air interactions in gaming. This design fosters immersive experiences by mimicking natural hand movements without surface contact.
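The relative-positioning principle shared by these devices reduces to accumulating motion deltas into an absolute cursor position. This minimal sketch assumes pre-scaled pixel deltas and a fixed screen size, and clamps at the screen bounds:

```python
def integrate_deltas(deltas, screen_w=1920, screen_h=1080, start=(960, 540)):
    """Relative positioning: accumulate (dx, dy) device deltas into an
    absolute cursor position, clamped to the screen bounds. Deltas are
    assumed to be already scaled to pixels (post CD-gain)."""
    x, y = start
    for dx, dy in deltas:
        x = min(max(x + dx, 0), screen_w - 1)
        y = min(max(y + dy, 0), screen_h - 1)
    return x, y
```

Clamping at the bounds is also what gives screen edges their "infinite width" in Fitts'-Law terms: any overshoot in that direction is absorbed, so the cursor cannot travel past an edge target.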

Position-Tracking Pointing Devices

Position-tracking pointing devices operate on the principle of absolute mapping, where the cursor position on the screen corresponds directly to the physical position of the stylus or finger on the tracking surface, providing a one-to-one relationship without requiring integration of motion over time. This direct positional fidelity eliminates velocity buildup errors common in relative devices but is inherently limited by the physical size of the tracking area, which constrains the range of movement. These devices align with the absolute position control category in Buxton's taxonomy of input devices. Graphics tablets exemplify this category through electromagnetic or acoustic sensing technologies that detect the absolute position of a stylus on the tablet surface. The conceptual origins trace back to early devices like the telautograph, invented in 1888 for transmitting handwriting electrically, though modern graphics tablets emerged in the mid-20th century with digitizing surfaces for precise input. Wacom, founded in 1983, pioneered commercial electromagnetic resonance (EMR) tablets in 1987, enabling battery-free stylus operation via signals from the tablet itself. Many graphics tablets support pressure sensitivity, allowing variation in line thickness based on applied force, which enhances precision for artists and designers. Advantages include high accuracy for detailed work, while drawbacks involve the need for substantial desk space due to larger tablet sizes. Styluses used with position-tracking devices come in active variants, which are battery-powered and emit signals for detection, or passive types that rely on the tablet's electromagnetic field without internal power. These styluses pair with graphics tablets or compatible screens to provide absolute positioning, often supporting advanced features like tilt detection for natural brush strokes and hover functionality for pre-contact previewing. By 2025, such capabilities are standard in high-end models, improving usability in creative applications.
Touchpads employ capacitive sensing to track the absolute position of one or more fingers on a flat surface, typically integrated into laptops for portable input. Synaptics commercialized capacitive touchpads in the early 1990s, initially for single-touch cursor control, with multi-touch extensions enabling gestures like pinching to zoom or swiping to scroll. Although the output to the cursor may incorporate relative adjustments for usability, the core input mechanism captures absolute positions on the pad. Their compact design makes them ideal for mobile computing, reducing reliance on external mice. Touchscreens provide direct absolute 2D positioning by detecting contact points on the display itself, allowing users to interact seamlessly with on-screen elements. Resistive touchscreens, developed in the 1970s, rely on pressure to complete a circuit between flexible layers, accommodating styluses or gloved fingers but requiring more force. Capacitive touchscreens, which sense electrical conductivity from bare fingers, gained prominence with the 2007 iPhone launch, supporting multi-touch for intuitive gestures. Benefits include immersive direct manipulation, though drawbacks encompass visual occlusion by the hand and potential user fatigue from prolonged arm extension. Effective use of position-tracking devices necessitates calibration to map the physical bounds of the input surface to the screen's coordinate space, ensuring accurate correspondence between device positions and cursor coordinates. This process compensates for variations in device size and screen dimensions, maintaining precision across different hardware configurations.
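The calibration step reduces to a direct linear mapping from surface coordinates to screen coordinates; the tablet and screen dimensions below are illustrative:

```python
def tablet_to_screen(x_mm, y_mm, tablet_w_mm, tablet_h_mm,
                     screen_w_px, screen_h_px):
    """Absolute mapping: scale a point on the input surface directly to
    screen coordinates, with no integration of motion over time."""
    return (x_mm / tablet_w_mm * screen_w_px,
            y_mm / tablet_h_mm * screen_h_px)
```

Because the mapping is purely proportional, the center of the surface always lands at the center of the screen, and a mismatch between the surface's and the screen's aspect ratios will stretch input in one axis, which is why many tablet drivers offer an aspect-preserving "force proportions" option.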

Pressure-Tracking Pointing Devices

Pressure-tracking pointing devices generate input signals based on the magnitude and direction of applied force, typically producing a velocity output proportional to the input without requiring any physical displacement of the device. This principle allows for compact designs that integrate seamlessly into limited spaces, such as keyboards, since the device remains stationary while sensors detect deformation from user force. In Buxton's taxonomy of input devices, these fall under rate control mechanisms, where applied force modulates cursor speed rather than absolute position. A prominent example is the isometric joystick, which employs strain-gauge sensors to measure subtle deflections caused by thumb pressure on a central nub, translating force into cursor velocity. IBM commercialized this approach with the TrackPoint in 1992 for its ThinkPad laptops, positioning the red rubber nub between the G, H, and B keys to enable pointing without hand repositioning. Advantages include tight keyboard integration for efficient typing-to-pointing transitions and reduced need for gross arm movements, though users often face a steep learning curve due to the unintuitive force-based control. Variations include force pads and piezoresistive surfaces, which detect force distribution across a flat area using resistive elements whose conductivity changes under load. These have been applied in specialized controls for precise, multi-axis input where space constraints demand non-moving interfaces. Performance evaluations reveal high variability in control-display gain for isometric devices, leading to directional biases that can impair accuracy in two-dimensional targeting tasks compared to displacement-based alternatives. This variability ties into broader ergonomic studies, as sustained isometric contractions increase strain on hand and forearm muscles during prolonged use. Adoption has been most notable in Lenovo's ThinkPad series, where the TrackPoint persists as a signature feature, but mainstream uptake remains limited owing to precision limitations relative to touch-based devices.
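The rate-control principle, force in, cursor velocity out, can be sketched as a simple transfer function. The dead zone, gain, and exponent values below are illustrative assumptions, not taken from any shipping driver:

```python
import math

def force_to_velocity(force, dead_zone=0.05, gain=600.0, exponent=1.6):
    """Rate control: map a normalised force reading in [-1, 1] to a
    cursor velocity (pixels per second).

    A dead zone suppresses sensor noise at rest, and the power-law
    transfer curve gives fine control at low force while still
    allowing fast travel at high force.
    """
    if abs(force) < dead_zone:
        return 0.0
    # Rescale the remaining range so full force still yields full speed.
    magnitude = (abs(force) - dead_zone) / (1.0 - dead_zone)
    return math.copysign(gain * magnitude ** exponent, force)
```

The nonlinearity is the key usability choice: a purely linear mapping tends to feel either sluggish for long traversals or twitchy for fine targeting, which is one reason isometric devices have a reputation for a steep learning curve.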

Emerging and Other Pointing Devices

Eye-tracking systems represent a hands-free pointing method that uses infrared cameras to monitor pupil and corneal reflections, enabling gaze-based cursor control. Developed by Tobii since its first commercial eye tracker in 2005, these devices have evolved to integrate into consumer laptops, such as those featuring Intel processors in the 2020s, where AI-enhanced algorithms on neural processing units (NPUs) improve real-time gaze detection. Pros include accessibility for users with motor impairments, allowing seamless navigation without physical effort; challenges encompass frequent calibration needs, variable accuracy due to lighting or head movement, and privacy concerns from constant gaze monitoring. Gesture-recognition pointing devices leverage depth-sensing cameras to capture 3D hand poses and movements for intuitive control, extending beyond traditional touch inputs. Microsoft's Kinect, released in 2010 for the Xbox 360, pioneered this with its infrared depth projector and RGB camera, enabling full-body gesture tracking for gaming and natural user interfaces, achieving recognition accuracies up to 95% in controlled environments. In the 2020s, Ultraleap's hand-tracking modules, building on Leap Motion technology, support precise six-degree-of-freedom (6DoF) finger articulation in VR/AR setups, used in applications like virtual object manipulation with sub-millimeter precision. Advantages lie in natural, controller-free interaction; drawbacks include sensitivity to occlusion and environmental interference, often resulting in latency exceeding 50 ms during complex poses. VR/AR controllers incorporate inside-out camera systems for hand tracking, combining motion sensing with gesture interpretation to simulate pointing in immersive environments. The Oculus Quest 2, launched in 2020, introduced software-based hand tracking via its front-facing cameras, allowing pinch and grab gestures for menu navigation without physical controllers.
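Hand-tracking SDKs typically expose per-joint 3D positions, and a pinch "click" is commonly inferred when the thumb and index fingertips come close together. The sketch below shows the idea under that assumption; the distance threshold is an illustrative value, not one specified by any particular SDK:

```python
import math

def is_pinch(thumb_tip, index_tip, threshold_mm=25.0):
    """Infer a pinch gesture from two 3D fingertip positions given in
    millimetres: pinched when the tips are within threshold_mm."""
    return math.dist(thumb_tip, index_tip) < threshold_mm
```

Real systems add hysteresis (separate press and release thresholds) and temporal filtering so that tracking jitter near the threshold does not produce spurious clicks.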
Meta's Quest 3, released in 2023, advanced this with dual RGB cameras and improved AI models, supporting 6DoF tracking for more reliable pointing in mixed reality and reducing errors by up to 40% compared to its predecessors. These systems excel in spatial interaction but face issues like limited field-of-view tracking and higher latency in dynamic scenes, impacting precision for fine pointing tasks. Brain-computer interfaces (BCIs) enable direct neural control of pointing devices by decoding brain signals into cursor movement, bypassing physical inputs entirely. Neuralink's prototypes, with the first human implant in January 2024, allow quadriplegic users to maneuver cursors at speeds surpassing prior BCI records, achieving bits-per-second (BPS) rates over 8 for thought-based navigation. By 2025, second-generation implants expanded into assistive technologies, emphasizing accessibility for users with severe disabilities. Benefits include unparalleled independence for immobilized users; however, invasiveness requires surgical implantation, and signal noise can introduce latency and accuracy variability, with ongoing challenges in long-term stability. Other innovative pointing devices include gyro-based air mice, which use 6-axis inertial sensors for wireless, mid-air cursor control in presentations or smart TVs, offering freedom from surfaces but prone to drift over extended use. Foot pedals serve accessibility needs, functioning as alternative mice via pressure-sensitive switches that map foot movements to cursor actions, as seen in devices like the XK-3 USB pedals, which support programmable inputs for users with upper-limb impairments. In 2025 trends, AI-predictive pointing is emerging in smart glasses, where on-device AI anticipates user intent from partial gestures or gaze to refine cursor placement, enhancing efficiency in interfaces like those of Meta's Ray-Ban Display glasses.
Across these emerging devices, common challenges involve balancing accuracy (often below 1° for gaze tracking or 1 cm for hand gestures) with latency under 20 ms for fluid interaction, while future human-computer interaction (HCI) research aims to fuse inputs like eye tracking and neural signals for robust, context-aware pointing.
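The accuracy-versus-latency trade-off shows up even in the simplest gaze or gesture filter. The exponential smoother below is a minimal sketch of one common approach; the alpha value is an illustrative assumption:

```python
def smooth_gaze(samples, alpha=0.3):
    """Exponentially smooth noisy (x, y) gaze samples to stabilise a
    gaze-driven cursor.

    Lower alpha gives more smoothing (steadier cursor, more lag);
    higher alpha gives more responsiveness (less lag, more jitter).
    """
    if not samples:
        return []
    out = [samples[0]]
    for x, y in samples[1:]:
        px, py = out[-1]
        out.append((px + alpha * (x - px), py + alpha * (y - py)))
    return out
```

Adaptive filters used in practice vary the smoothing strength with estimated movement speed, smoothing heavily during fixations but tracking tightly during rapid saccades, to get the best of both ends of the trade-off.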

    Sep 17, 2025 · Meta's Ray-Ban Display glasses combine AI and AR into a new computing platform controlled with your wrist.Missing: predictive pointing<|control11|><|separator|>