
EEG analysis

Electroencephalography (EEG) analysis encompasses the processing, interpretation, and modeling of electrical signals generated by neuronal activity in the brain, captured non-invasively through electrodes placed on the scalp. These signals reflect synchronized postsynaptic potentials from large populations of neurons, offering high temporal resolution on the order of milliseconds but limited spatial resolution due to signal attenuation by the skull and scalp. Pioneered by Hans Berger in the 1920s, EEG provides a safe, portable, and cost-effective method for monitoring brain function, distinguishing it from other neuroimaging techniques such as fMRI.

The core of EEG analysis involves several key steps to handle the inherent challenges of noisy, non-stationary signals contaminated by artifacts from eye movements, muscle activity, or environmental interference. Preprocessing typically includes filtering (e.g., bandpass 1–40 Hz), artifact removal via techniques such as independent component analysis (ICA) or wavelet transforms, and segmentation into epochs for event-related analyses. Feature extraction follows, employing methods such as power spectral analysis (e.g., the fast Fourier transform or Welch's method for frequency-domain insights), time-frequency decompositions (e.g., wavelet transforms or empirical mode decomposition for non-stationary dynamics), connectivity measures (e.g., phase-locking value or coherence to assess inter-regional interactions), and source localization (e.g., low-resolution electromagnetic tomography to estimate intracranial origins). Advanced approaches increasingly integrate machine learning, including support vector machines, convolutional neural networks, and transformer-based models, to classify patterns with high accuracy.

EEG analysis finds broad applications in clinical practice and research, enabling the diagnosis and study of disorders such as epilepsy (e.g., seizure detection and focus localization), neurodegenerative diseases (e.g., via spectral power alterations), and sleep disturbances. In research, it elucidates cognitive processes like attention, memory, and emotion, while supporting brain-computer interfaces for rehabilitation in individuals with severe motor impairments. Ongoing advancements in high-density electrode arrays and computational algorithms continue to enhance its precision, bridging gaps in spatial resolution and expanding its utility in real-time monitoring and neurofeedback.

Fundamentals of EEG

Signal Acquisition and Preprocessing

Electroencephalography (EEG) signal acquisition begins with the placement of electrodes on the scalp to detect electrical potentials generated by neuronal activity. The standard international 10-20 system, established by the International Federation of Societies for Electroencephalography and Clinical Neurophysiology, divides the scalp into 10% or 20% intervals along anatomical landmarks such as the nasion and inion, enabling consistent electrode positioning across 19 to 21 channels for routine recordings. For higher spatial resolution, high-density arrays extend to 128 or 256 channels, using geodesic or evenly spaced configurations to map brain activity more precisely, as demonstrated in systems like the Geodesic EEG System. Hardware components essential for EEG acquisition include electrodes connected to differential amplifiers, which boost weak scalp signals (typically 1-100 μV) while rejecting common-mode noise, followed by analog-to-digital converters (ADCs) that digitize the amplified signals. Sampling rates generally range from 250 to 1000 Hz to capture EEG frequencies up to 100 Hz without aliasing, with a minimum of 256 Hz recommended for clinical standards to ensure adequate temporal resolution. Common artifacts arise during acquisition, including eye blinks, which produce high-amplitude frontal deflections due to corneo-retinal dipole shifts, and muscle activity (electromyogram), which introduces broadband high-frequency noise from nearby contractions.

Preprocessing prepares raw EEG data for analysis through a standardized pipeline. Initial filtering applies a high-pass filter at 0.5 Hz to remove slow drifts and DC offsets, a low-pass filter at 70 Hz to attenuate high-frequency noise, and a notch filter at 50 or 60 Hz to eliminate power-line interference. Artifact removal often employs independent component analysis (ICA), a blind source separation technique that decomposes the observed multichannel signals x into independent components y via an unmixing matrix W, such that y = Wx, by maximizing non-Gaussianity through a negentropy approximation, J(y) \approx H(\mathbf{v}_\mathrm{gauss}) - H(y), where H denotes differential entropy and \mathbf{v}_\mathrm{gauss} is a Gaussian variable with the same variance as y. Components corresponding to artifacts, such as eye blinks, are then subtracted from the data. Re-referencing follows, commonly to the average of all scalp electrodes (common average reference) to reduce reference bias and improve spatial interpretability. Epoching segments continuous data into trial-based windows aligned to events or stimuli, typically 200-800 ms in duration, while baseline correction subtracts the mean pre-stimulus activity (e.g., -200 to 0 ms) from each epoch to normalize voltage fluctuations.

Specific challenges in acquisition include achieving low electrode-skin impedance, ideally below 5 kΩ, to minimize signal attenuation and noise pickup, often verified using saline or conductive paste with wet electrodes that maintain stable contact through gel-mediated ion conduction. Dry electrodes, lacking gel, yield higher impedances (up to 100 kΩ) but enable quicker setup and are preferred for rapid-setup and mobile applications, though they require active amplification to compensate for reduced signal quality. Mobile and wireless EEG systems, which have advanced markedly since the 2010s with wearable designs, facilitate ambulatory recording using compact, battery-powered headsets with integrated amplifiers, supporting real-world applications despite increased susceptibility to motion artifacts.
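A preprocessing pipeline of this kind can be sketched with MNE-Python, one commonly used open-source toolbox; the file name, component index, and event setup below are hypothetical placeholders, and real data would require inspection at each step.

```python
import mne

# Load a raw recording (the file name here is hypothetical).
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)

# Band-pass 0.5-70 Hz plus a 50 Hz notch for line noise.
raw.filter(l_freq=0.5, h_freq=70.0)
raw.notch_filter(freqs=50.0)

# ICA-based artifact removal: fit, mark a blink component, and reconstruct.
ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
ica.exclude = [0]          # index of a blink component, chosen by visual inspection
ica.apply(raw)

# Common average reference.
raw.set_eeg_reference("average")

# Epoch around stimulus events (assumes a stim channel) with baseline correction.
events = mne.find_events(raw)
epochs = mne.Epochs(raw, events, tmin=-0.2, tmax=0.8,
                    baseline=(-0.2, 0.0), preload=True)
```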

Key Characteristics of EEG Signals

Electroencephalography (EEG) signals represent the electrical activity generated by synchronized postsynaptic potentials in pyramidal neurons of the cerebral cortex, first recorded in humans by Hans Berger in 1924 using a galvanometer on a patient with a skull defect. Berger identified rhythmic oscillations known as alpha waves, with a frequency of approximately 10 Hz and amplitude around 100 μV, which he linked to mental activity and observed to attenuate upon eye opening. This discovery laid the foundation for EEG, which evolved from qualitative visual inspection in the early decades to quantitative analysis in the 1960s, when fast Fourier transform techniques enabled power spectrum quantification for more objective assessment of brain rhythms.

A defining feature of EEG signals is their organization into distinct frequency bands, each associated with specific brain states and exhibiting characteristic amplitudes and spatial distributions across the scalp. These bands include delta (0.5–4 Hz), prominent during deep sleep and regeneration, with amplitudes up to 100 μV and dominance in frontocentral regions; theta (4–7 Hz), linked to drowsiness, meditation, and anxiety, also up to 100 μV and frontal in distribution, particularly in children; alpha (8–12 Hz), indicative of relaxed wakefulness with eyes closed, up to 100 μV and posteriorly dominant (occipital areas); beta (13–30 Hz), associated with active concentration and alertness, typically 10–20 μV and frontal-central; and gamma (>30 Hz, often up to 80 Hz), involved in high-level processing like sensory integration, up to 100 μV with widespread but task-dependent distribution. Overall, EEG amplitudes range from 1 to 100 μV, reflecting the weak, surface-recorded nature of these signals. The following table summarizes these bands for clarity:
Band | Frequency (Hz) | Associated States/Activities | Amplitude (μV) | Spatial Distribution
Delta | 0.5–4 | Deep sleep, healing | Up to 100 | Frontocentral
Theta | 4–7 | Drowsiness, meditation, anxiety | Up to 100 | Frontal (esp. in children)
Alpha | 8–12 | Relaxed wakefulness (eyes closed) | Up to 100 | Occipital/posterior
Beta | 13–30 | Active concentration, alertness | 10–20 | Frontal and central
Gamma | >30 | Sensory integration, learning | Up to 100 | Widespread, task-dependent
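These canonical bands translate directly into band-power measures. The following sketch, assuming a single-channel NumPy array and SciPy's Welch estimator (band edges taken from the table above), computes absolute and relative power per band; it is illustrative rather than a clinical implementation.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4), "theta": (4, 7), "alpha": (8, 12),
         "beta": (13, 30), "gamma": (30, 80)}

def band_powers(eeg, fs):
    """Absolute and relative power per canonical band from a Welch PSD."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))   # 4-second segments
    df = freqs[1] - freqs[0]
    total = np.sum(psd) * df
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        absolute = np.sum(psd[mask]) * df                 # rectangle-rule integral
        out[name] = {"absolute": absolute, "relative": absolute / total}
    return out

# Example: a synthetic 10 Hz (alpha-like) oscillation plus noise
fs = 250
t = np.arange(0, 30, 1 / fs)
x = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(len(t))
print(band_powers(x, fs)["alpha"]["relative"])            # alpha dominates
```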
EEG signals are often contaminated by artifacts, which must be distinguished from neural activity based on their origins, frequencies, and amplitudes. Physiological artifacts arise from bodily sources, including the electrooculogram (EOG) from eye blinks and movements (0.5–15 Hz, high amplitude, overlapping the delta/theta bands), the electrocardiogram (ECG) from cardiac pulses (∼1.2 Hz periodic waveform with components up to 40 Hz, overlapping low frequencies), and the electromyogram (EMG) from muscle contractions (20–500 Hz, high amplitude in the beta/gamma ranges). Non-physiological artifacts stem from external or technical issues, such as electrode drift (gradual low-frequency changes below 1 Hz that mimic slow waves) and environmental noise (e.g., 50/60 Hz line interference, with variable amplitude often exceeding that of neural signals). These artifacts typically exhibit greater amplitudes and frequency overlaps with neural bands, complicating interpretation, though preprocessing filters can aid mitigation.

EEG signals exhibit quasi-stationarity, meaning their statistical properties remain approximately constant over short epochs (∼0.25 seconds) but vary over longer times due to dynamic brain processes such as metastable state transitions. This non-stationarity results in time-varying spectra, where frequency content shifts with cognitive or physiological changes, rendering the signals nonlinear and noisy with inherently low signal-to-noise ratios (SNR) that challenge analysis. Ergodicity, the interchangeability of time and ensemble averages, is limited in non-stationary EEG, as long-term averages may not represent short-term dynamics.

Volume conduction profoundly influences EEG by allowing currents from neural sources to spread through head tissues; the skull's high resistivity (10–100 times that of brain tissue) and the scalp's filtering effects severely attenuate and blur signals. This leads to poor spatial resolution, often limited to several centimeters even with dense electrode arrays (e.g., a minimum localization error of about 7 cm with 32 channels), as instantaneous activity from one source can appear widespread on the scalp. Consequently, source localization faces an ill-posed inverse problem, requiring assumptions about source models and individual head geometry to estimate origins, with errors exacerbated by variable skull thickness.

Core Analysis Methods

Time-Domain Methods

Time-domain methods in EEG analysis characterize the temporal structure, amplitude variations, and waveform morphology of signals directly in their raw form, without frequency decomposition. These approaches emphasize statistical summaries and event-locked responses to quantify signal strength, variability, and transient dynamics. They are foundational for studying evoked brain activity and basic signal properties, offering straightforward insights into how EEG evolves over time.

Basic statistical measures provide essential descriptions of EEG signal characteristics. Mean amplitude calculates the average value across samples, indicating baseline activity levels, while variance assesses the degree of fluctuation around this mean, reflecting signal stability. Root mean square (RMS) evaluates overall signal power or magnitude, computed as \mathrm{RMS} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} x_i^2}, where x_i denotes individual signal samples and N is the total number of samples; this metric is particularly useful for comparing signal intensities across channels or conditions. These statistics are computationally efficient and serve as preprocessing steps to normalize or baseline-correct data before more advanced analyses.

Event-related potentials (ERPs) capture brain responses synchronized to specific stimuli by averaging multiple time-locked EEG epochs, which suppresses background noise and isolates evoked components. This trial-averaging technique enhances signal detectability, with the resulting waveform revealing peaks and troughs corresponding to cognitive or sensory processing stages. A key example is the P300, a positive deflection peaking at approximately 300 ms post-stimulus in response to infrequent or decision-relevant events, such as in oddball tasks; it is maximal over parietal scalp regions (e.g., the Pz electrode). Grand averaging—pooling ERPs across subjects—facilitates identification of group-level patterns, while topographic mapping visualizes the spatial distribution of amplitude and latency across electrodes, aiding in source localization inferences. These methods excel in high temporal precision for probing perceptual and attentional processes.

Autocorrelation quantifies periodicities and self-similarities within a single EEG channel by correlating the signal with its lagged version, using the formula R(\tau) = \frac{1}{N} \sum_{t=1}^{N-\tau} x(t) x(t+\tau), where \tau is the time lag and x(t) are signal values; peaks in the autocorrelation function highlight rhythmic or repeating patterns. Cross-correlation extends this to pairs of channels, measuring their temporal alignment and synchrony to infer functional connectivity, such as coordinated activity between hemispheres. These correlation techniques are applied to detect subtle temporal relationships in resting-state or task-evoked EEG, with early implementations tracing back to analyses of epileptic activity and inter-channel dependencies.

Segmentation divides continuous EEG recordings into discrete trials or epochs aligned to experimental events, enabling focused analysis of event-related changes in paradigms like sensory evoked responses. Trial-based approaches measure latencies—the time intervals from stimulus onset to component peaks—and employ peak detection algorithms to automatically identify maxima or minima within predefined windows, accounting for variability across trials.
These techniques support quantitative assessment of response timing and amplitude, crucial for developmental or clinical studies where precise temporal markers indicate processing efficiency. The primary advantages of time-domain methods lie in their computational simplicity, requiring minimal resources, and their interpretability, as they directly reflect waveform shapes and transient events without abstraction. They are especially suited for real-time applications and initial signal characterization. Limitations include their inability to capture frequency-specific oscillations, such as alpha rhythms during relaxation, potentially missing broader neural dynamics.
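As a concrete illustration of these time-domain measures, the sketch below (plain NumPy, assuming a baseline-corrected epochs array of shape trials × samples) computes RMS, a trial-averaged ERP, and a normalized autocorrelation; the simulated evoked deflection is purely illustrative.

```python
import numpy as np

def rms(x):
    """Root mean square of a 1-D signal."""
    return np.sqrt(np.mean(np.square(x)))

def erp(epochs):
    """Trial-averaged ERP: mean across time-locked epochs (trials x samples)."""
    return np.mean(epochs, axis=0)

def autocorrelation(x, max_lag):
    """Normalized autocorrelation R(tau) for lags 0..max_lag."""
    x = x - np.mean(x)
    full = np.correlate(x, x, mode="full")[len(x) - 1:]
    return full[: max_lag + 1] / full[0]

# Example: 50 simulated 1-second trials with a noisy deflection around 300 ms
fs, n_trials, n_samples = 250, 50, 250
t = np.arange(n_samples) / fs
template = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
epochs = template + 10e-6 * np.random.randn(n_trials, n_samples)
print(rms(epochs[0]), erp(epochs).max(), autocorrelation(epochs[0], 20)[1])
```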

Frequency-Domain Methods

Frequency-domain methods in EEG analysis involve transforming the time-domain signal into its frequency components to quantify oscillatory activity, which is essential for identifying rhythmic patterns associated with brain states. The Fourier transform (FT) is a foundational technique that decomposes the EEG signal into sinusoidal components of varying frequencies, amplitudes, and phases, enabling the examination of periodic neural activity. The discrete Fourier transform (DFT) for a finite sequence of N samples x(n) is given by: X(k) = \sum_{n=0}^{N-1} x(n) e^{-j 2\pi k n / N}, where k = 0, 1, \dots, N-1 indexes the frequency bins and j is the imaginary unit. In practice, the computationally efficient fast Fourier transform (FFT) algorithm is used to compute the DFT, reducing the complexity from O(N^2) to O(N \log N) and making it feasible for real-time EEG processing. To mitigate spectral leakage—where energy from one frequency spreads to adjacent bins due to finite signal length—windowing functions such as the Hanning window are applied, which taper the signal edges to reduce leakage at the cost of some frequency resolution.

The power spectral density (PSD) extends FT/FFT analysis by estimating the distribution of signal power across frequencies, providing a measure of oscillatory strength in specific bands like delta (0.5–4 Hz), theta (4–8 Hz), alpha (8–13 Hz), beta (13–30 Hz), and gamma (>30 Hz). Welch's method, a widely adopted approach, computes the PSD by dividing the signal into overlapping segments, applying a window to each, performing the FFT, and averaging the resulting periodograms to reduce variance and bias compared to a single periodogram. This yields relative band powers, such as alpha asymmetry between hemispheres, which has been linked to emotional processing: relatively greater left-hemisphere alpha indicates reduced approach motivation and more negative affect.

Coherence and phase synchrony assess functional connectivity by measuring the consistency of spectral relationships between EEG channels at given frequencies, revealing inter-regional communication. The magnitude-squared coherence between channels x and y is defined as: C_{xy}(f) = \frac{|S_{xy}(f)|^2}{S_{xx}(f) S_{yy}(f)}, where S_{xy}(f) is the cross-spectral density and S_{xx}(f), S_{yy}(f) are auto-spectral densities; values range from 0 (no synchrony) to 1 (a perfect linear relationship). Phase synchrony focuses on phase differences, often using metrics such as the phase-locking value, to detect non-volume-conducted interactions without amplitude influence.

Historically, frequency-domain analysis gained prominence in the 1960s with early neurofeedback applications, such as Kamiya's demonstrations of voluntary control of alpha rhythms, which highlighted the potential for real-time spectral feedback in behavioral conditioning. In clinical contexts, these methods underpin sleep staging, where elevated delta power (slow waves typically exceeding 75 μV) characterizes N3 sleep, reflecting restorative processes and serving as a quantitative index of sleep depth. Despite their utility, frequency-domain methods assume signal stationarity—constant statistical properties over time—which EEG often violates due to its non-stationary nature from evolving brain states, leading to artifacts in spectral estimates for dynamic signals. This limitation necessitates complementary approaches for transient events, though global spectral summaries remain valuable for baseline oscillatory profiling.
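A minimal coherence computation along these lines, assuming two single-channel NumPy arrays sampled at the same rate and using SciPy's Welch-based estimator, might look as follows; the simulated signals are purely illustrative.

```python
import numpy as np
from scipy.signal import coherence

fs = 250
t = np.arange(0, 60, 1 / fs)

# Two channels sharing a 10 Hz component plus independent noise
shared = np.sin(2 * np.pi * 10 * t)
ch1 = shared + 0.5 * np.random.randn(len(t))
ch2 = 0.8 * shared + 0.5 * np.random.randn(len(t))

# Magnitude-squared coherence, estimated from Welch cross- and auto-spectra
freqs, cxy = coherence(ch1, ch2, fs=fs, nperseg=4 * fs)
alpha_band = (freqs >= 8) & (freqs <= 13)
print("mean alpha-band coherence:", cxy[alpha_band].mean())
```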

Advanced Analysis Techniques

Time-Frequency Methods

Time-frequency methods in EEG analysis address the non-stationary nature of brain signals by providing joint representations of temporal and spectral content, enabling the detection of transient events that traditional time- or frequency-domain approaches cannot resolve. These techniques are particularly valuable for EEG, where rhythmic activity varies over time in response to stimuli or cognitive processes, such as during cognitive tasks or sensory evoked potentials. By balancing time and frequency resolution, they reveal modulations like bursts of oscillatory power that correlate with behavioral states.

The short-time Fourier transform (STFT) is a foundational time-frequency method that applies the Fourier transform to short, overlapping windows of the EEG signal, yielding a spectrogram defined as the squared magnitude |STFT(t,f)|^2, which visualizes power distribution across time t and frequency f. This sliding-window approach captures local spectral changes but introduces a fundamental trade-off governed by the Heisenberg uncertainty principle: shorter windows provide better time resolution at the expense of frequency precision, and vice versa, limiting its adaptability to the multiscale variability in EEG. In practice, window lengths of 100-500 ms are often selected for EEG to balance resolution for event-related changes in the alpha or beta bands.

The wavelet transform (WT) overcomes the STFT's fixed resolution by using scalable, shiftable basis functions that provide variable time-frequency localization, making it well suited to the non-stationary, transient features of EEG signals. The continuous wavelet transform (CWT) convolves the signal with a family of wavelets scaled and translated continuously, while the discrete wavelet transform (DWT) employs dyadic scales for efficient computation via multiresolution decomposition. A popular choice for EEG is the complex Morlet wavelet, defined as \psi(t) = e^{i \omega t} e^{-t^2 / 2}, which approximates a Gaussian-windowed sinusoid and excels at extracting phase and amplitude information from oscillatory bursts like induced gamma activity. The resulting scalogram, analogous to a spectrogram, displays time-frequency power, facilitating visualization of non-stationary rhythms such as theta-alpha transitions during cognitive tasks.

The Hilbert-Huang transform (HHT) offers an adaptive, data-driven alternative for highly nonlinear and non-stationary EEG, starting with empirical mode decomposition (EMD) to break the signal into intrinsic mode functions (IMFs) that satisfy conditions of local symmetry and zero-mean envelopes. The EMD process involves iterative sifting: identifying local maxima and minima, interpolating upper and lower envelopes with cubic splines, and subtracting their mean from the signal until an IMF is obtained, repeating until only a residue remains; this yields a sum of IMFs representing oscillatory components at varying frequencies. Each IMF then undergoes Hilbert transformation to compute instantaneous frequency and amplitude, forming the Hilbert spectrum—a time-frequency-energy distribution that highlights adaptive frequency modulations without predefined bases. HHT is particularly effective for denoising EEG or analyzing sleep stages, where traditional transforms blur intrinsic nonlinearities.
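The Morlet-based scalogram can be sketched directly in NumPy as below; normalization and wavelet parameterization vary across toolboxes (MNE and PyWavelets provide production implementations), so this is a minimal illustration with an assumed n_cycles parameter and a simulated beta burst.

```python
import numpy as np

def morlet_scalogram(signal, fs, freqs, n_cycles=7):
    """Time-frequency power via convolution with complex Morlet wavelets.
    Returns an array of shape (len(freqs), len(signal))."""
    power = np.zeros((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)                 # Gaussian envelope width
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))     # unit-energy normalization
        analytic = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(analytic) ** 2                     # scalogram row at frequency f
    return power

# Example: a brief 20 Hz burst embedded in noise
fs = 250
t = np.arange(0, 4, 1 / fs)
x = 0.5 * np.random.randn(len(t))
x[500:750] += np.sin(2 * np.pi * 20 * t[500:750])            # burst from 2.0 to 3.0 s
tfr = morlet_scalogram(x, fs, freqs=np.arange(5, 40, 1))
```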
In applications, time-frequency methods quantify event-related synchronization (ERS) and desynchronization (ERD), where ERS reflects increased post-stimulus power (e.g., mu-rhythm enhancement during idling) and ERD indicates power decreases (e.g., alpha suppression during movement preparation), as pioneered by Pfurtscheller in the 1990s using band-pass filtered EEG that is squared and averaged into time-frequency maps. These measures, computed as percentage changes relative to baseline (e.g., \text{ERD/ERS} = \frac{\text{bandpower} - \text{reference}}{\text{reference}} \times 100), reveal cortical activation patterns in motor and sensory tasks. Similarly, cross-frequency coupling, such as phase-amplitude coupling, in which the phase of a low-frequency rhythm (e.g., theta) modulates the amplitude of a high-frequency one (e.g., gamma), is extracted from time-frequency representations to index hierarchical neural communication during processes such as working memory. Computational considerations for real-time implementation in brain-computer interfaces (BCIs) emphasize efficiency, as the STFT and DWT can process segments in milliseconds on standard hardware, enabling feedback latencies under 500 ms for control. However, the CWT and HHT incur higher costs due to continuous scales and iterative sifting, respectively—HHT's sifting may require 10-100 iterations per IMF—necessitating optimizations such as partial reconstruction or GPU acceleration to achieve sub-second analysis for multi-channel EEG in portable BCIs. These trade-offs ensure feasibility for clinical applications such as neurofeedback, where low-latency spectro-temporal features drive adaptive system responses.
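A simple ERD/ERS estimate in the spirit of the percentage-change formula above might be sketched as follows, assuming an epochs array of shape trials × samples time-locked to an event at t = 0; the filter order, band, and baseline window are illustrative choices rather than fixed conventions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def erd_ers(epochs, fs, band=(8.0, 12.0), baseline=(-1.0, -0.5), tmin=-1.0):
    """Percentage ERD/ERS time course: band-pass, square, average over trials,
    then express power relative to a pre-stimulus reference interval."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, epochs, axis=1)          # trials x samples
    power = np.mean(filtered ** 2, axis=0)             # trial-averaged band power
    times = tmin + np.arange(epochs.shape[1]) / fs
    ref = power[(times >= baseline[0]) & (times < baseline[1])].mean()
    return times, 100.0 * (power - ref) / ref           # negative values = ERD

# Hypothetical usage with motor-imagery epochs spanning -1 s to +3 s:
# times, curve = erd_ers(epochs, fs=250)
```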

Nonlinear Dynamics Methods

Nonlinear dynamics methods in EEG analysis draw from chaos theory and complexity science to characterize the irregular, potentially deterministic structures underlying electrical activity, which linear approaches often overlook. These techniques assume that EEG signals arise from low-dimensional nonlinear systems, enabling the quantification of sensitivity to initial conditions, attractor complexity, and information generation rates. By reconstructing the phase space from a univariate time series, such methods reveal attractors whose properties—such as divergence rates and fractal scaling—provide insights into brain states such as seizures or sleep, distinct from the oscillatory decompositions used in time-frequency analysis.

Phase space reconstruction is a foundational step, based on Takens' embedding theorem, which states that a dynamical system's attractor can be faithfully reconstructed from a single observed variable using time delays, provided the embedding dimension m exceeds twice the attractor's dimension and the delay \tau is appropriately chosen. In EEG, this is implemented via delay coordinates, forming vectors \mathbf{x}_i = (x_i, x_{i+\tau}, \dots, x_{i+(m-1)\tau}) from the signal x(t), where m is selected using the false nearest neighbors criterion to avoid spurious projections, and \tau is optimized (e.g., by the first minimum of the mutual information) to capture independent dynamics without over- or under-sampling. This reconstruction allows visualization of trajectories in higher dimensions, exposing chaotic or periodic behaviors in epileptic EEG, though preprocessing to remove artifacts is essential for reliable reconstruction.

Lyapunov exponents quantify the rates of trajectory divergence or convergence in the reconstructed phase space, with positive values indicating chaotic dynamics through exponential separation of nearby points. The largest Lyapunov exponent (LLE), the most relevant for detecting sensitivity to initial conditions, is estimated from reconstructed EEG using the Rosenstein algorithm, which tracks the evolution of nearest-neighbor distances without requiring the full spectrum, making it suitable for noisy, short recordings typical of clinical settings. In epileptic patients, the LLE decreases during seizures, reflecting a transition to more ordered dynamics, and aids in localizing epileptogenic zones by comparing interictal values across electrodes.

Fractal dimensions measure the geometric complexity and space-filling properties of attractors, revealing self-similarity in EEG signals that suggests underlying deterministic structure rather than pure noise. The correlation dimension D_2, estimated via the Grassberger-Procaccia algorithm, is computed as D_2 = \lim_{r \to 0} \frac{\log C(r)}{\log r}, where C(r) is the correlation integral counting pairs of points within distance r in the embedded space; higher D_2 values (around 4-7) in healthy EEG indicate greater dynamical complexity, while lower values (around 2-4) during seizures reflect reduced complexity due to neuronal hypersynchronization. The Hurst exponent H, derived from rescaled range analysis, assesses long-range temporal correlations, with H > 0.5 indicating persistence (trending behavior) in EEG rhythms.

Entropy measures evaluate signal irregularity and dynamical complexity, providing scalar indices of predictability. Approximate entropy (ApEn) quantifies regularity by comparing template similarities within a tolerance r, but it includes self-matches, leading to bias in short EEG series. Sample entropy (SampEn) addresses this by excluding self-matches, defined as \text{SampEn}(m, r, N) = -\log \frac{A^m(r)}{B^m(r)}, where A^m(r) and B^m(r) are the probabilities of template matches for lengths m+1 and m, respectively; higher SampEn in interictal EEG reflects greater irregularity than in ictal periods.
The Kolmogorov-Sinai (KS) entropy extends these measures by estimating the average rate of information production in the underlying dynamical system, approximating the supremum of entropies over partitions; in EEG, elevated KS entropy before a seizure can indicate bifurcations toward chaotic attractors. These methods have proven valuable in seizure detection and prediction, where nonlinear features such as reduced LLE and altered entropy during pre-ictal phases enable automated prediction with sensitivities up to 80% in intracranial recordings, outperforming linear metrics for identifying epileptogenic regions prior to surgery. Seminal studies on intracranially recorded EEG demonstrate that reductions in correlation dimension and entropy mark the onset of seizures, reflecting a loss of complexity as neuronal populations synchronize.
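As an illustration of these ideas, the sketch below implements time-delay embedding and a basic sample entropy estimate in plain NumPy; it follows the -log(A/B) definition above but is a simplified, O(N^2) version intended for short segments, not an optimized clinical implementation.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Takens-style delay embedding: rows are m-dimensional state vectors."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy -log(A/B), with self-matches excluded (simplified)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def pair_matches(dim):
        emb = delay_embed(x, dim, 1)
        # Chebyshev (max-coordinate) distance between every pair of templates
        dists = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        n = len(emb)
        return (np.count_nonzero(dists <= r) - n) / 2.0   # drop self-matches

    B = pair_matches(m)
    A = pair_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Example: white noise yields higher SampEn than a regular sine wave
t = np.arange(0, 10, 0.01)
print(sample_entropy(np.sin(2 * np.pi * t)), sample_entropy(np.random.randn(len(t))))
```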

Machine Learning Methods

Machine learning methods in EEG analysis leverage data-driven algorithms to classify, predict, and extract features from brain signals, enabling scalable, automated analysis beyond traditional visual inspection. These approaches integrate with preprocessing techniques to handle the high-dimensional, noisy nature of EEG data, often using supervised, unsupervised, and deep learning paradigms for tasks such as brain-computer interfaces (BCIs) and seizure detection. By learning hierarchical representations, ML models improve accuracy in decoding mental states, such as motor imagery, while addressing challenges like inter-subject variability through techniques such as cross-validation.

Feature extraction in ML-based EEG analysis frequently combines domain-specific methods like common spatial patterns (CSP) with classifiers to enhance discriminability between classes. CSP derives spatial filters by solving the optimization problem W = \arg\max_W \frac{\operatorname{tr}(W^T \Sigma_1 W)}{\operatorname{tr}(W^T \Sigma_2 W)}, where \Sigma_1 and \Sigma_2 are the covariance matrices of two classes, maximizing variance for one class while minimizing it for the other; this is particularly effective for motor imagery BCIs. Riemannian geometry further advances this by treating covariance matrices as points on a manifold, enabling distance-based classification via metrics such as the affine-invariant Riemannian metric, which preserves the positive-definite structure of EEG covariances for robust, geometry-aware feature spaces. Time-frequency features, such as spectrograms, can also serve as inputs to these ML pipelines to capture dynamic spectral content.

Supervised learning techniques, including support vector machines (SVMs) and linear discriminant analysis (LDA), are widely used for EEG classification, especially in motor imagery tasks, where they achieve high accuracy by separating classes with hyperplanes in feature space. For instance, SVMs with nonlinear kernels have demonstrated up to 98.8% accuracy in distinguishing left- versus right-hand motor imagery from filtered EEG epochs, outperforming alternatives such as multilayer perceptrons. To mitigate overfitting in these high-dimensional datasets, k-fold cross-validation is employed, partitioning data into training and validation sets to ensure generalizability across subjects.

Deep learning has transformed EEG analysis by enabling end-to-end learning that bypasses handcrafted features, with convolutional neural networks (CNNs) such as EEGNet serving as compact architectures tailored for EEG. Introduced in 2016, EEGNet uses depthwise and separable convolutions to learn spatial filters and temporal convolutions for sequence modeling, achieving state-of-the-art performance in BCI paradigms with fewer parameters than traditional CNNs, including accuracies around 90-95% on some benchmark datasets. Recurrent neural networks (RNNs), particularly long short-term memory (LSTM) units, handle sequential dependencies in EEG trials for tasks such as seizure prediction, while transformers apply self-attention mechanisms to capture long-range dependencies in extended recordings, improving decoding of cognitive states by modeling global context.

Unsupervised methods complement supervised approaches by identifying latent structures in unlabeled EEG data, for example through k-means clustering on embedding spaces derived from autoencoders to group similar brain states without prior labels. Autoencoders, including variational variants, reduce dimensionality while reconstructing EEG signals, facilitating anomaly detection in seizures by flagging deviations from normal patterns with sensitivities exceeding 95% in unsupervised settings.
For epileptic seizure identification, these models learn baseline EEG distributions and detect outliers, as demonstrated in variational autoencoder frameworks that achieve robust performance on raw multichannel data. Recent advances emphasize privacy and interpretability in clinical EEG applications, with federated learning enabling collaborative model training across institutions without sharing raw data, thus preserving patient privacy in post-2020 deployments for applications such as seizure detection. Explainable AI techniques, such as SHAP values, provide interpretability by attributing model predictions to specific EEG features, revealing, for example, the influence of delta-band power on seizure classifications and enhancing clinical trust in model outputs.
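A minimal CSP-plus-LDA pipeline of the kind described above can be sketched with NumPy, SciPy, and scikit-learn; the trial arrays X1 and X2 and the number of filter pairs are hypothetical, and production systems typically use MNE's CSP implementation or dedicated BCI toolboxes.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(class1_trials, class2_trials, n_pairs=3):
    """CSP via the generalized eigenvalue problem S1 w = lambda (S1 + S2) w.
    Trials are arrays of shape (n_trials, n_channels, n_samples)."""
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)
    S1, S2 = mean_cov(class1_trials), mean_cov(class2_trials)
    eigvals, eigvecs = eigh(S1, S1 + S2)               # ascending eigenvalues
    order = np.argsort(eigvals)
    # Keep the filters with the most extreme eigenvalues (max variance for one class)
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return eigvecs[:, picks].T                          # shape (2*n_pairs, n_channels)

def log_variance_features(trials, W):
    """Project trials through CSP filters and take normalized log-variance."""
    feats = []
    for t in trials:
        var = np.var(W @ t, axis=1)
        feats.append(np.log(var / var.sum()))
    return np.array(feats)

# Hypothetical usage with labeled motor-imagery epochs X1, X2:
# W = csp_filters(X1, X2)
# features = np.vstack([log_variance_features(X1, W), log_variance_features(X2, W)])
# labels = np.r_[np.zeros(len(X1)), np.ones(len(X2))]
# clf = LinearDiscriminantAnalysis().fit(features, labels)
```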

Practical Applications

Clinical and Diagnostic Uses

Electroencephalography (EEG) plays a pivotal role in clinical practice for diagnosing and monitoring neurological disorders by capturing brain electrical activity to identify pathological patterns. In epilepsy, EEG detects ictal events during seizures and interictal epileptiform discharges, such as spike-wave complexes, which are characteristic of syndromes like absence epilepsy. Initial routine EEG sensitivity for epilepsy diagnosis ranges from 29% to 55%, but repeated recordings can achieve 80% to 90% sensitivity, aiding in syndrome classification and surgical planning. Quantitative EEG (qEEG) analysis, including power spectrum metrics, enhances seizure prediction in some models, with sensitivities exceeding 80% and low false prediction rates, supporting preemptive interventions.

In sleep medicine, EEG is integral to polysomnography (PSG) for staging sleep according to American Academy of Sleep Medicine (AASM) rules, which define stages based on EEG waveforms such as alpha for wakefulness, theta for N1/N2, and delta for N3. These rules facilitate the assessment of sleep quality by quantifying micro-arousals—brief EEG frequency shifts indicating sleep fragmentation—that disrupt continuity and contribute to daytime impairment. PSG integrates EEG with electrooculography, electromyography, and respiratory monitoring to provide a comprehensive assessment, enabling accurate classification of disorders such as obstructive sleep apnea and narcolepsy.

For neurodegenerative conditions, EEG reveals characteristic alterations that support diagnosis and prognosis. In Alzheimer's disease, reduced alpha power and increased theta activity reflect cortical slowing and cognitive decline, with the occipital median frequency dropping to approximately 8.8 Hz compared with 9.2 Hz in controls. Parkinson's disease shows enhanced beta oscillations alongside theta slowing, with the median frequency further reduced to 8.1 Hz, correlating with motor and cognitive impairments. In comatose patients, EEG-derived metrics such as the bispectral index (BIS) predict outcomes; mean BIS values ≤25 at 12 hours post-event indicate poor neurological recovery with 49% sensitivity (95% CI 30–65%) and 100% specificity (95% CI 47–100%), while the suppression ratio quantifies suppressed periods, with higher ratios associated with unfavorable prognoses in post-anoxic cases.

FDA-cleared EEG tools enhance clinical reliability, particularly in vulnerable populations. The Persyst 15 software, cleared in 2022, automates neonatal seizure detection in EEG recordings from infants aged 36-44 weeks gestational equivalent, marking potential electrographic seizures using modified 10-20 montages and computing metrics such as the suppression ratio for analysis. This aids clinicians in quantifying seizure burden and monitoring, though results require verification against the raw EEG.

Ethical considerations in EEG diagnostics emphasize mitigating risks from interpretive errors. False positives in clinical assessments of consciousness, with misdiagnosis rates of up to 30-40% in evaluations of vegetative states, can lead to prolonged futile treatments or family distress in coma or disorders-of-consciousness cases, underscoring the need for expert oversight that integrates EEG with other clinical data. Such oversight ensures balanced decision-making, avoiding overreliance on automated tools and addressing potential misallocation of resources.
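The suppression ratio mentioned above can be approximated as the fraction of an epoch spent below an amplitude threshold for a minimum duration; the sketch below is a simplified illustration with assumed parameters (5 μV, 0.5 s), whereas commercial systems such as BIS monitors and Persyst use proprietary, clinically validated detectors.

```python
import numpy as np

def suppression_ratio(eeg_uv, fs, amp_thresh_uv=5.0, min_dur_s=0.5):
    """Fraction of the epoch spent in suppression: amplitude below threshold
    for at least min_dur_s (a rough, non-clinical approximation)."""
    suppressed = np.abs(eeg_uv) < amp_thresh_uv
    min_len = int(min_dur_s * fs)
    mask = np.zeros_like(suppressed)
    run_start = None
    # Mark only runs of sub-threshold samples that last at least min_len
    for i, s in enumerate(np.append(suppressed, False)):
        if s and run_start is None:
            run_start = i
        elif not s and run_start is not None:
            if i - run_start >= min_len:
                mask[run_start:i] = True
            run_start = None
    return float(mask.mean())
```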

Brain-Computer Interfaces

Brain-computer interfaces (BCIs) based on electroencephalography (EEG) enable direct communication between the brain and external devices by detecting and interpreting neural signals in real time. These systems are particularly valuable for individuals with severe motor impairments, allowing control of cursors, spellers, or prosthetics without physical movement. EEG-based BCIs typically rely on non-invasive electrode caps placed on the scalp to record brain activity, with dry-electrode and wireless variants improving portability and comfort for prolonged use.

Common paradigms in EEG-BCIs include sensorimotor rhythms (SMRs) for motor imagery tasks, where users imagine movements to modulate mu (8-12 Hz) and beta (18-26 Hz) rhythms over the sensorimotor cortex; steady-state visually evoked potentials (SSVEPs), elicited by attending to flickering stimuli at frequencies such as 10-20 Hz, producing detectable oscillations in occipital regions; and the P300 speller, which uses event-related potentials generated by rare or task-relevant visual stimuli to select characters from a matrix. The signal processing pipeline generally involves preprocessing to remove artifacts, feature extraction using methods such as common spatial patterns (CSP) to enhance class separability in multi-channel EEG data, and classification via algorithms such as linear discriminant analysis, often achieving accuracies exceeding 70% for two-class tasks. Performance is quantified by the information transfer rate (ITR), calculated as ITR = \log_2 N + P \log_2 P + (1-P) \log_2 \frac{1-P}{N-1} in bits per trial (scaled to bits per minute based on trial duration), where N is the number of choices and P is the accuracy; typical ITR values for EEG-BCIs range from 20 to 50 bits per minute. These paradigms often leverage event-related desynchronization/synchronization (ERD/ERS) in time-frequency analyses for refined signal discrimination.

Hardware for EEG-BCIs consists of gel-based or dry electrode caps with 16-64 channels, increasingly incorporating wireless transmission to reduce cabling constraints and enable mobile applications, though signal quality can degrade with movement. Key challenges include the need for extensive user training to achieve reliable control, as proficiency varies widely, and mental fatigue from prolonged sessions, which reduces signal consistency and performance over time. In clinical settings, EEG-BCIs have been applied to severely paralyzed patients, enabling communication through one-dimensional cursor control based on SMR modulation, as demonstrated in early systems from the 1990s. Hybrid BCIs combining EEG with electromyography (EMG) further enhance reliability by integrating residual muscle signals for multi-modal control in partially paralyzed individuals. Recent advancements post-2015 incorporate machine learning for adaptive BCIs, where models dynamically adjust to inter-session variability in EEG patterns, improving robustness and achieving ITRs of 20-50 bits per minute in spelling tasks for disabled users. These developments have elevated typing speeds in P300-based systems to practical levels, facilitating everyday communication for those with severe motor disabilities.
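The ITR formula above translates directly into a small helper function; the example numbers (a 4-class task at 80% accuracy with 4-second trials) are illustrative only.

```python
import numpy as np

def information_transfer_rate(n_choices, accuracy, trial_duration_s):
    """Wolpaw-style ITR in bits per minute for an N-choice BCI with accuracy P."""
    N, P = n_choices, accuracy
    if P <= 1.0 / N:
        return 0.0                                  # at or below chance level
    bits_per_trial = np.log2(N) + P * np.log2(P)
    if P < 1.0:
        bits_per_trial += (1 - P) * np.log2((1 - P) / (N - 1))
    return bits_per_trial * (60.0 / trial_duration_s)

# Example: 4-class motor imagery at 80% accuracy, 4 s per trial
print(information_transfer_rate(4, 0.80, 4.0))      # ~14.4 bits/min
```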

Cognitive and Research Applications

EEG analysis plays a pivotal role in cognitive neuroscience by enabling the investigation of brain mechanisms underlying attention, memory, emotion, and other mental processes in healthy individuals. Event-related potentials (ERPs), derived from time-locked EEG averaging, provide millisecond-resolution insights into cognitive events. For instance, the mismatch negativity (MMN) is an early ERP component elicited by deviant stimuli in a sequence of standards, reflecting pre-attentive detection of auditory changes and serving as an index of automatic change detection in attention research. Similarly, the FN400, a frontal negativity peaking around 400 ms post-stimulus, is associated with familiarity-based recognition rather than explicit recollection, distinguishing it from later parietal effects linked to episodic retrieval.

In working memory studies, cross-frequency coupling between theta (4-8 Hz) and gamma (30-100 Hz) oscillations has emerged as a key mechanism for coordinating information maintenance and manipulation. Theta-gamma phase-amplitude coupling, where gamma power modulates with theta phase, supports the temporal organization of neural representations during spatial and verbal tasks, with stronger coupling correlating with better performance in healthy adults. This coupling is thought to facilitate the nesting of high-frequency activity within slower rhythms, enabling efficient encoding and retrieval.

EEG has also advanced the study of emotion and motivation through markers such as frontal alpha asymmetry, where greater left-frontal alpha power suppression indicates approach-related positive affect, while right-frontal dominance relates to withdrawal and negative affect, as established in seminal work from the 1990s. Emotions are often modeled in a valence-arousal circumplex, with valence representing pleasantness and arousal representing intensity; EEG features such as alpha and beta band power in frontal and parietal regions differentiate these dimensions during affective tasks. The DEAP dataset, comprising EEG and physiological signals from 32 participants viewing music videos rated on valence and arousal scales, has become a benchmark for validating emotion recognition models, demonstrating reliable classification accuracies above 60% using spectral features.

Neurofeedback, involving real-time EEG feedback to train self-regulation, has been applied in research on attention and executive function, particularly in ADHD models. Alpha-theta protocols, which reward increases in theta relative to alpha power, aim to enhance relaxation and creativity, and meta-analyses show moderate effect sizes (Cohen's d ≈ 0.5-0.8) for symptom reduction in ADHD, though sustained benefits require further longitudinal validation. These protocols leverage operant conditioning to modify oscillatory patterns, providing insights into neuroplasticity in cognitive control.

Hyperscanning techniques, which simultaneously record EEG from multiple participants, reveal inter-brain synchrony during social interactions. Measures such as the phase-locking value (PLV) quantify how oscillatory phases align between dyads, with increased theta-band PLV observed in cooperative tasks, indicating shared neural representations that underpin cooperation and coordination. This approach has illuminated synchrony in naturalistic settings, such as face-to-face interaction, where PLV in the alpha and theta bands correlates with rapport and mutual understanding.

Despite these advances, EEG applications in cognitive research face limitations, including low ecological validity due to constrained lab environments that may not capture real-world complexity, and high individual variability in ERP amplitudes and oscillatory patterns, which can confound group-level inferences. Nonlinear entropy measures, such as sample entropy, offer complementary ways to quantify neural complexity by capturing irregular dynamics in EEG signals beyond linear methods.
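The phase-locking value used in hyperscanning (and in single-subject connectivity) can be sketched as follows, assuming two single-channel NumPy arrays and a band of interest; the filter order and theta band edges are common but not mandatory choices.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def phase_locking_value(x, y, fs, band=(4.0, 8.0)):
    """PLV between two channels in a frequency band (e.g., theta):
    band-pass filter, extract instantaneous phase via the Hilbert transform,
    then take the magnitude of the mean phase-difference vector."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Example: identical signals give PLV = 1, independent noise gives values near 0
fs = 250
t = np.arange(0, 20, 1 / fs)
x = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(len(t))
print(phase_locking_value(x, x, fs), phase_locking_value(x, np.random.randn(len(t)), fs))
```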

Emerging and Industrial Uses

In neuromarketing, EEG analysis enables the measurement of implicit consumer preferences by assessing brain responses to stimuli, particularly through frontal alpha asymmetry used to compute an approach-withdrawal index that indicates positive or negative emotional responses toward products or advertisements. This technique, developed post-2000, has been adopted by companies such as Nielsen, which acquired NeuroFocus in 2011 to integrate EEG-based testing for evaluating ad engagement and predicting viewer reactions in real time. Systematic reviews highlight its growing application in understanding consumer decision processes, with EEG outperforming traditional surveys in detecting subconscious biases.

In the automotive sector, EEG analysis supports drowsiness detection by monitoring changes in alpha power, where increased alpha activity signals reduced vigilance during prolonged driving. Systems developed in the 2020s, integrated into advanced driver assistance technologies, use portable EEG headsets to provide alerts, achieving high accuracy (e.g., over 90% on benchmark datasets such as SEED-VIG) through thresholds on spectral features. These implementations enhance safety in commercial fleets and personal vehicles by processing signals on edge devices to minimize latency.

For gaming and virtual reality (VR), EEG analysis facilitates adaptive difficulty adjustment by tracking engagement metrics such as the beta/alpha power ratio, which rises with heightened attention and arousal. Devices such as Emotiv's EPOC headset and NeuroSky's MindWave enable this by classifying player states in real time, allowing games to modulate challenges to maintain optimal engagement—for instance, increasing complexity when beta dominance indicates strong focus. Studies demonstrate that such adaptation improves immersion in VR environments, with engagement indices correlating strongly with self-reported enjoyment across difficulty levels.

Wearable EEG technologies, exemplified by the Muse headband introduced in 2014, monitor mental states during fitness and meditation activities by analyzing alpha and theta waves to provide biofeedback on focus and relaxation. These devices promote mental wellness in consumer products, tracking session progress via apps while adhering to privacy regulations such as the EU's GDPR, which mandates explicit consent for data processing and secure storage of sensitive neural signals. Compliance ensures user trust, with features such as opt-out options for data sharing integrated into device policies.

Looking to future trends as of 2025, AI-augmented EEG analysis is poised for integration into metaverse applications through edge computing, enabling low-latency processing of neural signals for immersive, personalized experiences. Developments from 2023 onward emphasize hybrid edge AI-BCI systems that handle EEG data on-device to support avatar control and emotion recognition in shared virtual spaces, reducing cloud dependency for improved privacy and responsiveness. This convergence promises scalable deployments in industrial metaverses, with surveys noting improved efficiency in 6G-enabled environments.
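The asymmetry and engagement indices mentioned above can be sketched roughly as follows, assuming single-channel NumPy arrays for left and right frontal electrodes (e.g., F3/F4) and SciPy's Welch estimator; the log-ratio convention and band edges are common choices rather than a fixed standard.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Band power from a Welch PSD (rectangle-rule integration)."""
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= lo) & (freqs <= hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

def approach_withdrawal_index(left_frontal, right_frontal, fs):
    """Frontal alpha asymmetry: ln(right alpha power) - ln(left alpha power).
    Higher values are commonly read as relatively greater approach motivation."""
    return (np.log(band_power(right_frontal, fs, 8, 12))
            - np.log(band_power(left_frontal, fs, 8, 12)))

def engagement_index(channel, fs):
    """Beta/alpha power ratio as a simple engagement proxy."""
    return band_power(channel, fs, 13, 30) / band_power(channel, fs, 8, 12)
```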

    BIS value is correlated with the prognosis of patients with coma in ICU, and BIS value can be a useful marker for estimating the prognosis of comatose patients.
  81. [81]
    Poor outcome prediction by burst suppression ratio in adults with ...
    BSR may be a better predictor in prognosticating poor outcomes in patients with post-anoxic coma who do not undergo therapeutic hypothermia when compared to ...
  82. [82]
    [PDF] K222002.pdf - accessdata.fda.gov
    Dec 30, 2022 · Persyst 15 EEG ... The Neonatal Seizure Detection component of Persyst 15 is intended to mark previously acquired sections of neonatal.
  83. [83]
    Risk, diagnostic error, and the clinical science of consciousness - PMC
    We argue that false positive and false negative findings carry particular moral risks, which may bear on investigators' decisions to use certain methods when ...
  84. [84]
    Noninvasive Electroencephalography Equipment for Assistive ...
    What are the most common types (wired or wireless) of noninvasive EEG-based BCI equipment that have been used in brain studies? To discover whether the EEG ...
  85. [85]
    Brain-Computer Interfaces Using Sensorimotor Rhythms
    This article reviews the current state and future perspectives of SMR-based BCI and its clinical applications, in particular focusing on the EEG SMR.
  86. [86]
    Towards an Independent Brain - Computer Interface Using Steady ...
    Hence, an SSVEP BCI can infer user intent by measuring EEG activity at a specific frequency or frequencies over occipital areas. Although SSVEP BCIs work with ...
  87. [87]
    [PDF] toward a mental prosthesis utilizing event-related brain potentials ...
    Summary. This paper describes the development and testing of a system whereby one can communicate through a computer by using the P300 component of the ...
  88. [88]
    Transformed common spatial pattern for motor imagery-based brain ...
    Mar 6, 2023 · The combination of tCSP and CSP further improved the system performance with an average accuracy of 84.77% and a peak accuracy of 100%. For ...Abstract · Introduction · Materials and methods · Discussion
  89. [89]
    Estimating and approaching the maximum information rate of ...
    Apr 1, 2024 · The ITR is calculated as follow:(10) ITR = 60 · ( log 2 M + P log 2 P + ( 1 − P ) log 2 1 − P M − 1 ) / T , where M, P, and T denote the number ...
  90. [90]
    Effects of user mental state on EEG-BCI performance - PMC - NIH
    There appear to be optimal operating conditions where fatigue, frustration, and attention levels are most appropriate for effective control of an EEG-BCI.
  91. [91]
    An EEG-based brain-computer interface for cursor control
    This study began development of a new communication and control modality for individuals with severe motor deficits. We trained normal subjects to use the ...
  92. [92]
    Hybrid Brain–Computer Interface Techniques for Improved ...
    Fully locked-in patients with no muscular activity detectable by EMG signals. The remaining three types of patients were categorized using EOG and EEG signals. ...
  93. [93]
    Machine-learning-enabled adaptive signal decomposition for a ...
    Aug 7, 2025 · The proposed P300 based BCI model has the highest average information transfer rates of 13.23 to 26.48 bits/min for disabled subjects. The ...
  94. [94]
    Review The mismatch negativity in cognitive and clinical neuroscience
    Mismatch negativity (MMN) component of the event-related brain potentials has become popular in cognitive and clinical brain research during the recent years.
  95. [95]
    The FN400 indexes familiarity-based recognition of faces - PMC - NIH
    Separate event-related brain potential (ERP) components have been hypothesized to index familiarity and recollection processes that support recognition memory.
  96. [96]
    Spatial Working Memory in Humans Depends on Theta and High ...
    May 26, 2016 · Theta-Gamma Coupling Organizes Spatial Working Memory. In the first set of experiments, we evaluated the spatial working memory performance ...
  97. [97]
    Theta–gamma coupling as a ubiquitous brain mechanism
    In this review, we summarize recent experimental and theoretical evidence linking the theta–gamma code with working, episodic, and semantic memory, attention, ...
  98. [98]
    Approach-withdrawal and cerebral asymmetry - APA PsycNET
    Frontal brain asymmetry predicts infants ... EEG alpha activity reflects attentional demands, and beta activity reflects emotional and cognitive processes.
  99. [99]
    Decoding the neural signatures of valence and arousal ... - Frontiers
    This paper focuses on classifying emotions on the valence-arousal plane using various feature extraction, feature selection, and machine learning techniques.Abstract · Introduction · Materials and methods · Discussion
  100. [100]
    DEAP: A Database for Emotion Analysis ;Using Physiological Signals
    Jun 9, 2011 · We present a multimodal data set for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals
  101. [101]
    Clinical and Experimental Factors Influencing the Efficacy ... - Frontiers
    Feb 17, 2019 · Meta-analyses have been extensively used to evaluate the efficacy of neurofeedback (NFB) treatment for Attention Deficit/Hyperactivity Disorder (ADHD) in ...
  102. [102]
    Sustained effects of neurofeedback in ADHD: a systematic review ...
    This systematic review and meta-analysis addresses the sustainability of neurofeedback and control treatment effects by considering randomized controlled ...
  103. [103]
    Hyperscanning: A Valid Method to Study Neural Inter-brain ...
    PLV measures how two signals (in case of hyperscanning coming from two different brains) are phase-locked in the observed time window. PLV is equal to 1 when ...
  104. [104]
    On the interpretation of synchronization in EEG hyperscanning studies
    Dec 24, 2013 · EEG Hyperscanning is a method for studying two or more individuals simultaneously with the objective of elucidating how co-variations in their neural activity
  105. [105]
    Neuroimaging of learning and development: improving ecological ...
    Two important factors compromising the ecological validity of neuroimaging studies are 1) the highly controlled stimuli and tasks used and 2) the artificial and ...
  106. [106]
    Stability, change, and reliable individual differences in ...
    The case of EEG measures of change and variation demonstrate how individual difference analyses do not necessarily preclude within-person variability that may ...
  107. [107]
    A gateway to consumers' minds: Achievements, caveats, and ...
    Nov 29, 2018 · A more popular measure for prediction is alpha frontal asymmetry, which seems to center on the idea of approach/withdrawal, rather than on a ...
  108. [108]
    Nielsen Buys Into Neuromarketing
    Nielsen CEO David Calhoun will assume a position on the board of NeuroFocus. NeuroFocus has used EEG as its primary method of measuring brain activity.
  109. [109]
    A systematic review on EEG-based neuromarketing - PubMed Central
    Jun 5, 2024 · Neuromarketing is an emerging research field that aims to understand consumers' decision-making processes when choosing which product to buy ...
  110. [110]
    Detecting Driver Mental Fatigue Based on EEG Alpha Power ...
    This study aimed to analyze the EEG alpha power changes in partially sleep-deprived drivers while performing a simulated driving task.Missing: 2020s | Show results with:2020s
  111. [111]
    Enhancing Automotive Safety through EEG-Based Fatigue Detection
    Aug 25, 2024 · This study introduces a method for detecting driver fatigue using the SEED-VIG dataset, a well-established benchmark in EEG-based vigilance ...
  112. [112]
    How EEG is Changing Driver Fatigue Detection in Real Time - Bitbrain
    May 7, 2025 · EEGs' real-time monitoring capabilities make them ideal tools for continuous fatigue detection in transportation safety.Missing: 2020s | Show results with:2020s
  113. [113]
    EEG-Based Engagement Monitoring in Cognitive Games - PMC
    Mar 26, 2025 · This study aims to explore the relationship between game difficulty levels, three EEG engagement indices (β/(θ + α), β/α, 1/α), and the self-reported flow ...
  114. [114]
    [PDF] Evaluating Electroencephalography Engagement Indices During ...
    Our findings suggest the Emotiv EEG can be used to assess varying levels of engagement as game players experience varying gaming events. Using an averaged band ...
  115. [115]
    Using EEG data as Dynamic Difficulty Adjustment in a serious game ...
    Sep 6, 2023 · EEG data was used to adjust game difficulty based on attention and calm levels, aiming to match player skill and enable a stable flow state.Missing: beta alpha
  116. [116]
  117. [117]
    [PDF] Assessing the Privacy Practices of Consumer Neurotechnology ...
    ... Muse device”133 (Muse). ... BrainBit sells various hardware and software products, including MINDO, an EEG headband intended to increase relaxation and focus.
  118. [118]
  119. [119]
    Edge AI-Brain-Computer Interfaces System: A Survey - ResearchGate
    Oct 24, 2025 · This survey provides a comprehensive examination of Edge AI–enabled BCI systems, covering the full pipeline from EEG hardware specifications and ...
  120. [120]
    A Comprehensive Survey on Emerging AI Technologies for 6G ...
    Naturally fusing Metaverse with 6G based edge AI is investigated, Three novel edge-metaverse designs are presented that leverage edge AI provided by 6G to ...