
Emotion perception

Emotion perception refers to the cognitive process by which individuals detect, recognize, and interpret emotional states in others through sensory cues such as facial expressions, vocal prosody, and tactile signals. This ability is fundamental to social cognition, enabling effective communication, empathy, and interpersonal coordination in human interactions. It involves rapid neural processing that integrates multimodal inputs to form coherent emotional judgments, often occurring within milliseconds of stimulus onset.

Key modalities in emotion perception include visual cues from faces, which are decoded using systems like the Facial Action Coding System (FACS) to identify specific muscle movements associated with emotions such as anger or happiness. Auditory signals from the voice convey emotions through variations in pitch, tempo, and intensity, activating regions like the secondary auditory cortex. Tactile perception, often overlooked, processes affective touch via specialized C-tactile afferents, primarily eliciting positive emotions and contributing to social bonding. These modalities converge in the brain, with early integration in subcortical structures like the amygdala and thalamus enhancing recognition accuracy.

The neural basis of emotion perception encompasses a distributed network, including the fusiform face area for facial processing, the amygdala for rapid threat detection, and the insula for interoceptive aspects of emotion. Cognitive mechanisms, such as attentional biases and conceptual categorization, further shape perception, with theories like appraisal models emphasizing evaluation of relevance and novelty in emotional stimuli. Impairments in this process are linked to clinical conditions, including autism spectrum disorder and schizophrenia, where deficits in facial emotion recognition correlate with social dysfunction.

Cultural influences significantly modulate emotion perception, with universal elements like the recognition of basic emotions coexisting alongside culture-specific display rules and cognitive styles. For instance, Western cultures often emphasize high-intensity facial prototypes and focus on the eyes and mouth, while East Asian cultures prioritize contextual integration and subtler expressions, leading to differences in accuracy across groups. These variations highlight the interplay between biological universals and learned norms in shaping emotional understanding.

Conceptual Foundations

Definition and Scope

Emotion perception refers to the process by which individuals detect, interpret, and attribute emotional states to others through sensory cues, encompassing early stages of receptive processing such as the detection and categorization of emotional signals. This process relies on rapid, automatic neural responses that enable the initial decoding of nonverbal indicators like facial movements, vocal inflections, or tactile signals, without necessarily requiring conscious awareness. Unlike emotion generation, which involves producing one's own affective responses, or self-focused emotion regulation, emotion perception is delimited to the apprehension of others' internal states, forming a boundary condition for interpersonal affective exchange.

A key distinction exists between emotion perception and related constructs: while emotion recognition emphasizes the accurate identification and categorization of specific emotions (e.g., labeling a frown as "anger"), emotion perception prioritizes the foundational sensory and interpretive steps that precede such judgments. Similarly, emotion understanding extends beyond recognition to include causal inferences about the origins or implications of an observed emotion, integrating broader contextual or motivational factors. These differentiations position emotion perception as an input mechanism rather than an endpoint, providing raw affective data for higher-level social processing.

The core components of emotion perception include sensory detection of cues, initial categorization into broad emotional valences (positive or negative), and contextual modulation to refine attributions based on situational elements. Detection operates at a pre-attentive level, rapidly signaling potential emotional relevance; categorization groups cues into affective classes; and contextual modulation adjusts interpretations using environmental or relational information to avoid misattribution. This tripartite structure ensures efficient processing in dynamic social environments.

As a cornerstone of social cognition, emotion perception underpins empathy by facilitating the apprehension of others' affective experiences and supports effective communication through synchronized interpersonal responses. Impairments in this process, as observed in various clinical populations, disrupt these functions, underscoring its essential role in fostering mutual understanding and cooperative interactions.

Historical Development

The study of emotion perception traces its roots to the 19th century, when Charles Darwin published The Expression of the Emotions in Man and Animals in 1872, proposing that emotional expressions evolved as adaptive signals for communication and survival across humans and other animals. Darwin's work emphasized the universality and biological basis of these expressions, laying the groundwork for later empirical investigations into how emotions are conveyed and recognized through observable behaviors.

In the 20th century, research shifted toward psychological theories of innate emotional displays. Silvan Tomkins advanced affect theory in his 1962 book Affect Imagery Consciousness: The Positive Affects, arguing that facial expressions of basic affects are hardwired and central to human motivation and social interaction. Building on this, Paul Ekman conducted cross-cultural studies in the 1970s that supported the existence of universal basic emotions. A landmark 1971 study by Ekman and Wallace Friesen examined the Fore, an isolated people in Papua New Guinea with minimal exposure to Western media, who accurately recognized six basic emotions—happiness, sadness, anger, fear, disgust, and surprise—from facial photographs, supporting cross-cultural invariance in emotion perception.

Key milestones emerged in the 1980s with the development of standardized tasks, such as those using Ekman's Facial Action Coding System (FACS) to measure and elicit specific expressions for experimental testing of recognition accuracy. By the 1990s, integration with neuroscience advanced the field through early functional magnetic resonance imaging (fMRI) studies, including Breiter et al.'s 1996 work demonstrating amygdala activation in response to fearful facial expressions, linking perceptual processes to neural substrates. This period marked a transition from behavioral to brain-based approaches.

Contemporary views began challenging universality in the 2010s with the rise of constructed emotion theories, exemplified by Lisa Feldman Barrett's 2017 book How Emotions Are Made, which posits that emotions are not innate categories but dynamically constructed from sensory inputs, concepts, and cultural learning.

Perceptual Modalities

Visual Perception

Visual perception plays a central role in emotion perception, primarily through the decoding of facial expressions and body movements that convey emotional states. Facial expressions serve as the dominant visual cue, systematically mapped by the Facial Action Coding System (FACS), which identifies specific muscle movements, or action units, associated with universal emotions such as anger (e.g., furrowed brows via corrugator contraction) or happiness (e.g., cheek raising via zygomaticus major contraction); a toy illustration of this mapping follows at the end of this section. Developed by Paul Ekman and Wallace Friesen, FACS enables precise analysis of these movements, revealing how subtle variations in facial muscle activation signal distinct emotions across cultures. Complementing facial cues, body posture and gestures provide additional contextual information; for instance, slumped shoulders and a forward-leaning posture often indicate sadness by conveying defeat or withdrawal.

The processing of these visual cues occurs in rapid stages, beginning with detection of emotional signals within approximately 100 milliseconds of stimulus onset, allowing for quick orienting toward potentially threatening or otherwise salient faces. This initial detection is followed by categorization into basic emotions, such as fear or happiness, based on configural patterns in facial features like eye and mouth shape. Subsequent modulation by contextual elements, such as surrounding scenes or gaze direction, refines interpretation; for example, a face viewed against a fearful background may be perceived as more anxious due to incongruent emotional cues.

A key distinction in visual emotion perception arises between posed and spontaneous expressions, with the latter offering greater ecological validity for real-world interactions. Recent studies indicate that recognition accuracy for posed expressions, which are deliberately exaggerated, is higher than for spontaneous ones due to their clarity, though spontaneous displays better capture nuanced, involuntary emotional leakage. For instance, a 2025 study analyzing dynamic datasets found posed expressions recognized at 85% accuracy versus 60% for spontaneous equivalents, highlighting how artificiality in posed stimuli can inflate perceptual benchmarks.

Individual differences significantly influence visual emotion perception accuracy, particularly through expertise effects observed in professionals like clinicians. Training in emotion recognition leads to improvements of 15-24% in recognition rates for subtle emotional cues. This expertise extends to the detection of micro-expressions, brief involuntary flashes lasting less than 0.5 seconds that betray concealed emotions, such as a fleeting negative expression leaking through a feigned neutral display. Pioneered in Ekman's research, micro-expression training has been shown to significantly improve detection rates, underscoring its utility in high-stakes contexts like deception detection or security screening.
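
The FACS-based logic described above can be made concrete with a toy matcher. The following is a minimal, illustrative sketch rather than an authoritative coding table: the action-unit (AU) prototype sets are simplified approximations of commonly cited EMFACS-style mappings, and the Jaccard-overlap scoring is an arbitrary choice for demonstration.

```python
# Minimal sketch: matching detected FACS action units (AUs) against
# simplified emotion prototypes. The AU sets below are rough, commonly
# cited approximations (e.g., AU6 cheek raiser + AU12 lip corner puller
# for happiness), not an authoritative FACS/EMFACS table.

PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser, lip corner puller (zygomaticus major)
    "sadness":   {1, 4, 15},     # inner brow raiser, brow lowerer, lip corner depressor
    "anger":     {4, 5, 7, 23},  # brow lowerer (corrugator), lid raiser/tightener, lip tightener
    "surprise":  {1, 2, 5, 26},  # brow raisers, upper lid raiser, jaw drop
}

def match_emotion(detected_aus: set) -> list:
    """Rank candidate emotions by Jaccard overlap between the detected
    AU set and each prototype's AU set."""
    scores = []
    for emotion, proto in PROTOTYPES.items():
        overlap = len(detected_aus & proto) / len(detected_aus | proto)
        scores.append((emotion, round(overlap, 2)))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    # A face showing cheek raising (AU6) and lip corner pulling (AU12)
    # should rank happiness first under this toy matcher.
    print(match_emotion({6, 12}))  # [('happiness', 1.0), ...]
```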

Auditory Perception

Auditory perception plays a crucial role in emotion perception through the analysis of vocal cues, particularly prosody and paralinguistic elements, which convey emotional states via acoustic variations in speech. Vocal prosody encompasses suprasegmental features such as pitch (fundamental frequency), tempo (speech rate), and intensity (loudness), which differentiate emotions like anger (characterized by rapid tempo and high pitch) from sadness (marked by slower tempo and lower pitch); a sketch of extracting these features computationally follows at the end of this section. Paralinguistic elements, including non-verbal sounds like sighs, further signal specific emotions; for instance, prolonged exhalations often indicate sadness by mimicking physiological responses to distress. These cues are processed rapidly in the auditory stream, allowing listeners to infer a speaker's emotional intent even without linguistic content, as demonstrated in studies using pseudo-speech stimuli that isolate prosodic contours.

The processing of auditory emotional cues is modulated by contextual factors, including congruence with semantic content and the inherent arousal level of the emotion. Congruent pairings, such as an angry voice paired with semantically negative words, facilitate faster detection and higher accuracy compared to incongruent pairings. Semantics exerts a strong influence, often overriding prosodic signals in ambiguous cases; for example, lexical content can bias interpretation toward the dominant emotional meaning, leading to integration effects where prosody alone is insufficient for disambiguation. A specific example is the detection of fear, which relies on rising pitch contours that signal urgency and arousal, enabling quicker identification than static low-pitch cues.

Recent research indicates that voice-alone recognition achieves 60-70% accuracy for basic emotions (e.g., anger, sadness, happiness, fear) in controlled settings, though this drops in real-world conditions or with less prototypical expressions. Challenges persist in perceiving neutral tones, which often exhibit low variability in prosody and are prone to misclassification, frequently misinterpreted as negative due to perceptual biases toward threat detection. Cultural variations further complicate intonation patterns; for instance, East Asian listeners show reduced sensitivity to prosodic cues for high-arousal emotions compared to Western counterparts, reflecting divergent expressive norms. Multimodal enhancement occurs when auditory cues integrate with visual signals, boosting overall accuracy beyond voice alone.
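
The prosodic features described above (pitch, intensity, and speech rate) can be approximated computationally. The sketch below uses the open-source librosa library; the file path, the 65-400 Hz pitch range, and the use of onset density as a speech-rate proxy are all illustrative assumptions rather than validated analysis settings.

```python
# Minimal sketch: extracting prosodic cues (pitch, intensity, and a
# crude speech-rate proxy) from a recorded utterance. Assumes a mono
# audio file at "utterance.wav" (hypothetical path).
import librosa
import numpy as np

y, sr = librosa.load("utterance.wav", sr=None, mono=True)

# Fundamental frequency (pitch) contour via probabilistic YIN;
# unvoiced frames come back as NaN.
f0, voiced_flag, voiced_prob = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
mean_pitch = float(np.nanmean(f0))                  # Hz, voiced frames only
pitch_range = float(np.nanmax(f0) - np.nanmin(f0))  # Hz

# Intensity proxy: root-mean-square energy per frame.
rms = librosa.feature.rms(y=y)[0]
mean_intensity = float(rms.mean())

# Very rough tempo proxy: acoustic onsets per second.
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
onset_rate = len(onsets) / (len(y) / sr)

print(f"pitch: {mean_pitch:.1f} Hz (range {pitch_range:.1f} Hz), "
      f"intensity: {mean_intensity:.4f} RMS, onsets/s: {onset_rate:.2f}")
# Heuristically, high pitch plus a fast onset rate patterns with
# anger/fear, while low pitch plus a slow rate patterns with sadness
# (see the prose above); real classifiers use many more features.
```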

Olfactory and Somatic Perception

Olfactory cues play a subtle yet significant role in emotion perception, particularly through body odors and pheromones that signal emotional states subconsciously. Body odors, such as those produced during stress, can convey emotional information implicitly, influencing perceivers' responses without conscious awareness. For instance, sweat collected from individuals experiencing fear has been shown to alter facial mimicry and neural responses associated with fear processing, suggesting an automatic detection mechanism. Pheromone-like compounds such as androstadienone, found in male sweat, enhance the perceived dominance of faces in some observers, thereby modulating social-emotional judgments. These chemical signals often operate below conscious awareness, priming emotional interpretations in close-proximity interactions.

Somatic perception, involving tactile and haptic cues, contributes to emotion perception through physical sensations that evoke affiliative or aversive responses. Warmth from touch, such as holding a warm object, promotes perceptions of social affiliation and trustworthiness, linking physical warmth to interpersonal warmth in embodied cognition frameworks. Conversely, haptic stimuli conveying tension, like high-intensity vibrations, are associated with negative emotions such as anxiety, heightening arousal and discomfort in recipients. These tactile cues facilitate emotion communication in intimate settings, where touch conveys comfort or affection more directly than in distant interactions.

The processing of olfactory and somatic cues in emotion perception is typically implicit and priming-based, differing from the rapid explicit decoding in visual or auditory modalities. Olfactory signals, for example, influence judgments through subliminal exposure, modulating mood and evaluations without deliberate attention, and are generally slower due to the chemosensory pathway's transduction speed compared to visual or auditory processing. Recent studies indicate that such cues can significantly shape evaluations in proximal interactions. This implicit integration often amplifies other sensory inputs, enhancing overall emotional sensitivity in multisensory environments.

Theoretical Frameworks

Physiological Theories of Emotion Perception

Physiological theories of emotion perception posit that the recognition of emotions in others arises from the observer's own bodily responses to perceived physiological cues, such as facial expressions or postural changes, providing a bottom-up mechanism for interpretation. These theories emphasize how bodily feedback and physiological resonance facilitate the attribution of emotional states, distinguishing them from purely cognitive evaluations by grounding perception in visceral and motor simulations.

The James-Lange theory, originally proposed in the 1880s, suggests that perceiving others' physiological changes—such as trembling or flushed skin—elicits a corresponding bodily response in the observer, which in turn leads to the attribution of the emotion being expressed. For instance, observing a person's wide-eyed stare and rapid breathing may trigger the observer's own sympathetic activation, interpreted as fear, thereby aiding recognition through embodied simulation. This extension to perception relies on facial mimicry, where subtle imitation of the observed expression generates internal physiological feedback that matches and confirms the perceived emotion.

In contrast, the Cannon-Bard theory, developed in the 1920s, argues for simultaneous activation in the thalamus upon perceiving emotional cues, triggering both the observer's physiological response and the conscious recognition of the emotion as innate and parallel processes. This framework holds that emotion perception does not depend sequentially on bodily feedback but occurs concurrently with autonomic arousal, allowing for rapid, instinctive decoding of others' states without requiring full peripheral feedback.

These foundational physiological theories underpin the somatic marker hypothesis proposed by Antonio Damasio in 1994, which posits that bodily feedback signals, or "somatic markers," from simulated emotional states enable rapid detection and decision-making in social contexts, such as attributing intent from subtle arousal cues in others. In contemporary applications, these theories inform accounts of the role of mirror neurons in empathetic perception, where neural circuits activate both during the observer's own emotional experience and when witnessing similar expressions in others, facilitating physiological resonance and accurate emotion attribution through shared somatic patterns.

Cognitive and Appraisal Theories

Cognitive and appraisal theories emphasize the role of top-down cognitive processes in emotion perception, where individuals interpret sensory cues through labeling, contextual evaluation, and situational appraisal to infer emotional states in themselves and others. These frameworks contrast with bottom-up physiological approaches by highlighting how ambiguous or expressive signals require cognitive interpretation to be perceived as specific emotions.

The two-factor theory, proposed by Schachter and Singer in 1962, posits that emotion perception arises from the interplay of physiological arousal and cognitive labeling based on environmental context. According to this model, an individual first experiences undifferentiated arousal from emotional cues, such as facial expressions or vocal tones, and then attributes a specific emotional label—such as euphoria or anger—depending on the situational cues available. For instance, in their seminal experiment, participants injected with epinephrine (inducing arousal) interpreted their physiological state as euphoria when exposed to a euphoric confederate or as anger in the presence of an angry one, demonstrating how context shapes emotional perception. This theory underscores that without appraisal of the context, arousal alone does not specify the perceived emotion.

Building on such ideas, appraisal theories, particularly those developed by Lazarus in the 1980s, argue that emotion perception involves evaluating the personal relevance and implications of perceptual cues within a given situation. In Lazarus's cognitive-motivational-relational theory, perceivers appraise stimuli along dimensions like goal relevance, coping potential, and novelty, which determine the inferred emotion; for example, a facial frown might be appraised as indicating anger if perceived in a competitive context threatening one's goals, but as sadness in a scenario of personal loss. This process is dynamic and iterative, allowing for rapid adjustments in emotion perception as new contextual information emerges. Appraisal thus serves as a mechanism for disambiguating vague emotional signals, integrating prior knowledge with current sensory input.

Recent integrations of these theories with the theory of constructed emotion, advanced by Lisa Feldman Barrett and updated through 2025, further refine this cognitive perspective by viewing emotion perception as a category-based construction shaped by cultural concepts and prior experiences. In this framework, perceivers do not passively detect innate emotional universals but actively construct perceptions by fitting sensory cues into learned emotional categories, influenced by linguistic and social learning. For example, the same ambiguous arousal might be categorized as "anxiety" in a high-stakes professional setting due to cultural emphasis on performance pressures, illustrating how constructed categories guide appraisal. These updates highlight the predictive nature of the brain in emotion perception, where appraisals draw on interoceptive signals and exteroceptive contexts to generate situated emotional meanings.

Applications of cognitive and appraisal theories explain perceptual biases, such as fundamental attribution errors, where observers over-rely on internal dispositions rather than situational factors when interpreting ambiguous emotional cues like neutral expressions. In social interactions, this can lead to misperceptions, as when a neutral face is appraised as hostile due to primed negative contexts, affecting interpersonal judgments and responses. Overall, these theories provide a robust account of how cognition transforms raw perceptual data into meaningful emotional insights, with implications for understanding variability in everyday social interactions.

Neural Mechanisms

Key Brain Regions

Emotion perception relies on a distributed neural architecture that integrates sensory cues across cortical regions, with the ventral visual stream in the temporal lobe primarily responsible for identifying emotional stimuli such as facial expressions, while the dorsal stream in the parietal lobe contributes to contextualizing these cues through spatial and action-oriented processing. This dual-stream architecture allows for efficient decoding of emotional signals by separating feature identification from dynamic interaction assessment.

The fusiform face area (FFA), located in the ventral temporal cortex, is specialized for processing faces and plays a critical role in detecting facial emotions by analyzing configural and featural aspects of expressions. Activation in the FFA increases differentially for emotional versus neutral faces, underscoring its involvement in rapid emotion identification. The superior temporal sulcus (STS), situated near the temporoparietal junction, integrates dynamic social cues such as gaze direction, gestures, and biological motion to facilitate emotion perception in interactive contexts. The posterior STS, in particular, responds to multimodal emotional signals, enabling the inference of others' affective states from subtle nonverbal behaviors.

Lesion studies show that damage to the FFA causes prosopagnosia, impairing face identity recognition, but facial emotion recognition is often preserved, while vocal emotion perception relies on distinct auditory pathways in temporal regions. These regions form part of a broader interconnectivity framework, with feedback loops between cortical areas like the FFA and STS and subcortical structures enabling iterative refinement of emotional signals for swift behavioral responses. This bidirectional communication supports the modulation of perception by emotional salience, as seen in interactions with limbic areas.

Amygdala and Limbic System

The amygdala serves as a central hub in the limbic system for the rapid detection of emotional salience, particularly threats, through a subcortical pathway that enables processing of fear-related stimuli such as fearful faces within 20-30 milliseconds. This fast route involves direct projections from the sensory thalamus to the amygdala, bypassing slower cortical processing to prioritize urgency in emotional evaluation. The basolateral complex of the amygdala, in particular, facilitates associative learning by encoding emotional salience and forming connections between neutral stimuli and affective outcomes, thereby modulating subsequent perceptual responses.

Within the broader limbic circuit, the amygdala integrates with the hippocampus to contextualize emotions through memory, enhancing the recall and encoding of emotionally charged events by linking sensory inputs to prior experiences. It also interacts with the insula to incorporate interoceptive signals, such as bodily states of arousal, which refine the perception of emotional intensity and valence. A direct thalamo-amygdala pathway underscores this subcortical efficiency, allowing immediate emotional tagging of stimuli without full conscious awareness. In patients with blindsight, the amygdala remains responsive to emotional facial expressions presented in their blind field, demonstrating preserved subcortical processing of affective cues despite the absence of primary visual cortex involvement. Recent fMRI meta-analyses have confirmed amygdala hyperactivation in individuals with anxiety disorders during negative emotion processing, which biases perception toward threatening stimuli and contributes to heightened vigilance.

Hypothalamic-Pituitary-Adrenal Axis

The hypothalamic-pituitary-adrenal (HPA) axis is a central neuroendocrine system that coordinates the body's response to stress, beginning with the release of corticotropin-releasing hormone (CRH) from the hypothalamus, which stimulates the anterior pituitary gland to secrete adrenocorticotropic hormone (ACTH). ACTH then prompts the adrenal cortex to produce and release cortisol, the primary glucocorticoid hormone that mobilizes energy resources and modulates physiological responses during stress. This process forms a negative feedback loop, where elevated cortisol levels inhibit further CRH and ACTH secretion to restore homeostasis, thereby influencing states of vigilance and alertness that underpin emotional processing; a toy simulation of this loop follows at the end of this section.

In the context of emotion perception, acute elevations in cortisol heighten sensitivity to potential threats by enhancing the detection of negative or ambiguous emotional cues, such as interpreting surprised expressions as more fearful, while simultaneously impairing the accurate identification of emotions. This arises from cortisol's rapid effects on attentional mechanisms, increasing emotional interference from aversive stimuli and facilitating quicker orienting toward danger signals in the environment. For instance, individuals under acute stress show reduced accuracy in color-naming tasks when distracted by threat-related words, reflecting amplified perceptual prioritization of negative information.

Chronic stress dysregulates the axis by blunting negative feedback mechanisms, resulting in sustained hypercortisolemia that fosters a persistent bias toward negative emotion perception, as evidenced by longitudinal studies tracking academic stress in adolescents over several months. In these investigations, prolonged HPA activation correlated with increased attentional vigilance to threatening facial expressions and reduced hedonic processing of positive cues, perpetuating cycles of emotional hyperreactivity.

The HPA axis maintains a bidirectional interaction with the amygdala, where emotional stimuli processed by the amygdala can trigger CRH release to activate the axis, while cortisol in turn modulates amygdalar reactivity to refine threat appraisal. Somatic cues, such as physiological arousal from pain or fatigue, further activate the HPA axis, amplifying perceptual sensitivity to emotionally salient signals in the body. Cortisol specifically modulates prefrontal cortex activity to regulate emotional processing, exerting time-dependent effects that initially enhance limbic-prefrontal connectivity for threat prioritization but later suppress prefrontal responses to facilitate recovery and cognitive control over intense emotions. This modulation occurs primarily through glucocorticoid receptors in the prefrontal cortex, which dampen excessive emotional interference during memory consolidation and decision-making under stress.
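
The cascade and negative feedback loop described above can be illustrated with a deliberately simplified simulation. This is a toy model, not a physiologically fitted one: the rate constants, the divisive feedback term, and the stress pulse are arbitrary choices meant only to show how intact versus blunted feedback shapes the cortisol response.

```python
# Toy simulation of the HPA cascade described above: CRH drives ACTH,
# ACTH drives cortisol, and cortisol feeds back to suppress both CRH
# and ACTH release. All rate constants are arbitrary illustrative
# values, not fitted physiological parameters.
import numpy as np

def simulate_hpa(stress, hours=24.0, dt=0.01, feedback_gain=1.0):
    """Euler-integrate a three-variable HPA model.

    stress: callable t -> stimulus drive onto CRH release
    feedback_gain: strength of cortisol negative feedback
        (lower values mimic the blunted feedback of chronic stress).
    Returns an array of (CRH, ACTH, cortisol) over time.
    """
    steps = int(hours / dt)
    crh = acth = cort = 0.0
    trace = np.zeros((steps, 3))
    for i in range(steps):
        t = i * dt
        inhibit = 1.0 / (1.0 + feedback_gain * cort)   # cortisol feedback
        d_crh = stress(t) * inhibit - 1.0 * crh        # hypothalamus
        d_acth = 2.0 * crh * inhibit - 1.0 * acth      # anterior pituitary
        d_cort = 1.5 * acth - 0.5 * cort               # adrenal cortex
        crh += dt * d_crh
        acth += dt * d_acth
        cort += dt * d_cort
        trace[i] = (crh, acth, cort)
    return trace

# Acute stressor between t=1 and t=2 h: cortisol rises, then feedback
# returns it toward baseline. With feedback_gain=0.1 (blunted feedback),
# the same stressor yields a higher, more sustained cortisol level.
pulse = lambda t: 1.0 if 1.0 <= t <= 2.0 else 0.05
intact = simulate_hpa(pulse, feedback_gain=1.0)
blunted = simulate_hpa(pulse, feedback_gain=0.1)
print(f"peak cortisol, intact feedback:  {intact[:, 2].max():.2f}")
print(f"peak cortisol, blunted feedback: {blunted[:, 2].max():.2f}")
```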

Individual and Contextual Variations

Developmental Aspects

Emotion perception abilities begin to emerge in infancy, with basic discrimination of emotional expressions developing between 3 and 6 months of age. By around 3 to 4 months, infants can visually discriminate expressions such as happiness from neutral or other states, and by 5 months, they distinguish positive from negative emotions such as sadness, fear, or anger. This early discrimination progresses to recognition and categorization of specific emotions, enhanced by interactions in attachment relationships that provide repeated exposure to caregivers' emotional cues, fostering more mature perceptual sensitivities.

During childhood and adolescence, accuracy in emotion perception improves steadily through accumulated social experiences, enabling better identification of subtle and complex expressions. Children as young as preschool age can recognize basic emotions like happiness, sadness, anger, and fear, with performance increasing across childhood for various modalities such as faces and voices. Pubertal hormonal changes further influence this development, with evidence showing shifts in recognition of certain emotions like disgust and anger tied to pubertal status. These improvements continue into late adolescence, supporting enhanced social interactions.

In adulthood, emotion perception reaches its peak during mid-life, reflecting optimized integration of perceptual and cognitive processes. However, aging is associated with declines in accuracy, particularly for negative or less familiar emotions, attributable to sensory degradation and changes in neural processing efficiency. Despite these declines, recognition of familiar positive emotions often remains relatively preserved, contributing to a positivity bias in older adults.

Several factors influence the trajectory of emotion perception development. Language plays a key role by facilitating emotion labeling, which enhances recognition and understanding from toddlerhood onward, as children learn to associate words with perceptual cues. Similarly, social exposure shapes perceptual norms, with early emotional environments determining how children interpret and respond to affective signals. Recent longitudinal data from training interventions, including a 2025 systematic review and meta-analysis, indicate that targeted programs can improve children's emotion recognition with medium effect sizes, with sustained effects in some cases. These developmental patterns show parallels across cultures, though societal variations modulate expression norms.

Cross-Cultural Differences

While basic emotions such as happiness, sadness, anger, fear, disgust, and surprise are recognized across diverse cultures with accuracies significantly above chance, cultural influences introduce notable variability in the perception of their intensity, subtlety, and contextual meaning. Foundational research by Ekman and colleagues demonstrated this partial universality through studies in remote and literate societies, but recent systematic reviews confirm that recognition accuracy decreases as the cultural distance between perceiver and expresser increases, with non-Western participants often showing lower agreement on Western-posed expressions. These reviews, synthesizing over 100 studies, emphasize that while core affective signals are shared, cultural norms modulate interpretive biases, leading to differences in how emotions are decoded from facial, vocal, and bodily cues.

Cultural display rules play a central role in these differences, governing how emotions are expressed and perceived. In individualistic Western cultures, such as the United States, display rules favor overt and direct emotional expression, facilitating higher accuracy in recognizing intense facial signals within similar groups. Conversely, collectivist East Asian cultures, such as Japan and China, promote subtlety and suppression of negative emotions to maintain social harmony, resulting in more restrained displays that Western perceivers may misinterpret as neutral or less intense. This divergence impacts intercultural settings, where lack of awareness of display rules reduces emotion recognition accuracy in cross-cultural judgments. For example, Japanese observers rate the same negative expressions as less intense than Americans do, reflecting ingrained cultural norms.

The collectivism-individualism dimension accounts for a substantial portion of variance in emotion perception, with meta-analytic evidence indicating that cultural factors explain differences in accuracy across studies. Collectivist societies prioritize contextual information and focus on the eye region for decoding emotions, enhancing sensitivity to relational cues but potentially overlooking isolated features. Individualist societies, by contrast, emphasize expressive intensity and holistic face scanning, with greater reliance on mouth movements. Language also exerts influence, as the lexical structure of emotion terms shapes perception; for instance, languages with finer distinctions for certain affects, like German's "Schadenfreude" for pleasure at others' misfortune, facilitate more nuanced perception compared to languages lacking direct equivalents. Exposure to diverse emotional cues through intercultural contact or media further mitigates biases, improving accuracy over time.

Cultural norms influence the perception of complex emotions like schadenfreude, with variations in endorsement across individualistic and collectivist contexts. Although the emotion appears universal, cultural norms modulate its salience in different social environments. Overall, these patterns underscore the interplay between innate perceptual mechanisms and learned social conventions in shaping emotion understanding.

Disorders and Impairments

Neurodevelopmental Disorders

Individuals with autism spectrum disorder (ASD) exhibit notable deficits in emotion perception, particularly in recognizing facial and vocal expressions, due in part to under-detection of subtle cues. These impairments often manifest as difficulties in identifying complex emotions such as sadness or fear, leading to reduced accuracy in socioemotional processing compared to neurotypical individuals. Such deficits are linked to amygdala hypoactivation during emotional face processing, which contributes to atypical gaze patterns and diminished responsiveness to emotional stimuli. In addition, multimodal studies reveal impaired recognition of emotions from adult faces and child voices, with some subgroups showing intact performance but overall slower response latencies.

In attention-deficit/hyperactivity disorder (ADHD), emotion perception is influenced by impulsivity, resulting in hasty and less accurate interpretations of emotional signals. Children with ADHD demonstrate decreased overall accuracy, particularly for negative emotions such as anger and fear, alongside inattention that exacerbates perceptual errors. Attentional biases in ADHD often favor positive emotional stimuli, with sustained attentional allocation toward rewarding or appetitive cues, which can interfere with balanced processing of mixed emotional contexts.

These perceptual challenges in ASD and ADHD are underpinned by mechanisms such as impaired theory of mind (ToM), which hinders the attribution of mental states to others and links directly to emotion recognition deficits across neurodevelopmental disorders. Sensory integration issues further compound these problems, as atypical processing of visual and auditory inputs disrupts the holistic perception of emotional expressions in both conditions. A significant overlap exists with alexithymia, prevalent in approximately 50% of autistic individuals, which reduces emotional granularity—the ability to differentiate nuanced feelings—and impairs self- and other-emotion identification. This comorbidity limits the precision of emotional vocabulary and exacerbates under-detection of subtle cues.

Recent interventions, including digital emotion training programs tailored for ASD youth, have shown promise in enhancing recognition accuracy. For instance, app-based training has led to improvements in emotion identification skills, with some studies reporting gains exceeding 20% in accuracy post-intervention among adolescents with ASD. These tools target social cue detection and ToM, offering structured practice to mitigate core deficits.

Psychiatric Conditions

In major depressive disorder (MDD), individuals exhibit a pronounced negative bias in emotion perception, leading to over-identification of sadness and anger in facial expressions while showing reduced detection of positive emotions such as happiness. This bias persists across various emotional recognition tasks and contributes to social withdrawal and interpersonal difficulties. Meta-analyses confirm deficits in recognizing multiple basic emotions, except for sadness, which aligns with heightened sensitivity to negative cues.

In schizophrenia, emotion perception impairments often manifest as paranoia-driven misattributions, where neutral faces are frequently interpreted as hostile or angry, particularly among actively paranoid patients. This pattern of errors reflects broader deficits in social cognition, exacerbating suspiciousness and relational conflicts. Additionally, deficits extend to affective prosody recognition, with patients showing reduced accuracy in interpreting emotional tone in speech, independent of facial cues.

Anxiety disorders are characterized by hypervigilance to threat-related stimuli, resulting in exaggerated perception of danger in ambiguous or neutral emotional signals, associated with dysregulation of the hypothalamic-pituitary-adrenal (HPA) axis. This heightened sensitivity promotes avoidance behaviors and sustained arousal.

Underlying these perceptual biases in mood and psychotic disorders are mechanisms such as neurotransmitter imbalances, which disrupt the regulation of emotional salience and contribute to recognition deficits in conditions like schizophrenia. Cognitive distortions further amplify these issues by systematically skewing interpretation toward negative or threatening attributions. Antidepressant treatments have been shown to normalize alterations in emotion processing in MDD, potentially mitigating these biases.

Research Methods

Behavioral Paradigms

Behavioral paradigms in emotion perception involve experimental tasks designed to evaluate individuals' ability to recognize and interpret emotional expressions through observable responses, such as identification accuracy and decision speed, without incorporating physiological recordings. These methods rely on controlled presentations of emotional stimuli to probe perceptual sensitivities, often using standardized sets of facial, vocal, or bodily cues. Seminal tasks include the Ekman 60-Faces Test, which presents static photographs of posed facial expressions depicting six basic emotions—happiness, sadness, anger, fear, disgust, and surprise—forcing participants to select the matching label from options, thereby assessing recognition of prototypical expressions. Another widely used paradigm is the Reading the Mind in the Eyes Test (RMET), which focuses on subtle social emotions by displaying cropped images of eye regions and requiring selection of the most appropriate mental state or emotion from four alternatives, emphasizing theory-of-mind components in emotion inference.

Stimuli in these paradigms vary in format to capture different aspects of emotional processing, including static images for discrete identification, dynamic videos for naturalistic sequences, and morphed transitions blending expressions to explore perceptual gradients. Posed stimuli, such as those in the Ekman series, feature deliberate enactments of prototypical expressions, which facilitate high recognition rates but may lack real-world validity due to exaggerated intensity. In contrast, spontaneous stimuli, elicited during genuine emotional experiences, better approximate everyday interactions and have gained emphasis in recent research for their ecological validity, revealing differences in recognition patterns where negative emotions are more accurately detected in posed formats, while positive ones fare better in spontaneous ones.

Performance is quantified through measures like accuracy (proportion of correct identifications), response time (latency to categorize an expression), and bias scores (tendency to over- or under-attribute specific emotions, such as anger); a sketch of computing these measures from trial-level data follows at the end of this section. Paradigms employ either forced-choice formats, where participants select from predefined labels to minimize response ambiguity, or rating scales, allowing nuanced judgments of emotional intensity or valence. Morphing paradigms, in particular, reveal perceptual boundaries by gradually transitioning between emotions like happiness and sadness, enabling identification of categorical thresholds where perception shifts discretely despite continuous stimulus changes.

To enhance validity, cross-modal tasks integrate multiple sensory channels, such as pairing facial expressions with vocal tones, to assess how auditory cues influence visual emotion perception and promote multimodal integration. Cultural adaptations of these paradigms, such as modifying stimulus sets to include region-specific expressions or display norms, ensure applicability across diverse populations by accounting for variations in expression conventions and recognition thresholds. Neuroimaging techniques can complement these behavioral assessments by revealing underlying neural correlates during task performance.
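
As referenced above, the standard behavioral measures can be computed directly from trial-level records. The sketch below assumes hypothetical (shown, chosen, response-time) tuples, and the bias score is one simple convention (labels chosen minus labels shown), not a standardized index from any particular test battery.

```python
# Minimal sketch: scoring a forced-choice emotion recognition task.
# Trial records (shown, chosen, rt) are hypothetical; the bias score
# here is one simple convention, not a standardized clinical index.
from statistics import median

trials = [  # (expression shown, label chosen, response time in s)
    ("anger", "anger", 0.81), ("fear", "anger", 0.95),
    ("happiness", "happiness", 0.62), ("sadness", "sadness", 1.10),
    ("fear", "fear", 0.88), ("sadness", "anger", 1.25),
]

# Accuracy: proportion of trials where the chosen label matches.
accuracy = sum(shown == chosen for shown, chosen, _ in trials) / len(trials)
# Response time: median latency across all trials.
median_rt = median(rt for _, _, rt in trials)

# Bias per label: how often it was chosen minus how often it was shown
# (>0 means over-attribution, <0 means under-attribution).
labels = {shown for shown, _, _ in trials} | {chosen for _, chosen, _ in trials}
bias = {}
for label in sorted(labels):
    shown_n = sum(s == label for s, _, _ in trials)
    chosen_n = sum(c == label for _, c, _ in trials)
    bias[label] = chosen_n - shown_n

print(f"accuracy={accuracy:.2f}, median RT={median_rt:.2f}s, bias={bias}")
# Here "anger" is chosen 3 times but shown only once -> bias +2, i.e.,
# an over-attribution of anger, the pattern described in the prose.
```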

Neuroimaging Techniques

Neuroimaging techniques have become essential for investigating the neural underpinnings of emotion perception, providing insights into regions and processes involved in decoding emotional stimuli such as facial expressions and vocal tones. These methods allow researchers to map spatial and temporal patterns of activity, revealing how emotions are processed at both conscious and automatic levels.

Functional magnetic resonance imaging (fMRI) is widely used to measure blood-oxygen-level-dependent (BOLD) responses during exposure to emotional stimuli, highlighting activation in the amygdala as a key hub for rapid emotion detection, particularly for fear and threat-related cues. Studies employing fMRI have demonstrated that the amygdala exhibits heightened responses to negative emotional faces compared to neutral ones, with subregional specialization where the basolateral amygdala processes sensory inputs and the central nucleus coordinates autonomic outputs. This technique's high spatial resolution enables precise localization of emotion-related networks, though its temporal resolution is limited to seconds.

Electroencephalography (EEG) and event-related potentials (ERPs) offer superior temporal resolution, capturing millisecond-scale dynamics of emotion perception, such as the N170 component, which is enhanced for emotional facial expressions like fear or anger relative to neutral faces. The N170, originating from occipitotemporal regions, reflects early structural encoding of faces modulated by emotional content, providing evidence for rapid, automatic processing in the ventral visual stream. These methods are particularly valuable for dissecting the sequence of perceptual stages, from low-level feature detection to higher-order emotional appraisal; the sketch at the end of this section illustrates the underlying epoch-averaging logic.

Recent advances in portable functional near-infrared spectroscopy (fNIRS) have enabled real-world studies of emotion perception outside controlled lab settings, leveraging its non-invasive, motion-tolerant design to monitor prefrontal and temporal activity during naturalistic emotional interactions as of 2025. fNIRS detects hemodynamic changes similar to fMRI but in a wearable format, facilitating investigations of dynamic emotion processing in diverse environments, such as social scenarios. Magnetoencephalography (MEG) excels in source localization of emotional responses, combining high temporal precision with spatial accuracy to track oscillatory activity and evoked fields during face processing, often revealing amygdala-prefrontal interactions modulated by gaze and expression cues. For instance, studies localize early emotion effects to fusiform and superior temporal regions around 170 ms post-stimulus, aiding in understanding distributed networks. Positron emission tomography (PET) provides insights into neurotransmitter roles in emotion perception by quantifying receptor binding and metabolic activity, such as dopamine and opioid system involvement in reward-related emotional processing. PET has shown that serotonin transporter availability correlates with sensitivity to negative emotions, linking molecular mechanisms to perceptual biases.

These techniques are applied to compare neural patterns in healthy individuals versus those with disorders, revealing atypical amygdala hyperreactivity in anxiety, and to track longitudinal changes in emotion processing across development or interventions. Behavioral paradigms, such as emotional face tasks, serve as standardized stimuli to elicit these responses for analysis.
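
The ERP logic behind findings like the N170 enhancement can be illustrated with synthetic data. The sketch below assumes a hypothetical single-channel recording at 1000 Hz with a 200 ms pre-stimulus baseline; it demonstrates epoch averaging, baseline correction, and an emotional-minus-neutral difference wave, not any specific published pipeline.

```python
# Minimal sketch: computing an event-related potential (ERP) by
# averaging EEG epochs and contrasting emotional vs. neutral faces.
# Data are synthetic; the 1000 Hz rate, 200 ms baseline, and
# single-channel setup are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
times = np.arange(-200, 601) / 1000.0        # -200 ms to 600 ms, 1 kHz
# Template component: negative-going peak at 170 ms (N170-like shape).
n170 = -np.exp(-((times - 0.170) ** 2) / (2 * 0.02 ** 2))

def make_epochs(amplitude, n=40):
    """Simulate n single trials: scaled N170 component plus noise."""
    return amplitude * n170 + rng.normal(0.0, 2.0, size=(n, times.size))

def erp(epochs):
    """Baseline-correct each trial (pre-stimulus mean) and average."""
    baseline = epochs[:, times < 0].mean(axis=1, keepdims=True)
    return (epochs - baseline).mean(axis=0)

erp_fear = erp(make_epochs(amplitude=5.0))     # enhanced emotional response
erp_neutral = erp(make_epochs(amplitude=3.0))  # weaker neutral response
diff_wave = erp_fear - erp_neutral

peak_idx = np.argmin(diff_wave)                # most negative deflection
print(f"difference-wave peak at {times[peak_idx] * 1000:.0f} ms, "
      f"{diff_wave[peak_idx]:.2f} uV")         # expected near 170 ms
```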

References

  1. [1]
    Emotion Perception - an overview | ScienceDirect Topics
    Emotion perception is defined as the ability to recognize and interpret emotional cues expressed by others, which involves rapid neural responses to ...
  2. [2]
    Emotion Perception from Face, Voice, and Touch - PubMed Central
Here, we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing ...
  3. [3]
    The neural basis of normal emotion perception - PubMed
    We have aimed to identify potential neural correlates of three processes suggested by appraisalist theories as important for emotion perception.
  4. [4]
    Emotion perception across cultures: the role of cognitive mechanisms
Mar 11, 2013 · Together, culture-specific cognitive styles can account for some of the cultural differences in emotion perception commonly observed in past ...
  5. [5]
    Social Cognition through the Lens of Cognitive and Clinical ...
    Social cognition refers to a set of processes, ranging from perception to decision-making, underlying the ability to decode others' intentions and behaviors.
  6. [6]
    [PDF] The Expression of the Emotions in Man and Animals - Darwin Online
The Expression of the Emotions in Man and Animals. By Charles Darwin. With photographic and other illustrations. New York: D. ...
  7. [7]
  8. [8]
    Constants across cultures in the face and emotion. - APA PsycNet
Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion.
  9. [9]
    A history of the face in psychological research on emotion perception.
    In the present chapter, we use a historical lens to examine how the face has been understood, and studied, in relation to emotion, with an emphasis on ...
  10. [10]
    How Emotions Are Made | Lisa Feldman Barrett
    Instead, she has shown that emotion is constructed in the moment, by core systems that interact across the whole brain, aided by a lifetime of learning. This ...
  11. [11]
    The nonverbal expression of guilt in healthy adults | Scientific Reports
    May 8, 2024 · This study explored facial expression, gesture, posture, and gaze during the real-time experience of guilt when response demands are minimal.
  12. [12]
    Fast saccades toward faces: Face detection in just 100 ms | JOV
    Here we show that saccades toward human faces are even faster, with the earliest reliable saccades occurring in just 100–110 ms, and mean reaction times of ...
  13. [13]
    Rapid perceptual integration of facial expression and emotional ...
    Nov 8, 2005 · Here we show that observers judging a facial expression are strongly influenced by emotional body language.
  14. [14]
    How Context Influences Our Perception of Emotional Faces - Frontiers
Oct 3, 2017 · Our study used a more ecological design with participants watching film sequences of neutral faces, crosscut with scenes of strong emotional content.
  15. [15]
    Emotions are perceived differently from posed and spontaneous ...
    Our findings present clear evidence that perceptions of posed and spontaneous facial expressions meaningfully differ.
  16. [16]
    Emotions Are Perceived Differently From Posed and Spontaneous ...
    Specifically, negative emotions were better recognized than positive emotions from posed expressions, while the opposite was true for spontaneous expressions, ...
  17. [17]
    Trainee psychotherapists' emotion recognition accuracy improves ...
Trainee psychotherapists' emotion recognition accuracy improves after training: emotion recognition training as a tool for psychotherapy education.
  18. [18]
    A Survey of Automatic Facial Micro-Expression Analysis - Frontiers
    Compared to ordinary facial expressions or macro-expressions, MEs usually last for a very short duration which is between 1/25 and 1/5 of a second (Ekman, 2009b) ...
  19. [19]
    Micro Expressions | Facial Expressions - Paul Ekman Group
    Micro expressions are facial expressions that occur within a fraction of a second. This involuntary emotional leakage exposes a person's true emotions.
  20. [20]
    [PDF] Acoustic Profiles in Vocal Emotion Expression - Columbia University
    Professional actors' portrayals of 14 emotions varying in intensity and valence were presented to judges. The results on decoding replicate earlier findings ...
  21. [21]
    Perceptual cues in non-verbal vocal expressions of emotion - NIH
Banse and Scherer (1996) regressed acoustic parameters onto participants' use of the emotion categories in a forced-choice task with a range of different ...
  22. [22]
    Reduced sensitivity to emotional prosody in congenital amusia ...
More generally, music and speech are both auditory signals that acquire meaning through changes in attributes such as pitch, timing, intensity, and timbre.
  23. [23]
    Emotional Speech Processing at the Intersection of Prosody and ...
    Our data show that emotional speech cues produce robust congruency effects on decisions about an emotionally related face target.
  24. [24]
    Prosody and Semantics Are Separate but Not Separable Channels ...
Our aim is to explore the complex interplay of prosody (tone of speech) and semantics (verbal content) in the perception of discrete emotions in speech.
  25. [25]
    The expression and recognition of emotions in the voice across five ...
Dec 23, 2014 · A lens model analysis of fundamental acoustic properties examined patterns in emotional expression and perception within and across groups.
  26. [26]
    Preferential Amygdala Reactivity to the Negative Assessment ... - PMC
    These results suggest that the subjective perception of neutral and possibly emotionally ambiguous stimuli as aversive may modulate amygdala activity.
  27. [27]
    (PDF) Cultural differences in on-line sensitivity to emotional voices
Evidence that culture modulates on-line neural responses to the emotional meanings encoded by vocal and facial expressions was demonstrated.
  28. [28]
    Cross-cultural recognition of basic emotions through nonverbal ...
    We examined the recognition of nonverbal emotional vocalizations, such as screams and laughs, across two dramatically different cultural groups.
  29. [29]
  30. [30]
    Reexamining the neural network involved in perception of facial ...
    The ventral stream plays a key role in processing emotional facial expressions. · The left fusiform face area engages in bottom-up facial expression processing.
  31. [31]
    A short review on emotion processing: a lateralized network of ...
    Jul 3, 2021 · Emotion processing involves the coordinated activation of multiple large-scale neuronal networks encompassing both cortical and subcortical ...
  32. [32]
    What Visual Information Is Processed in the Human Dorsal Stream?
Jun 13, 2012 · The idea of a division between a dorsal and a ventral visual stream is one of the most basic principles of visual processing in the brain ...
  33. [33]
    The Fusiform Face Area: A Module in Human Extrastriate Cortex ...
Jun 1, 1997 · We found an area in the fusiform gyrus in 12 of the 15 subjects tested that was significantly more active when the subjects viewed faces than when they viewed ...
  34. [34]
    Emotional expressions evoke a differential response in the fusiform ...
    Oct 28, 2013 · It is widely assumed that the fusiform face area (FFA), a brain region specialized for face perception, is not involved in processing emotional ...
  35. [35]
    Functional Organization of Social Perception and Cognition in ... - NIH
    Jun 5, 2015 · The superior temporal sulcus (STS) is considered a hub for social perception and cognition, including the perception of faces and human ...
  36. [36]
    A Causal Role of the Right Superior Temporal Sulcus in Emotion ...
Dec 1, 2017 · These results support the causal role of the right pSTS in decoding information about others' emotional state from their body movements and gestures.
  37. [37]
    Lesions of the fusiform face area impair perception of facial ...
    Perception of facial configuration is impaired in patients with prosopagnosia whose lesions involve the right fusiform gyrus.
  38. [38]
    A modulatory role for facial expressions in prosopagnosia - PNAS
    Brain-damaged patients experience difficulties in recognizing a face (prosopagnosics), but they can still recognize its expression.
  39. [39]
    Emotion processing and the amygdala: from a 'low road' to 'many ...
A subcortical pathway through the superior colliculus and pulvinar to the amygdala is commonly assumed to mediate the non-conscious processing of affective ...
  40. [40]
    Hypothalamic-Pituitary-Adrenal (HPA) Axis: Unveiling the Potential ...
    Aug 23, 2024 · The hypothalamic-pituitary-adrenal (HPA) axis plays a pivotal role in the body's response to stress, orchestrating the release of ...
  41. [41]
    Cortisol responses enhance negative valence perception for ... - PMC
    Nov 8, 2017 · Elevations in cortisol were associated with more negative ratings of surprised faces, and with more direct response trajectories toward negative ...
  42. [42]
    Time-dependent effects of cortisol on selective attention ... - Frontiers
    Aug 27, 2012 · Whereas high circulating corticosteroid levels acutely increase emotional interference, possibly facilitating the detection of threats, a ...
  43. [43]
    The effect of chronic academic stress on attentional bias towards ...
    Oct 26, 2024 · Participants in the stress group showed higher levels of perceived stress, state anxiety, and negative affect compared with the control group.
  44. [44]
    Influence of the HPA Axis on Anxiety-Related Processes - PMC - NIH
    Aug 30, 2025 · Through a multidimensional lens, we review the literature on the link between anxiety-related processes, hypothalamic-pituitary-adrenal ...
  45. [45]
    The Role of the Amygdala in Regulating the Hypothalamic-Pituitary ...
    Jul 5, 2017 · Our findings showed that (1) lesions of the central amygdala inhibited the HPA axis responses to a variety of stressful stimuli.
  46. [46]
    Modulatory mechanisms of cortisol effects on emotional learning ...
    Jul 8, 2013 · Although no behavioral effects of cortisol were found, cortisol's slow effects reduced prefrontal and hippocampal responses, while no ...
  47. [47]
    Three-month-old infants show enhanced behavioral and neural ...
    Jan 28, 2020 · Infants' ability to perceive, discriminate, and interpret facial expressions of emotion is critical for infant-caregiver interaction and the ...
  48. [48]
  49. [49]
    Development of body emotion perception in infancy
    These results document a developmental change from discrimination based on non-emotional information at 3.5 months to recognition of body emotions at 5 months.
  50. [50]
    Infant-parent attachment: Definition, types, antecedents ... - PMC - NIH
Attachment is one specific aspect of the relationship between a child and a parent with its purpose being to make a child safe, secure and protected.
  51. [51]
    Emotion recognition development: Preliminary evidence for an effect ...
    While emotion recognition is shaped through social interactions from a child's early years through at least late adolescence, no emphasis has thus far been ...
  52. [52]
    Categorical emotion recognition from voice improves during ... - Nature
Oct 4, 2018 · Converging evidence demonstrates that emotion processing from facial expressions continues to improve throughout childhood and part of adolescence.
  53. [53]
    (PDF) Age, gender, and puberty influence the development of facial ...
    Jun 16, 2015 · In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in ...
  54. [54]
    Facial and Vocal Emotion Recognition in Adolescence: A Systematic ...
    Jun 13, 2023 · This systematic review aimed to clarify the pattern of recognition for facial and vocal emotion expressions, and the relationship of performance to different ...
  55. [55]
    Older adults' perception of social and emotional cues. - APA PsycNet
    Adult aging influences the decoding of social and emotional cues. Older adults perform worse than younger adults in labeling some types of emotional ...
  56. [56]
    Effects of aging on emotion recognition from dynamic multimodal ...
    Jan 29, 2021 · Age-related differences in emotion recognition have predominantly been investigated using static pictures of facial expressions, ...
  57. [57]
    Aging and Emotion Recognition: Not Just a Losing Matter - PMC
    Past studies on emotion recognition and aging have found evidence of age-related decline when emotion recognition was assessed by having participants detect ...
  58. [58]
    [PDF] The role of language in emotional development Holly Shablack and ...
    Next, we discuss how language acquisition throughout toddlerhood and early childhood leads to increased emotion understanding and more nuanced emotion ...
  59. [59]
    How the Emotional Environment Shapes the Emotional Life of ... - NIH
    This review explores how variation in children's received emotional input shapes their emotion understanding and their emotional behavior over the course of ...
  60. [60]
    Facial Emotion Recognition Trainings for Children and Adolescents
    Jul 9, 2025 · This study systematically reviewed and compared the efficacy of facial emotion recognition training programmes for autistic and nonautistic children and ...
  61. [61]
  62. [62]
    Culture shapes preschoolers' emotion recognition but not emotion ...
    Jan 11, 2022 · This study investigated the two aspects of preschoolers' emotion understanding, namely emotion recognition and emotion comprehension, in a cross-cultural ...
  63. [63]
  64. [64]
  65. [65]
    High achievers, Schadenfreude and Gluckschmerz in New ... - NIH
Cultural differences. Although some cultures may lack words for the emotions described as Schadenfreude and Gluckschmerz, it has been argued that these emotions ...
  66. [66]
    Emotion recognition deficits in children and adolescents with autism ...
    Jan 14, 2025 · Research consistently shows that children and adolescents with ASD struggle to recognize subtle or complex emotions, such as sadness or fear, ...
  67. [67]
    Anxiety and social deficits have distinct relationships with amygdala ...
    Results indicate that hypoactivation of amygdala in ASD, a suggestive finding first reported nearly 20 years ago, can be masked by comorbid anxiety.Mri Data Acquisition... · Group-Level Fmri Analyses · Per-Voxel Analysis
  68. [68]
    Multimodal emotion processing in autism spectrum disorders
    Children with ASD exhibited impaired emotion recognition performance for adult faces and child voices, with a subgroup displaying intact recognition. Latencies ...
  69. [69]
    Emotion Recognition Accuracy Among Individuals With ADHD
    Nov 30, 2024 · 58% of retrieved articles reported significantly decreased emotion recognition accuracy among individuals with ADHD relative to neurotypical ...
  70. [70]
    Positive Emotional Attention Bias in Young Children With Symptoms ...
    Jan 18, 2018 · Emotional attention biases are associated with a number of adverse socioemotional outcomes including reward sensitivity and externalizing ...
  71. [71]
    [PDF] Theory of Mind deficits in childhood mental and ...
    Research results indicate the presence of ToM deficits in childhood mental and neurodevelopmental disorders, such as: autism spectrum disorders, attention ...
  72. [72]
    Relationships between Sensory Processing and Executive ... - PMC
    Jun 2, 2024 · The heightened emotional problems observed in ASD+ADHD children may be associated with more prominent atypical sensory processing. Variance ...
  73. [73]
    Alexithymia & autism guide - Embrace Autism
    Jan 27, 2020 · 40–65% of autistics have alexithymia, a condition characterized by challenges with identifying & describing emotions in the self, and more.
  74. [74]
    An Exploratory Analysis of Alexithymia in Adults with Autism Utilising ...
    Apr 8, 2022 · ... emotionally stimulating scenarios and had less emotional granularity. Affective word use was correlated with ASD symptomatology but not with ...
  75. [75]
    Feasibility of internet-based multimodal emotion recognition training ...
    Jul 10, 2025 · The study found iMERAT feasible for adolescents with ASD, with increased emotion recognition, but the small sample size limits conclusions.Missing: percentage | Show results with:percentage
  76. [76]
    The Preliminary Efficacy of Emotion Regulation Skills Training for ...
    Aug 26, 2024 · Results indicated that SIERA was an efficacious intervention for autistic young adults in terms of improving emotion regulation skills.Missing: accuracy | Show results with:accuracy
  77. [77]
    Meta-analysis of emotion recognition deficits in major depressive ...
    Nov 14, 2014 · These findings suggest that the ER impairment reported in the depression literature exists across all basic emotions except sadness.Introduction · Results · Discussion
  78. [78]
    Facial emotion recognition in patients with depression compared to ...
    Apr 12, 2023 · Negative bias in the perception of others' facial emotional expressions in major depression: The role of depressive rumination. J. Nerv ...Results · Excessive Rumination And... · Data Collection Procedure
  79. [79]
    Mood-Related Negative Bias in Response to Affective Stimuli in ...
    Results: MD patients showed a broad impairment of emotion recognition. Patients' responses to happy faces suggested a negativity bias, which also became evident ...<|control11|><|separator|>
  80. [80]
    Meta-analysis of emotion recognition deficits in major depressive ...
    These findings suggest that the ER impairment reported in the depression literature exists across all basic emotions except sadness.Table 1 · Er In Mdd · Table 3
  81. [81]
    Actively Paranoid Patients with Schizophrenia Over Attribute Anger ...
    Previous investigations of the influence of paranoia on facial affect recognition in schizophrenia have provided conflicting results. Some studies support an ...
  82. [82]
    Impaired Facial Emotion Recognition in Individuals at Ultra-High ...
    Jun 25, 2020 · Patients with schizophrenia and individuals at ultra-high risk for psychosis (UHR) have been reported to exhibit impaired recognition of facial emotion ...
  83. [83]
    Facial and Prosodic Emotion Recognition Deficits Associate ... - NIH
    Jun 20, 2013 · Patients with schizophrenia perform significantly worse on emotion recognition tasks than healthy participants across several sensory modalities ...Patients · Emotion Recognition · Panss Dimension Scores As...<|separator|>
  84. [84]
    Exaggerated neurobiological sensitivity to threat as a mechanism ...
    Threat perception leads to activation of the hypothalamic-pituitary-adrenal (HPA) axis leading to increased release of the glucocorticoid hormone cortisol ...
  85. [85]
    Dopaminergic contribution to the regulation of emotional perception
    It may also be implicated in deficits in emotional recognition found in two major disorders where DA's implication is clear: Parkinson disease and schizophrenia ...
  86. [86]
    Antidepressant Treatment-Induced State-Dependent ... - Frontiers
    Jan 5, 2022 · These findings highlighted the state-dependent reconfiguration of emotion regulation networks in MDD patients owing to antidepressant treatment.
  87. [87]
    Interventions for deficits in recognition of emotions in facial ...
    These studies built some evidence that some pharmacological approaches may improve the accuracy in recognizing facial expressions and correcting some negative ...
  88. [88]
    Ekman-Friesen Pictures of Facial Affect Test—Computerized Version
    It consists of 60 full face, uncropped images, 10 each for the emotions of happy, sad, angry, fearful, disgusted, and surprised.
  89. [89]
    [PDF] The ''Reading the Mind in the Eyes'' Test Revised Version
    The Revised Eyes Test has improved power to detect subtle individual differences in social sensitivity. Keywords: Theory of mind, Asperger's Disorder, autistic ...
  90. [90]
    Review: Posed vs. Genuine Facial Emotion Recognition ... - Frontiers
    Jul 8, 2021 · One study found that adults are much more accurate at labeling emotions when the facial expression is posed than when it is spontaneous ( ...
  91. [91]
    Research Needs Spontaneous and Naturalistic Facial Expressions
    Jul 24, 2025 · We review evidence demonstrating that stimuli that are naturally- or spontaneously-elicited and/or appear genuinely emotional can produce different findings.
  92. [92]
    Morphing between expressions dissociates continuous from ... - PNAS
    Dec 3, 2012 · Experiment 2 used morphed images to determine whether different face regions have a continuous and categorical representation of emotion.
  93. [93]
    On the role of crossmodal prediction in audiovisual emotion perception
    Jul 17, 2013 · Cross-modal prediction likely contributes to the ease and efficiency with which others' emotions are recognized. One question that arises is ...
  94. [94]
    Cultural adaptation of the facial emotion perception test for use in ...
    May 28, 2025 · This study highlights the importance of culturally adapting cognitive performance tools that can potentially improve depression treatment outcomes in low- ...
  95. [95]
    Decoding the Nature of Emotion in the Brain - PMC - PubMed Central
    This review examines recent functional neuroimaging studies that use MVPA to investigate how emotions are reflected in distributed patterns of brain activity, ...
  96. [96]
    Common neural correlates of emotion perception in humans - PMC
    Abstract. Whether neuroimaging findings support discriminable neural correlates of emotion categories is a longstanding controversy.Missing: techniques | Show results with:techniques
  97. [97]
    Contributions of the Amygdala to Emotion Processing: From Animal ...
    Here, we review research on the role of the amygdala in emotional processes in both animal models and humans.
  98. [98]
    Specialization of amygdala subregions in emotion processing - PMC
    Apr 9, 2024 · Using functional magnetic resonance imaging, we show that the amygdala should be viewed as a group of heterogeneous subregions when processing ...2. Methods · 3. Results · 3.3. Enhanced Connectivity...
  99. [99]
    Amygdala fMRI—A Critical Appraisal of the Extant Literature - PMC
    Aug 13, 2024 · With the advent of fMRI in the early 1990s, the study of amygdala and emotion was no longer confined to animal models, as the early work on ...
  100. [100]
    The face-specific N170 component is modulated by emotional facial ...
    This study examines the time-course and topography of the influence of emotional expression on the N170 response to faces.
  101. [101]
    Beyond facial expressions: A systematic review on effects of ...
    The N170 is the most prominent electrophysiological signature of face processing. While facial expressions reliably modulate the N170, there is considerable ...
  102. [102]
    The N170: Understanding the Time Course of Face Perception in ...
    This chapter reviews the contribution of electromagnetic measures, mostly event-related potentials (ERPs), to our understanding of the time course of face ...The N1, The N170, And The... · The N170: A Tool To... · Notes
  103. [103]
    A Scoping Review of Functional Near-Infrared Spectroscopy (fNIRS ...
    Oct 28, 2025 · Functional Near-Infrared Spectroscopy (fNIRS) has emerged as a valuable tool to investigate cognitive and emotional processes during ...2. Methodology · 4. High Level Analysis · 6. Discussion
  104. [104]
    MEG Evidence for Dynamic Amygdala Modulations by Gaze and ...
    Sep 10, 2013 · Our results show that the amygdala is involved in the processing and integration of emotion and gaze cues from faces in different time ranges.
  105. [105]
    Localizing evoked and induced responses to faces using ... - PMC
    These findings help to establish that MEG beamforming can localize face-specific responses in time, frequency and space with good accuracy (when validated ...
  106. [106]
    Molecular Imaging of the Human Emotion Circuit - NCBI
    Nov 29, 2022 · In sum, targeting neurotransmitter mechanisms of emotions using PET is a powerful tool for dissecting the molecular mechanisms of emotions, ...Introduction · Molecular Imaging with... · The Dopamine System · Opioid System
  107. [107]
    Molecular Imaging of the Human Emotion Circuit - SpringerLink
    Nov 29, 2022 · In sum, targeting neurotransmitter mechanisms of emotions using PET is a powerful tool for dissecting the molecular mechanisms of emotions, ...The Dopamine System · Opioid System · Serotonergic System
  108. [108]
    Advances in Neuroimaging and Deep Learning for Emotion Detection
    Background/Objectives: The following systematic review integrates neuroimaging techniques with deep learning approaches concerning emotion detection.