
Facial expression

Facial expressions are the observable movements of the facial muscles that convey a person's internal emotional states, intentions, or social communications, serving as a fundamental form of non-verbal communication in social interactions. They typically involve changes in features such as the eyes, eyebrows, nose, mouth, and forehead, and are recognized across cultures for emotions including happiness, sadness, anger, fear, surprise, and disgust. The scientific study of facial expressions originated with Charles Darwin's 1872 publication The Expression of the Emotions in Man and Animals, which proposed an evolutionary basis for these displays, arguing they evolved from adaptive behaviors like "serviceable habits" (e.g., baring teeth in aggression for threat) and the "principle of antithesis" (e.g., contrasting submissive and dominant postures). Darwin emphasized their universality, suggesting that expressions are innate and shared across humans and other animals, challenging religious views of emotions as divinely created and supporting biological continuity among species. In the late 1960s, psychologist Paul Ekman advanced this work through cross-cultural experiments, demonstrating high agreement in emotion recognition from facial cues among diverse groups, including isolated preliterate societies in Papua New Guinea, thus confirming the existence of universal facial signals for core emotions.

Facial expressions play a critical role in social interaction by facilitating emotional contagion—where observers automatically mimic seen expressions, sharing positive affective states such as happiness more readily than negative ones—and by influencing cognition, for example enhancing memory for emotionally congruent information or guiding risk-averse decisions in response to fearful cues. Recognition accuracy can vary by context, with deficits observed in some clinical conditions, particularly for negative emotions, highlighting the importance of expressions in social bonding and empathy. Research tools, including Ekman's Facial Action Coding System (FACS), enable precise measurement of muscle actions (action units) to distinguish genuine from posed expressions, supporting applications ranging from clinical assessment to deception detection and automated emotion recognition.
While basic expressions appear innate, cultural norms can modulate their display and interpretation, blending universal signals with learned social rules.

Biological Foundations

Anatomy of Facial Muscles

The human face is equipped with approximately 43 skeletal muscles responsible for generating a wide array of expressions, most of which are innervated by the facial nerve (cranial nerve VII). These muscles are unique among skeletal muscles in that they primarily insert into skin rather than bone, allowing them to manipulate facial features for expressive movement. The facial nerve provides motor innervation to these muscles, enabling precise control over subtle movements. Facial muscles can be classified into core emotional expressors, which are associated with the basic emotions outlined by Paul Ekman—happiness, sadness, anger, fear, surprise, and disgust—and action units as defined in the Facial Action Coding System (FACS). Developed by psychologists Paul Ekman and Wallace V. Friesen in 1978, FACS provides a standardized, anatomically based method for coding visible facial movements into 44 action units, each corresponding to specific muscle activations or combinations. This system builds on Darwin's observations by linking muscle actions to universal emotional displays, facilitating objective analysis of expressions.

Key facial muscles contribute distinct actions to emotional expressions through their origins, insertions, and contractions. For instance, the orbicularis oculi originates from the medial orbital margin and lacrimal sac, inserting into the lateral palpebral raphe and tarsal plates; it closes the eyelids, as seen in winking or squinting during emotions like sadness or fear (FACS Action Unit 7). The zygomaticus major, originating from the lateral zygomatic bone and inserting at the modiolus (the fibrous hub at the mouth corner), elevates the corner of the mouth to produce smiling in happiness (FACS Action Unit 12). Similarly, the risorius arises from the deep facial fascia and parotid region, inserting into the modiolus and adjacent skin; it retracts the mouth angle laterally, contributing to smirking or expressions of disdain (FACS Action Unit 20).
Other primary muscles include the frontalis, part of the occipitofrontalis complex, which originates from the epicranial aponeurosis and inserts into the skin of the eyebrows; it raises the eyebrows and wrinkles the forehead, signaling surprise (FACS Action Unit 1). The levator labii superioris originates from the infraorbital region of the maxilla and inserts into the upper lip skin, elevating and everting the lip while deepening the nasolabial furrow, as in the sneer of disgust (FACS Action Unit 10). These muscles often work in concert; for example, coordinated activation of the frontalis and orbicularis oculi widens the eyes in surprise, while the levator labii superioris and zygomaticus major combine to form a contemptuous half-smile.
Muscle | Origin | Insertion | Primary Action | Example Expression (FACS AU)
Orbicularis oculi | Medial orbital margin, lacrimal sac | Lateral palpebral raphe, tarsal plates | Closes eyelids | Sadness/fear (AU 7)
Zygomaticus major | Lateral zygomatic bone | Modiolus at mouth corner | Elevates mouth corner | Happiness/smiling (AU 12)
Risorius | Deep facial fascia, parotid region | Modiolus, skin at mouth angle | Retracts mouth laterally | Smirking (AU 20)
Frontalis | Epicranial aponeurosis | Eyebrow skin | Raises eyebrows, wrinkles forehead | Surprise (AU 1)
Levator labii superioris | Infraorbital maxilla | Upper lip skin | Elevates/everts upper lip | Disgust (AU 10)
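The muscle-to-action-unit mapping above can be organized as a small lookup structure. The sketch below is illustrative only; the AU codes and descriptions are taken from the table, while the dictionary layout and function name are hypothetical:

```python
# Illustrative mapping of selected FACS action units (AUs) to the
# muscles and actions described in the table above.
FACS_AUS = {
    1:  {"muscle": "frontalis", "action": "raises eyebrows, wrinkles forehead"},
    7:  {"muscle": "orbicularis oculi", "action": "closes eyelids"},
    10: {"muscle": "levator labii superioris", "action": "elevates/everts upper lip"},
    12: {"muscle": "zygomaticus major", "action": "elevates mouth corner"},
    20: {"muscle": "risorius", "action": "retracts mouth laterally"},
}

def describe_aus(aus):
    """Return muscle descriptions for a set of observed action units."""
    return [f"AU {au}: {FACS_AUS[au]['muscle']} ({FACS_AUS[au]['action']})"
            for au in sorted(aus) if au in FACS_AUS]

# A Duchenne-style smile combines mouth-corner raising with eye involvement:
print(describe_aus({12, 7}))
```

A real FACS coding session also records intensity grades (A-E) per action unit, which a fuller structure would carry alongside the muscle labels.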

Neural Control Mechanisms

The facial nucleus, located in the pons of the brainstem, serves as the primary motor nucleus for the facial nerve (cranial nerve VII), containing lower motor neurons that directly innervate the muscles of facial expression. These neurons are organized somatotopically, with the medial division controlling midline and upper facial muscles (e.g., frontalis and orbicularis oculi) and the lateral division managing lower facial muscles (e.g., zygomaticus major). The nucleus receives inputs from both cortical and subcortical structures, enabling a distinction between voluntary and involuntary expressions: voluntary movements are primarily driven by descending projections from the primary motor cortex (M1), ventral premotor cortex, and supplementary motor area via the corticobulbar tract, while involuntary emotional expressions involve subcortical pathways from limbic structures such as the amygdala. The medial (upper-face) division receives bilateral corticobulbar innervation, allowing voluntary control of the upper face even after unilateral cortical damage, whereas the lower facial muscles receive predominantly contralateral input, leading to asymmetries in voluntary expressions following hemispheric lesions. In contrast, emotional expressions bypass direct cortical routes through extrapyramidal pathways, including projections from the central nucleus of the amygdala to the facial nucleus via brainstem relays, facilitating rapid, spontaneous responses to affective stimuli. For instance, the basal ganglia-limbic system plays a key role in initiating spontaneous smiles during positive emotional contexts, integrating sensory and motivational inputs to coordinate muscle activity without conscious intent. A prominent example of this is observed in smiles: genuine Duchenne smiles, which engage the orbicularis oculi to crinkle the eyes, are involuntarily driven by the limbic system, reflecting authentic positive affect, whereas social or posed smiles rely on voluntary cortical control and typically spare the upper face.
Neurotransmitters modulate these circuits; dopamine, released from midbrain nuclei such as the ventral tegmental area, enhances reward-related expressions by influencing limbic-motor pathways, promoting affiliative behaviors such as smiling in response to pleasurable stimuli. Basic neural circuits for facial expression control can be outlined as follows:
- Voluntary pathway: motor and premotor cortex → corticobulbar tract → facial nucleus → facial nerve → facial muscles.
- Emotional pathway: amygdala and other limbic structures → extrapyramidal brainstem pathways → facial nucleus → facial nerve → facial muscles.
These pathways ensure adaptive expression of both intentional and affective states.

Expressive Asymmetries

Facial expressive asymmetries refer to differences in the intensity or timing of emotional displays between the left and right sides of the face, often manifesting as stronger expressions on the left hemiface. This phenomenon is attributed to the brain's hemispheric lateralization, particularly the right hemisphere's dominant role in processing and producing emotions. Due to contralateral neural control, the right hemisphere primarily innervates the left side of the face, leading to more robust activation of facial muscles on that side during emotional expression. Studies using chimeric face techniques, where left and right hemifaces are combined, have demonstrated that the left hemiface conveys emotions more intensely, supporting right hemisphere dominance for both positive and negative emotions. Seminal research on split-brain patients, such as that conducted by Roger Sperry and colleagues in the 1960s and 1970s, provided early evidence for this neurological basis. In these individuals with a severed corpus callosum, stimuli presented to the right visual field (processed by the left hemisphere) elicited symmetrical or right-biased facial responses, whereas left visual field stimuli (right hemisphere) produced stronger left-sided expressions, particularly for spontaneous emotional reactions. This asymmetry is more pronounced for negative emotions, where the left hemiface shows greater muscular involvement compared to the right. Electromyography (EMG) studies have quantified this, revealing higher amplitude in zygomaticus and corrugator muscles on the left side during emotional tasks.

These asymmetries have implications for distinguishing genuine from posed expressions. Genuine expressions, driven by subcortical pathways, exhibit greater left-sided intensity, as seen in EMG recordings where spontaneous smiles show significant left oral asymmetries, unlike more symmetrical posed smiles. For instance, in deception detection, posed expressions often lack the natural left bias, with quantitative measures indicating up to 20-30% greater left hemiface activation in authentic displays. Such patterns aid in forensic and psychological assessments of emotional veracity.
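Percentage figures like the 20-30% above are typically derived from a laterality index over left- and right-hemiface muscle activity. A minimal sketch, assuming the common (L - R) / (L + R) convention (the amplitude values are arbitrary illustrations):

```python
def laterality_index(left_amp, right_amp):
    """Asymmetry of muscle activation between hemifaces.

    Positive values indicate stronger left-hemiface activation.
    The (L - R) / (L + R) normalization is a common convention,
    assumed here; other studies report simple percentage differences.
    """
    return (left_amp - right_amp) / (left_amp + right_amp)

# A spontaneous smile with ~25% greater left zygomaticus amplitude:
li = laterality_index(1.25, 1.0)  # illustrative EMG amplitudes
print(round(li, 3))
```

A posed smile with near-equal amplitudes would yield an index near zero, which is the contrast deception-detection analyses exploit.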
Expressive asymmetries appear relatively consistent across diverse populations, suggesting a biological foundation, though subtle cultural variations exist. For example, while Western samples typically show a left hemiface bias, some East Asian groups display reduced or reversed asymmetries in certain positive expressions, possibly influenced by cultural display rules. Overall, the phenomenon has been observed in studies from North America, Europe, and East Asia, underscoring its prevalence.

Perception and Recognition

Neural Pathways in Face Processing

The processing of facial expressions in the human brain involves distinct neural pathways that handle different aspects of visual information, primarily through the ventral and dorsal streams of the visual cortex. The ventral stream, often referred to as the "what" pathway, is crucial for configural processing of faces and expressions, integrating holistic features such as spatial relationships between facial components to recognize identities and emotional states. Within this stream, the fusiform face area (FFA), located in the lateral fusiform gyrus, plays a specialized role in encoding invariant aspects of faces, including subtle expressive cues that distinguish emotions such as fear or happiness from neutral configurations. In contrast, the dorsal stream, or "where/how" pathway, supports the perception of dynamic changes in facial expressions, such as transient movements during emotional displays, by processing motion and temporal sequences in regions like the superior temporal sulcus (STS) and posterior parietal areas. This division allows for efficient segregation of static form from kinetic elements, enhancing the interpretation of evolving expressions in social contexts.

A key component in the rapid detection of emotional salience in facial expressions is the amygdala, which facilitates quick assessment through a subcortical pathway that bypasses primary cortical processing. This low-road route, involving projections from the visual thalamus directly to the amygdala, enables near-instantaneous responses to fear expressions, as demonstrated in animal models where visual input triggers defensive behaviors within milliseconds. Joseph LeDoux's 1996 framework highlights how this pathway prioritizes survival-relevant stimuli, such as wide-eyed fear faces, allowing emotional appraisal before full conscious perception. Functional imaging confirms amygdala activation for unseen or masked fearful expressions, underscoring its role in pre-attentive emotional vigilance. The mirror neuron system further contributes to the interpretation of facial expressions by linking observation to internal simulation, particularly in the premotor cortex and inferior parietal regions.
These neurons activate both when individuals produce an expression and when they observe it in others, fostering empathy through motor resonance and facilitating the recognition of emotional displays. This system supports social understanding by mapping observed expressions onto one's own emotional repertoire, as evidenced by enhanced premotor activity during the viewing of congruent facial actions.

Neuroimaging studies, particularly using functional magnetic resonance imaging (fMRI), have revealed distinct activation patterns for basic emotions in facial expressions, providing empirical support for specialized neural coding. For instance, the insula shows robust activation in response to disgust expressions, integrating visceral sensations with observed facial cues to evoke shared affective states. Broader fMRI evidence indicates category-specific responses across emotions—such as the amygdala for fear, the orbitofrontal cortex for anger, and temporal regions for happiness—enabling differential decoding of expressive intent. Recent advancements in the 2020s have extended this to real-time decoding techniques, in which machine learning applied to fMRI or EEG data achieves high-accuracy classification of dynamic expressions, identifying regions such as the insula for broad emotional and conversational decoding in naturalistic settings. These methods leverage multivariate pattern analysis to reconstruct expressive trajectories, advancing applications in affective neuroscience.
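The multivariate pattern analysis mentioned above can be illustrated with a toy nearest-centroid classifier: an observed activation pattern is assigned the emotion whose average pattern it most resembles. All feature vectors, centroid values, and region weightings below are invented for illustration, not drawn from any study (real decoding operates over thousands of voxels or channels):

```python
import math

# Toy stand-in for multivariate pattern analysis (MVPA): classify an
# "activation pattern" by its nearest class centroid. The three
# hypothetical features loosely echo the region-emotion associations
# described in the text (amygdala/fear, insula/disgust, temporal/happiness).
CENTROIDS = {
    "fear":      (0.9, 0.2, 0.1),
    "disgust":   (0.2, 0.9, 0.1),
    "happiness": (0.1, 0.2, 0.9),
}

def decode(pattern):
    """Return the emotion label whose centroid is closest in Euclidean distance."""
    return min(CENTROIDS, key=lambda label: math.dist(pattern, CENTROIDS[label]))

print(decode((0.8, 0.3, 0.2)))  # closest to the "fear" centroid
```

Published pipelines typically replace the centroid rule with cross-validated linear classifiers, but the geometric intuition is the same.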

Influence of Gender and Contextual Cues

Research indicates that gender influences both the production and perception of expressions, with women often displaying more overt emotional signals than men. A large-scale analysis of facial behaviors during video viewing revealed that women exhibited smiles in 26% of instances versus 19.7% for men, and inner brow raises (associated with positive or sad expressions) more frequently than men, suggesting greater overall expressivity in positive and certain negative emotions. Conversely, men showed higher frequencies of brow furrows, linked to anger, indicating subtler cues in other domains but stronger signals of dominance-related emotions.

Perceptual biases further modulate how facial expressions are interpreted based on the expresser's gender. Observers tend to associate anger with male faces and happiness or surprise with female faces, leading to faster and more accurate recognition of congruent pairings due to both perceptual (morphological) and decisional (stereotypical) factors. For instance, sadness is more readily perceived on female faces, while anger is prioritized on male ones, reflecting entrenched gender stereotypes that amplify emotional attributions to women. These biases persist even in controlled tasks, where female faces are rated as conveying more intense emotions overall.

Contextual cues, such as surrounding scenes or body posture, significantly alter the interpretation of ambiguous facial expressions. In studies replicating the Kuleshov effect, neutral faces were rated as more fearful when preceded by fear-inducing contexts (e.g., a threatening scene) compared to neutral ones, with ratings shifting by roughly 0.79 to 0.93 units on the scales used. Similarly, congruent body postures enhance neural responses to matching facial emotions, as evidenced by amplified N170 event-related potentials for aligned fearful faces and fearful bodies. The same expression can thus shift in perceived meaning depending on situational factors like background scenes or accompanying gestures.
Evolutionary theories propose that these gender patterns stem from adaptive roles in signaling vulnerability or dominance. Women's enhanced expressivity and sensitivity to emotions may derive from historical child-rearing responsibilities, promoting attachment through rapid decoding of infant cues and protection against threats, as supported by faster recognition of both positive and negative expressions. Men, in contrast, may signal dominance via anger displays to deter rivals, aligning with reproductive strategies that emphasize status over relational bonding. Empirical studies using eye-tracking highlight gendered gaze preferences that underpin these perceptual differences: women direct more fixations and longer dwell time to the eyes during expression recognition, correlating with higher accuracy and speed, which helps explain their advantage in decoding subtle cues. Recent research extending to gender diversity shows that transgender faces elicit distinct emotional attributions: transgender male faces are perceived as angrier and less fearful than cisgender male faces, while the reverse holds for female counterparts, indicating how non-binary and transgender identities introduce variability in expression interpretation beyond binary norms.

Communicative Roles

Functions in Nonverbal Signaling

Facial expressions serve as a primary form of nonverbal communication, conveying emotional states and intentions to facilitate social interactions among humans. Charles Darwin's seminal 1872 work, The Expression of the Emotions in Man and Animals, laid the foundational understanding by proposing that these expressions evolved through mechanisms such as serviceable associated habits—where originally useful actions become habitual signals—and the principle of antithesis, where opposing emotions produce contrasting movements to enhance clarity in signaling. Darwin argued that expressions like raised eyebrows in surprise or frowning in displeasure originated from physiological needs, such as rapid environmental scanning or eye protection, and function to communicate internal states across species and cultures, promoting survival through social coordination.

Building on Darwin's ideas, psychologist Paul Ekman identified six basic emotions—happiness, sadness, fear, anger, surprise, and disgust—each associated with distinct facial expressions that signal specific adaptive functions in nonverbal exchanges. Happiness, marked by a smile involving the zygomaticus major and often the orbicularis oculi muscles, signals affiliation and positive social intent, encouraging reciprocity and bonding. Sadness, characterized by downturned mouth corners and oblique eyebrows, communicates loss or vulnerability, eliciting sympathy and support from others. Fear, with widened eyes and raised eyebrows, warns of potential danger, prompting collective vigilance or flight responses. Anger, featuring furrowed brows and tightened lips, indicates a perceived threat or provocation, asserting dominance or preparing for confrontation. Surprise, involving raised eyebrows and an open mouth, highlights novelty, directing attention to unexpected events. Disgust, displayed through a wrinkled nose and upper lip raise, signals aversion to contaminants or moral offenses, warning others to avoid harm. These expressions are rapid and automatic, allowing efficient transmission of emotional information without verbal cues.
Individuals often regulate facial expressions to align with social norms, a process governed by display rules that modify raw displays for contextual appropriateness. Cultural display rules, as conceptualized by David Matsumoto, include masking—replacing a true emotion with a false one, such as smiling during distress to maintain politeness; intensification—amplifying an expression, like exaggerating sadness at a funeral to show respect; and neutralization—suppressing emotions to a blank face, exemplified by the "poker face" in high-stakes negotiations to conceal intentions. These strategies develop early in life and vary by culture, with collectivist societies more likely to emphasize masking for group cohesion, while individualist ones permit greater intensification of positive emotions. Such regulation prevents social friction but can complicate accurate interpretation if not aligned with shared norms.

The adaptive value of facial expressions lies in their role in enhancing social outcomes, including cooperation, deception detection, and rapport-building. Smiles, particularly genuine Duchenne smiles, act as costly signals of trustworthiness, occurring 2-3 times per minute in interactions and promoting cooperation by fostering affiliation and resource sharing in long-term relationships. In competitive scenarios, regulated expressions aid bluffing by concealing true feelings, though familiar observers detect inconsistencies more effectively, providing an evolutionary check against manipulation. Overall, these signals build trust by conveying warmth and sociability—women, for instance, smile more frequently and decode expressions better, enhancing group cohesion and positive social perceptions. By linking emotional states to interactive outcomes, facial expressions support evolutionary fitness through improved social navigation.
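The three regulation strategies (masking, intensification, neutralization) can be illustrated as transformations applied to a raw expression. The representation below, an emotion label plus a 0-1 intensity, and the function name are hypothetical conveniences, not a model from the literature:

```python
# Illustrative encoding of display rules as transforms on a raw
# expression, represented as (emotion_label, intensity in [0, 1]).
def apply_display_rule(expression, rule):
    emotion, intensity = expression
    if rule == "mask":          # replace the felt emotion with a smile
        return ("happiness", intensity)
    if rule == "intensify":     # amplify the display, capped at full intensity
        return (emotion, min(1.0, intensity * 1.5))
    if rule == "neutralize":    # suppress to a blank face
        return (emotion, 0.0)
    return expression           # no regulation applied

print(apply_display_rule(("sadness", 0.6), "mask"))        # ('happiness', 0.6)
print(apply_display_rule(("sadness", 0.6), "neutralize"))  # ('sadness', 0.0)
```

The masking case shows why observers can be misled: the displayed label changes while the underlying felt emotion does not.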

Integration with Eye Contact

Facial expressions often gain enhanced communicative power through integration with eye contact, where gaze direction and mutual engagement modulate the interpretation of emotional signals. Direct gaze paired with a smile, for instance, conveys warmth and approachability, intensifying the positive valence of the expression, while averted gaze accompanying a frown signals appeasement or submission, reducing perceived emotional intensity. This synergy allows for nuanced social signaling, as mutual gaze directs attention to the face and amplifies the observer's emotional response to the expression.

Cultural norms significantly influence how eye contact interacts with facial expressions. In Western cultures, prolonged eye contact with a neutral or serious expression is often interpreted as a display of dominance or confidence, reinforcing credibility in interactions. Conversely, in East Asian cultures, direct gaze paired with neutral expressions is often interpreted more negatively, such as angrier or more unapproachable, potentially causing discomfort or signaling disrespect, while indirect eye contact is preferred to maintain harmony and show deference. These differences highlight how gaze regulation adapts facial expressions to context-specific social expectations.

Neurologically, the superior temporal sulcus (STS) plays a central role in integrating gaze and facial expression cues to facilitate social inference. The posterior STS region processes dynamic facial features, such as gaze shifts and emotional expressions, enabling the observer to interpret combined signals for understanding others' intentions and mental states. Activation in this area increases when gaze is directed toward the observer alongside expressive faces, supporting rapid social evaluation.

A foundational framework for this integration is provided by Michael Argyle and Janet Dean's 1965 equilibrium theory, which posits that individuals maintain a balanced level of intimacy in interactions by regulating eye contact and proximity in relation to expressions and other nonverbal cues. According to the theory, high intimacy—signaled by smiling and direct gaze—prompts closer physical distance, while discomfort from excessive eye contact leads to gaze aversion to restore equilibrium.
This model underscores how gaze dynamically adjusts to expressions to manage social intimacy without overwhelming the interactants.

Role in Sign Languages

In sign languages, facial expressions function as essential non-manual markers that convey grammatical and semantic information, distinguishing them from mere emotional displays. These markers include specific configurations of the eyebrows, eyes, mouth, and head, which are temporally aligned with manual signs to form complete utterances. For instance, in American Sign Language (ASL), raised eyebrows signal yes/no questions or conditional clauses, while a head shake combined with a furrowed brow or pursed lips indicates negation. Such expressions are obligatory elements of the grammar, ensuring grammaticality, unlike in spoken languages where prosodic intonation is optional for emphasis.

Mouth shapes, or mouth morphemes, further integrate into grammar by modifying the intensity, manner, or size associated with manual signs, often serving adverbial roles. In ASL, puffed cheeks (often glossed "oo") adverbially intensify signs to mean "very large" or "to an amazing degree," as in signing "BIG" with puffed cheeks to emphasize enormity. Similarly, tongue protrusion (glossed "th") conveys sloppiness or carelessness, such as in "WRITE/th" to describe writing messily. These non-manual features are not borrowed from spoken-language mouthing but are native grammatical components, with restrictions on their co-occurrence with certain manual signs. In British Sign Language (BSL), head tilts paired with neutral facial expressions mark topics, shifting focus within discourse, while eye gaze and cheek puffs distinguish lexical items or manners like "relaxed."

Linguistic research highlights the prosodic nature of these facial markers across sign languages. Ronnie Wilbur's studies in the 1980s and 1990s demonstrated how non-manuals structure prosody in ASL, such as brow raises aligning with syntactic boundaries and varying with signing speed to maintain grammatical timing. Her work extended to comparative analyses, showing similar patterns in other languages like BSL and Langue des Signes Française (LSF), where furrowed brows mark wh-questions and head nods reinforce affirmations.
These markers differ fundamentally from spoken language prosody by being visually obligatory and spatially precise, enabling signers to embed multiple layers of meaning without additional words.
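The obligatory temporal alignment of a non-manual marker with a clause can be modeled as a constraint over sign spans. The sketch below is a deliberate simplification with hypothetical glosses and marker names, not a grammar of ASL:

```python
# Sketch of non-manual markers as an obligatory layer temporally
# aligned with manual signs. A yes/no question is well-formed only if
# raised brows span the entire clause (a simplified stand-in for the
# real alignment conditions described in the literature).
def is_yes_no_question(utterance):
    span = utterance["nonmanual"].get("raised_brows")
    last_sign = len(utterance["signs"]) - 1
    return span is not None and span == (0, last_sign)

question = {
    "signs": ["YOU", "GO", "STORE"],        # manual tier (glosses)
    "nonmanual": {"raised_brows": (0, 2)},  # marker covers signs 0 through 2
}
statement = {
    "signs": ["YOU", "GO", "STORE"],
    "nonmanual": {},                        # no brow raise: plain declarative
}

print(is_yes_no_question(question))   # marker spans the whole clause
print(is_yes_no_question(statement))  # missing marker, not a question
```

The same tiered representation extends naturally to negation (head shake spans) and topic marking (head tilt spans), which is how sign-language annotation tools typically encode non-manuals.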

Universality and Cultural Dimensions

Evidence Supporting Universality

One of the foundational lines of evidence for the universality of facial expressions comes from Paul Ekman's cross-cultural studies in the 1960s, particularly his fieldwork with the isolated Fore people in Papua New Guinea. These participants, who had limited exposure to external media or Western influences, demonstrated high accuracy in recognizing posed facial expressions of basic emotions such as happiness, sadness, anger, fear, surprise, and disgust, with rates typically ranging from 70% to 90% in forced-choice tasks where they matched expressions to emotional scenarios or labels. Similar results were observed when Fore individuals produced their own expressions in response to stories, which were then accurately identified by American observers, supporting bidirectional recognition across isolated and literate groups.

Subsequent methodologies, including forced-choice tasks (where participants select from predefined emotion labels) and free-labeling experiments (where they describe expressions openly), have reinforced these findings. Meta-analyses of over 90 studies involving diverse populations confirm consistent recognition above chance levels, with happiness achieving near-ceiling accuracy (often exceeding 90%) and the other basic emotions showing robust consistency (around 70-80%) across both urban and remote settings. These approaches minimize linguistic biases and highlight that emotional signals, like the raised cheeks and crow's feet wrinkles in happiness or the nose wrinkle in disgust, are interpreted similarly regardless of cultural background.

Neuroscientific evidence further bolsters universality through functional magnetic resonance imaging (fMRI) studies showing comparable activation in response to emotional faces across cultural groups. For instance, both Asian and European participants exhibit bilateral amygdala engagement when viewing happy and fearful expressions, indicating a shared neural substrate for processing these signals independent of cultural origin. This subcortical response, often rapid and automatic, aligns with the behavioral patterns observed in recognition tasks.
Recent replications from the 2010s and 2020s extend this support to both industrialized and remote populations, with studies showing accuracies for basic emotions often above chance and up to 70-85% in ecologically distinct contexts, including Cowen et al. (2021), which identified 16 facial expressions occurring in similar contexts worldwide. Additionally, machine-learning models trained on diverse datasets have validated universal patterns, with deep neural networks showing high correlation (r = 0.80) in decoding emotion meanings from facial movements across six countries, including Ethiopia, underscoring consistent perceptual structures.

Cultural Variations and Criticisms

Cultural display rules refer to socially learned norms that govern how individuals manage and modify their facial expressions in specific contexts, allowing for both universal emotional signals and culture-specific modifications. Paul Ekman and Wallace Friesen introduced this concept to explain variations in emotional display, such as the suppression or intensification of expressions based on social expectations. For instance, in many Western cultures, individuals openly display negative emotions like anger or frustration in public, whereas in many East Asian cultures, negative expressions are often masked or subdued in the presence of authority figures or during social interactions to maintain harmony. These rules can involve de-intensifying, neutralizing, or masking emotions, leading to observable differences in facial behavior across cultures without altering the underlying emotional experience.

Cultural differences also extend to the interpretation of facial expressions, where the same expression may convey varying meanings depending on contextual and cultural norms. Studies by Klaus Scherer and Harald Wallbott in the 1980s and 1990s demonstrated that while basic patterns of emotional response show some universality, the appraisal and labeling of expressions differ significantly across cultures, influencing how emotions are perceived. For example, a smile in East Asian contexts often signals embarrassment or social politeness rather than genuine happiness, as seen in comparisons between Japanese and American participants, whereas in Western cultures it more consistently denotes joy or amusement. These variances arise from culture-specific event interpretations and social scripts, affecting the decoding of expressions in real-world scenarios.

Criticisms of the universality hypothesis, which posits consistent recognition of basic emotions across cultures, highlight an overemphasis on discrete basic emotions while neglecting blended or context-dependent expressions.
Methodological biases, such as reliance on posed expressions and samples from Western, Educated, Industrialized, Rich, and Democratic (WEIRD) populations, limit generalizability, as these groups represent only a fraction of global diversity and may not reflect expressive norms in non-WEIRD societies. Recent ethnographic research in the 2020s on indigenous groups, including small-scale societies like the Maniq of southern Thailand, reveals further variations; for instance, their descriptions of facial movements emphasize social functions over internal states, challenging assumptions of universal emotional categories. Such studies underscore gaps in prior work, advocating for more inclusive, naturalistic approaches to capture cultural specificities.

The dialect theory of facial expressions posits that cultural variations function like dialects of a universal emotional language, where core signals remain recognizable but are accented by local norms in production and recognition. Ursula Hess and colleagues provided empirical support through experiments showing that participants from different cultural groups activate distinct facial muscle combinations for the same posed expressions, yet these "dialects" are still partially intelligible across groups. This framework reconciles universality—evidenced by recognition rates above chance—with cultural variations, suggesting expressions evolve as culturally tuned adaptations of innate patterns. Micro-expressions, brief involuntary flashes of emotion lasting under half a second, are often cited as more resistant to cultural modulation due to their automatic nature, though dialect theory implies subtle interpretive differences may still arise in diverse contexts.

Evolutionary Origins and Adaptations

Charles Darwin proposed the principle of continuity in emotional expression, positing that human facial expressions share a homologous origin with those of other animals, reflecting a shared evolutionary heritage. In his seminal 1872 work, The Expression of the Emotions in Man and Animals, Darwin argued that specific expressions, such as the baring of teeth in response to anger, serve as precursors to similar displays in nonhuman primates, where they function as threat signals during agonistic encounters. This continuity underscores how emotional behaviors evolved gradually across species, with human expressions retaining vestiges of these ancient communicative forms.

Facial expressions have adaptive value as honest signals of internal emotional states, enhancing survival by conveying reliable information in social interactions. For instance, the widening of the eyes in fear expressions not only signals danger but also physiologically improves threat detection by expanding the visual field, a mechanism that reduces predation risk in ancestral environments. Costly signaling models further support this, demonstrating that such signals persist because their production incurs physiological costs—such as energy expenditure or vulnerability exposure—that deter faking, thereby stabilizing honest signaling in groups. These models predict that expressions evolve when the benefits of honest communication outweigh the risks, as seen in simulations of repeated social dilemmas where genuine displays foster trust and reciprocity.

Comparative and fossil evidence reinforces the evolutionary foundations of facial expressivity. Studies of nonhuman primates reveal a high degree of overlap—approximately 80%—in the basic facial expressions for states such as fear, anger, and play between humans and species such as chimpanzees, indicating conserved neural and muscular substrates from a common ancestor.
Fossil records of skulls, from early hominins to Homo sapiens, document progressive changes such as facial flattening and reduced robusticity, which enlarged the area available for facial musculature and enabled a broader range of nuanced expressions critical for complex social bonding as group sizes expanded. These anatomical shifts, occurring between roughly 2 million and 300,000 years ago, correlate with increased encephalization and the demands of cooperative hunting and group living. Recent genomic and computational advances provide modern insights into these origins. The FOXP2 gene, under positive selection in humans, regulates orofacial motor control and the fine motor coordination essential for articulate facial movements; variation of the gene in great apes highlights its role in the evolution of expressive capabilities beyond basic signals. Simulations from the 2020s, incorporating agent-based models of social interaction, demonstrate how selection pressures for emotional signaling in group-living scenarios could generate the diversity of human expressions observed today, predicting that costly honest signals emerge stably in environments with high interdependence. These findings bridge classical Darwinian theory with quantitative evolutionary dynamics, affirming the adaptive primacy of expressions in the human lineage.
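The costly-honest-signaling logic above can be sketched as a toy repeated-interaction model. The payoff values and the detection rule below are illustrative assumptions, not parameters from any published model:

```python
# Toy sketch of costly honest signaling in repeated interactions
# (illustrative assumptions only): a genuine cooperative signal costs c
# to produce but earns benefit b each round the partner reciprocates,
# while a deceptive signal earns a one-shot temptation payoff t before
# the partner detects the defection and withdraws cooperation.

def lifetime_payoff(honest: bool, rounds: int,
                    b: float = 3.0,   # benefit per reciprocated round
                    c: float = 1.0,   # physiological cost of a genuine signal
                    t: float = 4.0) -> float:  # one-shot gain from faking
    payoff, trusted = 0.0, True
    for _ in range(rounds):
        if honest:
            payoff += b - c      # costly but reliable signal keeps cooperation
        elif trusted:
            payoff += t          # deceptive signal pays once...
            trusted = False      # ...then cooperation is withdrawn
    return payoff

print(lifetime_payoff(honest=True, rounds=10))   # 20.0
print(lifetime_payoff(honest=False, rounds=10))  # 4.0
```

With many repeated rounds (high interdependence) honesty dominates, whereas in a one-shot encounter (rounds=1) the deceptive payoff t wins, mirroring the prediction that costly honest signals stabilize only in repeated, interdependent settings.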

Developmental and Clinical Aspects

Emergence in Human Development

Facial expressions begin to emerge in infants through a series of reflexive and instinctive responses present from birth. Newborns exhibit basic distress signals such as crying, which serves as a primary communicative tool to elicit caregiving and support survival needs. Reflexive smiles, often brief and stimulus-driven rather than socially directed, can appear within hours of birth, typically in response to internal states like gas or satiation, marking an early precursor to more intentional expressions. These initial expressions are innate and biologically driven, with studies demonstrating that even very young infants can imitate simple adult gestures, such as tongue protrusion or mouth opening, as early as 12 to 21 days old, suggesting an inborn capacity for imitation. By 6 to 8 weeks of age, these reflexive patterns evolve into social smiles, which are directed toward familiar faces, particularly during face-to-face interactions with caregivers, signaling pleasure and fostering social bonds. This transition, often termed the "2-month shift," coincides with neurophysiological maturation and increased visual tracking ability, enabling infants to respond contingently to social stimuli. Basic emotional expressions, such as joy (through full smiles and cooing) and distress (intensified crying or frowning), become more differentiated and frequent between 3 and 6 months, as infants gain greater motor control over facial muscles and begin to associate expressions with environmental contexts. More complex, self-conscious emotions, including blends like embarrassment or pride, emerge around 18 to 24 months, tied to advancing cognitive self-awareness and self-regulation. The development of these expressions is profoundly shaped by attachment dynamics, as outlined in John Bowlby's foundational work from the 1960s, which posits that secure caregiver-infant bonds provide a safe base for exploring and expressing emotions through facial cues.
Infants in secure attachments display more varied and adaptive facial expressions during interactions, as caregivers' responsive mirroring reinforces emotional signaling and regulation. Critical periods, such as the mirror stage described by Jacques Lacan as occurring between 6 and 18 months, further contribute to this progression; during this phase, infants come to recognize their reflected image, facilitating self-other distinction and the imitation of facial expressions observed in caregivers or mirrors, which enhances the expressive repertoire and social learning. Longitudinal studies across diverse cultures highlight both universal patterns and subtle cultural modulations in this developmental trajectory. For instance, research tracking German and Cameroonian Nso infants from 6 to 12 weeks found that social smiling emerges universally around the 2-month mark, driven by shared biological maturation, yet its frequency and duration by 12 weeks vary with cultural interaction styles, more prolonged in independent contexts such as Germany than in interdependent ones such as rural Cameroon, indicating early tuning to social norms. Similarly, cross-cultural analyses in Uganda, the United Kingdom, and other regions confirm that basic expressions like joy and distress follow a consistent ontogenetic sequence, while caregiver responsiveness introduces cultural specificity in expressive intensity and context-appropriate display. These findings underscore the interplay of innate universals with environmentally influenced refinement throughout early development.
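The developmental timeline above lends itself to a simple lookup structure. The sketch below is an organizational aid only; the age boundaries follow the text, and the function name is an assumption:

```python
# Compact lookup of expressive milestones by age, as described in the
# text above. Onset ages (in months) and labels are drawn from the
# developmental sequence; the API itself is illustrative.

MILESTONES = [
    (0.0,  "reflexive smiles and distress crying (from birth)"),
    (1.5,  "social smiles toward familiar faces (~6-8 weeks)"),
    (3.0,  "differentiated joy and distress displays (3-6 months)"),
    (18.0, "self-conscious blends such as embarrassment (18-24 months)"),
]

def latest_milestone(age_months: float) -> str:
    """Return the most recently reached expressive milestone for an age."""
    reached = [label for onset, label in MILESTONES if age_months >= onset]
    return reached[-1]

print(latest_milestone(2))    # a 2-month-old: social smiles
print(latest_milestone(20))   # a 20-month-old: self-conscious blends
```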

Disorders Impacting Facial Expressions

Facial paralysis represents a significant category of disorders that impair the production of facial expressions through damage to or dysfunction of the facial nerve (cranial nerve VII). Bell's palsy, the most common cause of acute facial paralysis, is characterized by sudden unilateral weakness or paralysis of the facial muscles, often due to inflammation or compression of the nerve. This results in drooping of the mouth, inability to close the eye on the affected side, and flattened facial features, severely limiting the ability to convey expressions such as smiling or frowning. The annual incidence of Bell's palsy is estimated at 20 to 30 cases per 100,000 individuals, with most cases resolving spontaneously within weeks to months, though up to 30% experience persistent weakness or synkinesis (involuntary muscle contractions). Moebius syndrome, in contrast, is a rare congenital disorder involving bilateral facial paralysis caused by underdevelopment or absence of the sixth and seventh cranial nerves, leading to a mask-like facies and a complete inability to produce voluntary facial movements. Affected individuals cannot smile, frown, or purse their lips, which profoundly impacts nonverbal communication and emotional expression from birth. The incidence is approximately 1 in 50,000 live births, and the syndrome often co-occurs with limb anomalies or other cranial nerve deficits.

Deficits in recognizing facial expressions are prominent in several neurodevelopmental and psychological conditions. Prosopagnosia, or face blindness, involves impaired perception of facial identity and emotion, with affected individuals struggling to interpret dynamic expressions because of disruptions in face processing. Developmental prosopagnosia, present from childhood without brain injury, is linked to specific deficits in decoding emotional valence from faces, affecting social interactions.
Alexithymia, a trait involving difficulty identifying and describing one's emotions, is associated with global impairments in labeling static and dynamic facial expressions, particularly negative ones. This deficit persists even when visual processing is intact, suggesting underlying difficulties in emotional conceptualization rather than perceptual acuity. High alexithymia scores correlate with reduced accuracy in emotion recognition tasks across multiple studies.

Autism spectrum disorder (ASD) encompasses both reduced expressivity and recognition challenges, as outlined in DSM-5 diagnostic criteria, which require persistent deficits in social communication, including abnormal use of facial expressions (limited, exaggerated, or atypical). Individuals with ASD often display fewer spontaneous expressions, shorter durations of emotional displays, and more ambiguous or less socially calibrated facial signals, contributing to misunderstandings in interpersonal exchanges. The DSM-5 specifies that these impairments must be evident across contexts and not better explained by intellectual disability or global developmental delay. The prevalence of ASD is approximately 1 in 31 children aged 8 years in the United States, based on 2022 surveillance data released in 2025, with facial expressivity issues contributing to diagnostic severity levels.

Treatments for these disorders target both production and recognition impairments. For hyper-expressive conditions such as hemifacial spasm, which causes involuntary facial contractions mimicking exaggerated expressions, botulinum toxin (Botox) injections provide temporary relief by paralyzing overactive muscles, improving symmetry and reducing distress in up to 90% of cases with repeated applications every 3 to 6 months. For recognition deficits in ASD and related conditions, virtual reality (VR)-based interventions emerged in the 2020s, using immersive scenarios to train emotion identification through repeated exposure to avatar expressions, with reported accuracy improvements of 20 to 30% after 8 to 12 sessions. These therapies leverage repeated, structured practice to enhance recognition performance, as evidenced in controlled trials.
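The epidemiological figures in this section translate into expected case counts with simple arithmetic. The population sizes below are hypothetical inputs for illustration; the rates follow the text:

```python
# Back-of-envelope arithmetic for the incidence figures cited above.
# Population sizes are hypothetical; rates come from the text.

def expected_annual_cases(population: int, rate_per_100k: float) -> float:
    """Expected cases per year given an incidence rate per 100,000."""
    return population * rate_per_100k / 100_000

# Bell's palsy: 20-30 cases per 100,000 per year
low = expected_annual_cases(1_000_000, 20)   # 200.0
high = expected_annual_cases(1_000_000, 30)  # 300.0
print(f"Bell's palsy in a city of 1M: {low:.0f}-{high:.0f} cases/year")

# Moebius syndrome: ~1 in 50,000 live births, i.e. 2 per 100,000
print(expected_annual_cases(50_000, 2))      # 1.0
```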
