Research Identifies Integrated Cortical Network for Emotional and Voluntary Facial Expressions

A new study has identified an integrated cortical network that governs both emotion-driven and voluntary facial expressions in nonhuman primates. The findings challenge previous assumptions that distinct neural pathways control these different types of facial movements, suggesting instead that a shared sensorimotor network orchestrates facial actions with varying neural dynamics across regions.

Study Overview and Methodology

Researchers led by Winrich Freiwald at Rockefeller University’s Laboratory of Neural Systems investigated the neural mechanisms underlying facial expressions. The study, published in Science (one account cites PNAS instead), aimed to determine the extent to which primate facial movements are reflexive or involve cortical pathways.

The research involved two macaque monkeys, whose neuronal activity was measured with functional MRI (fMRI) and electrophysiological recordings. The macaques performed two types of tasks:

  • Voluntary movements, such as chewing food.
  • Emotion-driven movements, such as exhibiting threatening expressions or lipsmacking (a cooperative gesture) in response to images of other monkeys.

The study monitored four specific brain areas: the primary motor cortex, ventral premotor cortex, somatosensory cortex, and cingulate motor cortex.

Integrated Cortical Pathways

Contrary to earlier hypotheses that emotional expressions and voluntary actions were regulated by segregated regions of the frontal lobe, the study observed that all four monitored brain areas contained neurons that fired during chewing, grimacing, and lipsmacking. This suggests that emotion-driven facial expressions engage largely the same cortical pathways as voluntary movements.

Key observations included:

  • Each area contained a mixture of broadly tuned "gesture-general" neurons and narrowly tuned "gesture-specific" neurons (a toy classification sketch follows this list).
  • While each of the three gestures produced distinct patterns of neuronal population activity, the intermixing of signals for both emotion-driven and voluntary actions suggests a more distributed cortical coding of emotionally driven behavior than previously understood.
  • The cortical regions governing facial movement were found to function as a single interconnected sensorimotor network, which adjusts its coordination based on the specific movement being produced.
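
As a toy illustration of the gesture-general versus gesture-specific distinction, the sketch below labels simulated units by a simple selectivity index over the three gestures. The data, the index, and the 0.3/0.7 thresholds are all assumptions for illustration, not the study's actual analysis.

```python
# Hypothetical sketch: labeling units as "gesture-general" vs "gesture-specific"
# from trial-averaged firing rates. Simulated data; illustrative thresholds.
import numpy as np

rng = np.random.default_rng(0)

# Simulated mean firing rates (Hz) for 100 units across three gestures:
# columns are chewing, threat, and lipsmacking.
rates = rng.gamma(shape=2.0, scale=5.0, size=(100, 3))

def selectivity_index(unit_rates):
    """1.0 = fires for one gesture only; 0.0 = fires equally for all three."""
    r = unit_rates / unit_rates.sum()
    # Normalized deviation of the peak response from a uniform (1/3) response.
    return (r.max() - 1 / 3) / (1 - 1 / 3)

si = np.apply_along_axis(selectivity_index, 1, rates)
general = si < 0.3   # broadly tuned: similar rates for all gestures
specific = si > 0.7  # narrowly tuned: dominated by one gesture

print(f"gesture-general units:  {general.sum()}")
print(f"gesture-specific units: {specific.sum()}")
```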

Distinct Neural Dynamics and Hierarchical Coding

The research also revealed that these cortical regions encode facial movement information across different timescales, contributing to a hierarchical coding system (a toy illustration follows the list):

  • Lateral motor and somatosensory cortices: Exhibited dynamic activity during movement onset, consistent with real-time control of movement.
  • Medial cingulate cortex: Demonstrated more stable activity before and throughout the movement.
  • Premotor cortex: Showed moderately stable activity at an intermediate timescale.
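
A minimal sketch of how such timescale differences can be quantified, assuming simulated population activity: the correlation between a region's population activity vectors at increasing time lags decays slowly for a stable code and quickly for a dynamic one. Both the simulation and the lag-correlation measure are illustrative assumptions, not the study's method.

```python
# Hypothetical sketch: a region's code is "stable" if its population activity
# vector stays correlated with itself across time lags, and "dynamic" if that
# correlation decays quickly. All data here are simulated.
import numpy as np

rng = np.random.default_rng(1)

def simulate_population(n_units, n_bins, drift):
    """Rates = fixed pattern + random walk; larger drift = less stable code."""
    base = rng.normal(size=(n_units, 1)).repeat(n_bins, axis=1)
    walk = drift * np.cumsum(rng.normal(size=(n_units, n_bins)), axis=1)
    return base + walk

def lag_correlation(pop, lag):
    """Mean correlation between population activity vectors `lag` bins apart."""
    a, b = pop[:, :-lag], pop[:, lag:]
    return np.mean([np.corrcoef(a[:, t], b[:, t])[0, 1]
                    for t in range(a.shape[1])])

cingulate_like = simulate_population(80, 200, drift=0.02)  # stable code
motor_like = simulate_population(80, 200, drift=0.30)      # dynamic code

for name, pop in (("cingulate-like", cingulate_like),
                  ("motor-like", motor_like)):
    decay = [round(lag_correlation(pop, lag), 2) for lag in (1, 10, 50)]
    print(name, "correlation at lags 1/10/50:", decay)
```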

This hierarchical organization suggests that information may flow from the cingulate cortex to the primary motor and somatosensory cortices, which then relay signals to the facial muscles for real-time adjustments. Neurons active before the movement occurs, primarily concentrated in the cingulate cortex but present across all areas, may represent an animal's internal state or integrate top-down contextual information for both emotional and voluntary gestures.

Implications and Future Directions

The presence of these movement-independent cells, active before any gesture begins, suggests that emotion-driven movements may involve a degree of voluntary control not previously acknowledged. These findings align with prior work by Freiwald’s team, which identified high interconnectivity between areas encoding voluntary and emotion-driven facial movements in macaques, adding to evidence that cortical areas shape emotional behavior.

Michael Platt, a professor of psychology at the University of Pennsylvania, noted that while similar signals may be present across different cortical regions, these regions might perform distinct computations with the same information. Sebastian Korb, a senior lecturer in psychology at the University of Essex, offered a counter-example: individuals with lesions in lateral cortical areas who cannot voluntarily move one side of the face sometimes still smile symmetrically on both sides during spontaneous emotion, which could suggest some functional specificity. Freiwald concluded that the findings indicate an interconnected and mutually influential relationship between emotions and cognition, in which awareness of one's situation shapes both thought and feeling.

Future research aims to simultaneously study facial perception and expression to further enhance the understanding of emotions. Potential applications of this work include improving brain-machine interfaces, particularly for decoding and translating facial signals for communication in patients with brain injuries, and furthering the understanding of how social cues, internal states, and reward mechanisms influence the facial motor system.
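
As a rough sketch of the decoding idea behind such an interface, the example below trains a linear classifier to read out which of the three gestures is intended from simulated population firing rates. The data, the logistic-regression decoder, and all parameters are illustrative assumptions, not drawn from the study.

```python
# Hypothetical sketch of the brain-machine-interface idea: decoding the intended
# facial gesture from population firing rates. Simulated data; the linear
# decoder is an illustrative choice, not the study's method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_trials, n_units = 300, 50
labels = rng.integers(0, 3, size=n_trials)  # 0=chew, 1=threat, 2=lipsmack

# Each gesture evokes a distinct mean population pattern plus trial noise.
patterns = rng.normal(size=(3, n_units))
X = patterns[labels] + 0.8 * rng.normal(size=(n_trials, n_units))

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"decoding accuracy: {decoder.score(X_test, y_test):.2f}")
```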