Abstract
Background
Embodied models of social cognition argue that others' affective states are processed by re-enacting a sensory-specific representation of the same state in the observer. However, neuroimaging studies suggest that a reliable part of the representation shared between self and others is supramodal, relating to dimensions such as unpleasantness or arousal that are common to qualitatively different experiences. Here we investigated whether representations of first-hand pain and disgust influence the subsequent evaluation of facial expressions in a modality-specific fashion, or rather in terms of unpleasantness and arousal.
Methods
Thirty volunteers were exposed to painful thermal and disgusting olfactory stimuli, and were subsequently asked to classify computer-generated faces expressing pain (characterized by high unpleasantness and high arousal), disgust (high unpleasantness, low arousal), surprise (low unpleasantness, high arousal), and hybrid combinations thereof.
Results
Thermal and olfactory events were associated with comparable unpleasantness ratings and heart-rate responses, although painful temperatures evoked a stronger galvanic skin response. Furthermore, the appraisal of facial expressions was biased by the preceding stimulus, with more frequent pain classifications following thermal stimuli and more frequent disgust classifications following olfactory stimuli. Critically, this modulation was cross-modal in nature: each first-hand stimulation influenced facial traits diagnostic of pain and of disgust to a comparable extent, without generalizing to features of surprise.
Conclusion
Overall, these data support the presence of a shared coding between one's own aversive experiences and the appraisal of others' facial responses, which is best described as a supramodal representation of the unpleasantness of the experience.
from Wiley: European Journal of Pain: Table of Contents https://ift.tt/2TKuE4p