Vocalizations such as laughter, cries, moans, or screams constitute a potent source of information about the affective states of others. It is typically conjectured that the higher the intensity of the expressed emotion, the better the classification of affective information. However, attempts to map the relation between affective intensity and inferred meaning are controversial. Using a newly developed stimulus database of carefully validated non-speech expressions ranging across the entire intensity spectrum from low to peak, we show that this intuition is false. In three experiments (N = 90), we demonstrate that intensity in fact plays a paradoxical role. Participants rated and classified the authenticity, intensity, and emotion, as well as the valence and arousal, of a wide range of vocalizations. Listeners are clearly able to infer expressed intensity and arousal; in contrast, and surprisingly, emotion category and valence have a perceptual sweet spot: moderate and strong emotions are clearly categorized, whereas peak emotions are maximally ambiguous. This finding, which converges with related observations from visual experiments, raises interesting theoretical challenges for the emotion communication literature.