Evidence suggests that temporal predictions arising from the motor system can enhance auditory perception. However, in speech perception, direct evidence that perception is modulated by production is lacking. Here we introduce a behavioural protocol that captures such auditory–motor interactions. Participants performed a syllable discrimination task immediately after producing periodic syllable sequences. Two speech rates were explored: a ‘natural’ (individually preferred) rate and a fixed ‘non-natural’ (2 Hz) rate. Using a decoding approach, we show that perceptual performance is modulated by the stimulus phase determined by a participant’s own motor rhythm. Remarkably, for both ‘natural’ and ‘non-natural’ rates, this finding is restricted to a subgroup of the population with quantifiable auditory–motor coupling. The observed pattern is compatible with a neural model assuming a bidirectional interaction of auditory and speech motor cortices. Crucially, the model matches the experimental results only if it incorporates individual differences in the strength of the auditory–motor connection.
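To make the central analysis idea concrete, the sketch below illustrates one way to test whether discrimination accuracy depends on the phase of a target stimulus relative to a participant's own produced syllable rhythm. This is not the authors' decoding approach: it substitutes a simpler cosine fit with a permutation test, and the synthetic data, the variable names (e.g. motor_rate_hz, target_onsets), and all parameter values are assumptions made purely for illustration.

```python
"""Minimal sketch (assumed, not the authors' code): does accuracy vary with
the phase of the target relative to the participant's own motor rhythm?"""
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical single-participant data ---------------------------------
motor_rate_hz = 4.2                          # assumed individually preferred syllable rate
target_onsets = rng.uniform(0.0, 2.0, 300)   # target onset times after production offset (s)

# Phase of each target relative to the (extrapolated) motor rhythm
phase = (2 * np.pi * motor_rate_hz * target_onsets) % (2 * np.pi)

# Synthetic accuracy with a weak phase dependence (for demonstration only)
p_correct = 0.75 + 0.10 * np.cos(phase)
correct = (rng.random(300) < p_correct).astype(float)

# --- Phasic modulation via a cosine (circular) regression -----------------
# Regress accuracy on cos(phase) and sin(phase); a reliably non-zero
# amplitude indicates that perception is modulated by stimulus phase.
X = np.column_stack([np.ones_like(phase), np.cos(phase), np.sin(phase)])
beta, *_ = np.linalg.lstsq(X, correct, rcond=None)
amplitude = np.hypot(beta[1], beta[2])

# Permutation test: shuffle phases to build a null distribution of amplitudes
null = []
for _ in range(2000):
    perm = rng.permutation(phase)
    Xp = np.column_stack([np.ones_like(perm), np.cos(perm), np.sin(perm)])
    bp, *_ = np.linalg.lstsq(Xp, correct, rcond=None)
    null.append(np.hypot(bp[1], bp[2]))
p_value = (np.sum(np.array(null) >= amplitude) + 1) / (len(null) + 1)

print(f"phasic modulation amplitude: {amplitude:.3f}, permutation p = {p_value:.3f}")
```

In such an analysis, a participant showing a significant phase effect at their own production rate would count as having quantifiable auditory–motor coupling, whereas a flat accuracy-by-phase profile would not.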