Every day you hear at least some utterances you’ve never heard before. That you can understand them is partly due to the fact that they are structured according to grammatical rules. Scientists have found that the human brain may use the relative timing of brainwaves to encode and decode the structures in a sentence.
Grammar is a way of structuring information that makes language an efficient means of communication. Knowing the grammatical rules of our language allows us to say pretty much anything we want, including things we have never heard before, by combining words into new sentences. Being able to learn and use grammar is unique to humans. But it also poses a challenge for the science of how the brain processes language: how do our brains, essentially a bunch of cells in a network, represent something as abstract as grammatical rules?
Scientists at the University of Edinburgh and the Max Planck Institute for Psycholinguistics study this question with the help of computer-based models. They constructed an artificial neural network that simulates key features of the brain, such as densely connected populations of neurons that show neural oscillations. Neural oscillations are wave-like patterns of activity that happen at different frequencies, some very fast and some slow. The relative timing of these neural oscillations can help the brain encode grammatical relationships between words in a sentence, as Andrea Martin and Leonidas Doumas report in a paper in PLOS Biology.
Relative timing of brainwaves encodes the structure of a sentence
A key finding of the new study is that these artificial neural networks, when fed example sentences, give off patterns of energy that mimic what the brain does when it processes a sentence (see Figure below). By encoding words in one oscillation and phrases in another, the brain can keep track of words and phrases at the same time. This demonstrates how something as complex as a sentence can be encoded in the neural currency of oscillations. Martin, lead author of the study, says: “This work helps us understand how the brain solves a complex puzzle and why it gives off the activity patterns that it does when processing language.”
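To give a feel for the idea, here is a minimal toy sketch of a timing-based code. It is not the model from the paper: the frequencies, the two-words-per-phrase grouping, and the function names are illustrative assumptions. Each word "fires" on a cycle of a fast rhythm, and all words belonging to the same phrase fall within a single cycle of a slower rhythm, so phrase membership can be read off from timing alone.

```python
# Toy illustration of encoding phrase structure in the relative timing of
# two rhythms. Frequencies and grouping are assumptions for this sketch,
# not values from Martin & Doumas (2017).

WORD_HZ = 4.0    # fast rhythm: one word per 250 ms cycle (assumed)
PHRASE_HZ = 2.0  # slow rhythm: one phrase per 500 ms cycle (assumed)
# With these values, each slow cycle holds WORD_HZ / PHRASE_HZ = 2 words,
# so this sketch assumes two-word phrases.

def encode(phrases):
    """Assign each word a firing time on the fast rhythm, so that words
    of the same phrase land inside the same slow-rhythm cycle."""
    schedule = []
    word_slot = 0
    for phrase_idx, phrase in enumerate(phrases):
        for word in phrase:
            t = word_slot / WORD_HZ  # onset time of this word's fast cycle
            schedule.append((word, t, phrase_idx))
            word_slot += 1
    return schedule

def decode_phrase(t):
    """Recover which phrase a firing time belongs to, using only the
    slow rhythm: count how many slow cycles have elapsed."""
    return int(t * PHRASE_HZ)

# A sentence split into two hypothetical two-word phrases.
schedule = encode([["the", "dog"], ["chased", "cats"]])
for word, t, phrase_idx in schedule:
    # Timing alone recovers the phrase each word belongs to.
    assert decode_phrase(t) == phrase_idx
```

The point of the sketch is only that grouping information need not be stored in any extra symbol: because the two rhythms are locked in a fixed ratio, a word's position in time relative to the slow cycle is enough to recover which phrase it belongs to.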
In this exciting age of the brain, where we know more about our brains than ever before, being able to link basic experiences like speaking and understanding language directly to brain function is especially important. Linking our brains to our behaviors holds the key to understanding not only what it means to be human, but also to understanding how the (arguably) most complex computing device in the universe, the human brain, gives rise to our daily experiences. Such knowledge may also lead to biologically inspired advances in human-like artificial intelligence and computation.
Martin A.E. & Doumas L.A.A. (2017). A mechanism for the cortical computation of hierarchical linguistic structure. PLOS Biology 15(3): e2000663.