A paper published recently in Nature Communications details how a team led by Dr. Ben Wilson and Professor Chris Petkov used a brain imaging technique to identify the evolutionary origins of language in the brain. Their findings help us understand how we learn to speak, and could open the way to new treatments for people who lose this ability through aphasia following a stroke or dementia.
By scanning the brains of macaque monkeys, the researchers identified an area in the front of the brain that, in both humans and macaques, recognizes a sequence of sounds as speech and analyzes whether the sounds are in a legal order or in an unexpected, illegal order.
“Young children learn the rules of language as they develop, even before they are able to produce language. So, we used a ‘made up’ language first developed to study infants, which our lab has shown the monkeys can also learn. We then determined how the human and monkey brain evaluates the sequences of sounds from this made up language,” said Professor Petkov.
Human and monkey subjects were first played an example sequence from the made-up language so that they could hear the correct ordering of its sounds. They were then played new sequences, some of which were in an incorrect order, while the team scanned their brains using fMRI. In both species, correctly ordered sounds evoked a neuronal response in the same region of the brain, the ventral frontal and opercular cortex.
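The made-up language is an artificial grammar of the kind used in infant studies: a set of rules specifying which sounds may follow which. As a rough sketch of the idea only (the transition table below is an invented example for illustration, not the grammar used in the study), legal sequences can be generated by walking the rules from start to finish, and a sequence is illegal whenever it breaks one of the allowed transitions:

```python
import random

# Hypothetical finite-state grammar, purely for illustration; these
# transitions are NOT the ones used in the study.
GRAMMAR = {
    "START": ["A"],
    "A": ["C", "D"],
    "C": ["F", "G"],
    "D": ["C", "F"],
    "F": ["G", "END"],
    "G": ["END"],
}

def generate_legal_sequence():
    """Walk the grammar from START until END, collecting the sounds visited."""
    sequence, state = [], "START"
    while True:
        state = random.choice(GRAMMAR[state])
        if state == "END":
            return sequence
        sequence.append(state)

def is_legal(sequence):
    """Return True if every transition in the sequence is allowed by the grammar."""
    state = "START"
    for sound in sequence:
        if sound not in GRAMMAR.get(state, []):
            return False
        state = sound
    return "END" in GRAMMAR.get(state, [])

legal = generate_legal_sequence()
illegal = list(reversed(legal))
print(legal, is_legal(legal))      # e.g. ['A', 'D', 'C', 'F'] True
print(illegal, is_legal(illegal))  # False: the reversed order breaks the transitions
```

In the experiment, the interesting comparison is the brain's response to sequences that respect the learned ordering versus those that violate it; the code above only illustrates how such stimuli can be defined.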
The findings suggest that the function of this region is shared between humans and macaques, pointing to a common evolutionary origin. The region appears to monitor the orderliness, or organization, of sounds and words, an important cognitive function that lies at the core of humans' more complex language abilities. The findings are the first scientific evidence that other animals share at least some of the functions of this area, which in humans include understanding language.
“Identifying this similarity between the monkey and human brain is also key to understanding the brain regions that support language but are not unique to us and can be studied in animal models using state-of-the-art neuroscientific technologies,” Professor Petkov explains.
“This will help us answer questions on how we learn language and on what goes wrong when we lose language, for example after a brain injury, stroke or dementia.”
Building on these findings, the Newcastle University team and their neurology collaborators at the Universities of Cambridge and Reading have begun a project to study the function of this brain region and its role in language impairment in stroke patients with aphasia, which could lead to better diagnosis and prognosis of language impairment.