Everyone likes a good story. Now, scientists can finally see why.
By monitoring the neural activity of human subjects as they listened to hour after hour of stories, researchers have captured – in vivid detail and in real time – the way the meaning of language is organized across the entire brain.
The result could prove useful for understanding how people can bounce back from strokes or other brain-related injuries that can affect language. But more broadly, it shows just how thoroughly our capacity to understand speech engages our mental processes in virtually every region of the brain.
“Nobody has actually done this before,” said Alex Huth, a postdoctoral researcher at the University of California, Berkeley, and lead author of what he and his colleagues describe as a “semantic map” of the human cerebral cortex. “We’re showing where all these different concepts are represented.”
The effort, published Wednesday in the journal Nature, might be compared to getting a first look at Earth from space. While it’s not a surprise to see continents and oceans, the richness and variety of the vistas are unexpectedly mesmerizing.
Previous studies have already shown that certain brain areas light up in response to particular words or ideas. The new map goes much further by showing the fine-grained mosaic of activity within these regions and also by showing that many words are associated with multiple regions.
To create the map, the researchers used reams of data drawn from seven individuals who each spent about six hours listening to real-life stories from The Moth Radio Hour, a public radio program. Their brain responses to the stories were monitored using functional magnetic resonance imaging (fMRI), a technique that subdivides the brain into many thousands of tiny compartments, or “voxels,” and records blood oxygen levels in each one. The levels can be used as a proxy for how hard the brain cells located within each voxel are working.
Dr. Huth noted that the stories were essential for the semantic mapping experiment because simply testing subjects with lists of individual words or phrases would not elicit nearly the same level of brain activity.
“If you want to get strong responses to language, you need to have language that says something the person cares about hearing,” he said.
All told, the experiment involved the subjects listening to more than 10,000 individual words that researchers classified under 12 broad categories such as “social,” “locational,” “professional” and “violent.”
Using the categories, they were then able to identify at least 100 separate brain regions where activity related to those categories appeared to cluster. These regions were remarkably consistent between individuals, though the researchers caution that individuals living in very different cultures and settings may well show a different semantic organization.
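The modelling approach the article describes – relating semantic category features of the words subjects heard to the recorded activity of each voxel – can be illustrated with a small sketch. This is not the authors' actual pipeline; the array sizes, noise level, and regularization below are all illustrative assumptions, and synthetic data stands in for real fMRI recordings.

```python
# Minimal sketch of voxel-wise encoding with synthetic data (illustrative only):
# each timepoint gets a vector of semantic category features, and a separate
# linear model predicts each voxel's response from those features.
import numpy as np

rng = np.random.default_rng(0)

n_timepoints, n_categories, n_voxels = 500, 12, 1000  # assumed sizes
X = rng.standard_normal((n_timepoints, n_categories))   # category features over time
true_w = rng.standard_normal((n_categories, n_voxels))  # hypothetical voxel weights
Y = X @ true_w + 0.1 * rng.standard_normal((n_timepoints, n_voxels))  # voxel signals

# Ridge regression fit, one weight vector per voxel (closed form)
alpha = 1.0
w_hat = np.linalg.solve(X.T @ X + alpha * np.eye(n_categories), X.T @ Y)

# Each voxel's fitted weights indicate which semantic categories drive it;
# grouping voxels by their strongest category yields a coarse semantic map.
best_category = w_hat.argmax(axis=0)
print(w_hat.shape)  # (12, 1000)
```

In this toy setup, clustering voxels by their fitted weight vectors would recover category-selective groups, loosely analogous to the semantic regions the study reports.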
The pattern is in keeping with an emerging paradigm in cognitive research that no longer sees the brain as being built out of discrete modules, but more as a vast interconnected network where each area plays multiple roles. “It’s the constellation of activity across really large swaths of brain that’s interesting,” said Marc Joanisse, a professor with the University of Western Ontario’s Brain and Mind Institute who specializes in the neuroscience of language and was not involved in the study.
The history of language and brain research dates back to the 19th century, when European physicians Pierre Paul Broca and Carl Wernicke separately noticed that patients with damage to certain parts of the brain had difficulty comprehending or producing speech. But while these areas are now regarded as important to the mechanics of language, the Berkeley map shows that the way language connects to cognition is far more distributed around the entire cerebral cortex.
Dr. Huth and his colleagues stress that they were not setting out to test a particular hypothesis, but rather were simply aiming to create the most comprehensive view of brain activity in response to human speech to date.
The Berkeley team also took into account the fact that some brain regions can be triggered by an emotional response that is more generic than a response linked to a specific set of words, and found little change in their result.
“We don’t think that emotions alone are a big driver of what’s happening here,” Dr. Huth said.
He added that he and his colleagues are in the midst of comparing their semantic map generated by stories with an earlier map the team produced by scanning the brains of subjects while they were exposed to different kinds of images. Another study with bilingual speakers is also under way to look at how the brain preserves meaning and context across languages.