Last week, Canadian author Margaret Atwood thrilled her 285,000-plus Twitter followers by defending their kind as “dedicated readers” who are boldly exploring new frontiers in literacy. Calling the Internet in general “a great literacy driver,” she defended even the most minimal form of screen-based reading as an unalloyed good – “because reading is in fact extremely interactive from a neurological point of view,” she said. “Your brain lights up a lot.”
But many of those who have studied what lights up when people read have come to sharply different conclusions. Basing their concern in part on graphic physical evidence of how brain cells adapt to meet new demands – and wither in the absence of such stimuli – a growing chorus of neuroscientists worry that the “expert reading brain” will soon be as obsolete as the paper and ink it once fed on. And the thing that replaces it (“the Twitter brain”) will be a completely different organ.
In Britain, University of Oxford neuroscientist and former Royal Institution director Susan Greenfield offered a starkly different vision – one that could have come straight out of an Atwoodian dystopia – when she warned that Internet-driven “mind change” was comparable to climate change as a threat to the species, “skewing the brain” to operate in an infantilized mode and creating “a world in which we are all required to become autistic.”
Less dire but no less pointed warnings have come from Maryanne Wolf, director of the Center for Reading and Language Research at Tufts University in Massachusetts. “I do think something is going to be lost with the Twitter brain,” she said in an interview.
Wolf first warned of the Internet’s threat to literacy in her book Proust and the Squid: The Story and Science of the Reading Brain. “I was thunderstruck by what I saw and what I realized from a neuroscience viewpoint,” she said. “… The medium is really important in terms of its effects on a reading circuit [in the brain].”
Both scientists back their assertions with detailed images showing physical differences between the brains of expert and non-expert readers: the affected cells in the readers’ brains are much more thickly branched and intricately interconnected than the same cells in non-reading brains. They emphasize what Wolf calls “the tabula rasa nature of the reading brain” – the fact that there is no genetic map for reading, and that brain cells must physically adapt to build the new circuits that enable the mind to read, especially those that allow it to gain knowledge through deep concentration in the pages of a non-networked book.
The hyperlinked, text-messaging screen shapes the mind quite differently from the book, according to Wolf. “It pulls attention with such rapidity it doesn’t allow the kind of deep, focused attention that reading a book 10 years ago invited,” she says. “It invites constant change of attention, it invites multitasking. It invites, in other words, a kind of triage of attention.”
Such a skill is certainly necessary in the 21st century, she adds. “But it does not have a place in the deepest kind of immersive thought.”
More research needs to be done to determine the ultimate effects of such neurological changes, according to Wolf. Despite declining book sales and vastly increased competition for the attention of young people, one influential U.S. study found a marked increase in reading.
The link between reading skill and success in life is so strong that some U.S. states are said to use primary-school reading scores to forecast future prison populations. Decades of research in cognitive science has also shown that one of the key alleged advantages of screen-based learning – the ability to multitask – is “techno-hype,” according to reading expert Keith Stanovich, Canada research chair of applied cognitive science at the University of Toronto.
“Distractions such as texting and simultaneous things to do on the screen will ensure that no deep reading takes place,” Stanovich said in an e-mail. “That’s why book reading is best for what Wolf calls deep reading. The idea that children looking at screens are taking in, at a deep level, information from many different streams is a falsehood.”
So what about computer games, which have been shown to increase IQ? According to Greenfield, they can boost IQ, but not knowledge. “Information processing is not knowledge,” she said during a presentation at Britain’s Hay Festival last summer.
Greenfield has drawn criticism from fellow scientists for linking the rise of autism to the spread of the Internet. But at Hay, she speculated that a recent threefold increase in prescriptions of the drug Ritalin to treat hyperactivity could “just possibly be due to the fact that you’re placing a young brain in an environment that mandates a short attention span. Because that’s what screens do.”
The question of what can be done to combat these effects is equally fraught. “No cry for the vigilance of the species is going to do a single bit of good,” Wolf acknowledges. Going back to print is not a solution. “The world is digital already.”
Her solution is to join the famous Media Lab at the Massachusetts Institute of Technology to help design a tablet computer to teach children reading skills. “If I can’t ask society to stop and examine before it lurches, then I must enter the mouth of the medium to make sure the medium itself is used to correct its flaws,” she says.
In the meantime, she worries, huge monuments of human culture threaten to disappear from consciousness. The “demise” of 19th- and early 20th-century literature is continuing, according to Wolf, who says she has been overwhelmed by mail from educators confirming her fears. “They’re all talking about how their students don’t have the patience any more,” she says.
So goodbye George Eliot, Henry James, et al.
“Syntax is a reflection of the convolution of thought,” says Wolf, who studied literature before turning to linguistics and studying under Noam Chomsky. “As we become too impatient to read complicated syntax, I wonder out loud about the capacity for handling the complexity of issues that are out there in life, with all their semicolons.”