The truth is, probably, that the brain is simply not adaptable enough for such a radical change. Yes, the brain changes as a consequence of experience, but there are likely limits to this change, a point made by both Steve Pinker and Roger Schank when commenting on this issue. If our ability to deploy attention or to comprehend language processes were to undergo substantial change, the consequences would cascade through the entire cognitive system, and so the brain is probably too conservative for large-scale change.
As someone who believes both in human nature and in timeless standards of logic and evidence, I’m skeptical of the common claim that the Internet is changing the way we think. Electronic media aren’t going to revamp the brain’s mechanisms of information processing, nor will they supersede modus ponens or Bayes’ theorem. Claims that the Internet is changing human thought are propelled by a number of forces: the pressure on pundits to announce that this or that “changes everything”; a superficial conception of what “thinking” is that conflates content with process; the neophobic mindset that “if young people do something that I don’t do, the culture is declining.” But I don’t think the claims stand up to scrutiny.

Has a generation of texters, surfers, and twitterers evolved the enviable ability to process multiple streams of novel information in parallel? Most cognitive psychologists doubt it, and recent studies by Clifford Nass confirm their skepticism. So-called multitaskers are like Woody Allen after he took a speed-reading course and devoured War and Peace in an evening. His summary: “It was about some Russians.”

Also widely rumored are the students who cannot write a paper without instant-message abbreviations, emoticons, and dubious Web citations. But students indulge in such laziness to the extent that their teachers let them get away with it. I have never seen a paper of this kind, and a survey of university student papers by Andrea Lunsford shows they are mostly figments of the pundits’ imaginations….

To be sure, many aspects of the life of the mind have been affected by the Internet. Our physical folders, mailboxes, bookshelves, spreadsheets, documents, media players, and so on have been replaced by software equivalents, which has altered our time budgets in countless ways. But to call it an alteration of “how we think” is, I think, an exaggeration.
Rosenwald quotes Maryanne Wolf, a Tufts University cognitive neuroscientist and the author of “Proust and the Squid: The Story and Science of the Reading Brain,” as saying that students can no longer read long, dense novels.
“They cannot read ‘Middlemarch.’ They cannot read William James or Henry James,” Wolf said. “I can’t tell you how many people have written to me about this phenomenon. The students no longer will or are perhaps incapable of dealing with the convoluted syntax and construction of George Eliot and Henry James.”
… teachers aver that students can no longer read long novels. Well, if we’re swapping stories, I — and most of my classmates — had a hard time with Faulkner and Joyce back in the early ’80s, when I was an English major.
So what does Willingham think is going on?
A more plausible possibility is that we’re not less capable of reading complex prose, but less willing to put in the work. Our criterion for concluding, “this is boring, this is not paying off,” has been lowered because the Web makes it so easy to find something else to read, watch, or listen to. (I explore this possibility in some detail in my upcoming book, Raising Kids Who Read.) If I’m right, there’s good news and bad news. The good news is that our brains are not being deep-fried by the Web; we can still read deeply and think carefully. The bad news is that we don’t want to.