'The Shallows': Is the Net Fostering Stupidity?


The Shallows: What the Internet Is Doing to Our Brains
By Nicholas Carr
Norton; 276 pp; $26.95

Have you ever worried about your annoying need to go to Google (GOOG) because you couldn't remember something? Have you wondered about your constant desire to check your e-mail, Twitter account, or favorite blog rather than read a great book or enjoy a beautiful day?

If you haven't yet, you will after reading Nicholas Carr's The Shallows: What the Internet Is Doing to Our Brains. The book, an expansion of his 2008 article in The Atlantic, makes a compelling case that such fears are justified. Our constant inundation with electronic stimuli, he argues, is actually changing the brain's wiring. As we choose among all those enticing Web links, process blinking online ads, or get our Facebook fix, we are also sapping our neurological ability to remember facts or pay attention long enough to fully digest what we read. Those who didn't experience life before Google—or have already forgotten it—may even have a harder time generating the same empathy or interest in their fellow man.

If that sounds like an apocalyptic anti-technology rant, give Carr a chance. A prolific blogger, tech pundit, and author, he cites enough academic research in The Shallows to give anyone pause about society's full embrace of the Internet as an unadulterated force for progress. One study he refers to shows that people watching a CNN news spot retained far more information without the headlines scrolling by at the bottom of the screen. Another shows that the more links there are in an article, the lower the reader's comprehension. A third indicates that our brains automatically overvalue information simply because it's new. Carr quotes neuroscientist Michael Merzenich, who says we are "training our brains to pay attention to the crap." Perhaps scariest, the Brain & Creativity Institute at the University of Southern California found that while the brain's response to physical pain shows up immediately on neurological scans, people must pay attention for a longer time before their brain shows telltale signs of caring about someone else's pain. Carr's takeaway: "The more distracted we become, the less able we are to experience the subtlest, most distinctly human forms of empathy, compassion, and emotion."

Carr lays out, in engaging, accessible prose, the science that may explain these results. One key is the brain's shortage of so-called working memory, the mechanism that sifts through the avalanche of real-time information that swamps our senses and selects the important bits for incorporation into our long-term memories and insights. It turns out there's only room for two to four items at a time in this neural way station—not nearly enough to keep up with a website packed with links, videos, and RSS feeds. While the mind of the book reader considers what's important at its own pace, the Netizen's brain has to choose much more quickly and haphazardly. As a result, our ability to make the most of the input is diminished, and we become "mindless" consumers of data. This may also explain why sometimes it becomes harder to concentrate the longer you spend browsing the Web.

Unsurprisingly, many of the Internet companies that we have come to live by don't fare well under Carr's gaze. While Twitter is a powerful tool for good in the hands of protesters in despotic lands, he writes that its very motto—"Discover what's happening right now"—might as well be an advertisement for a neurological heroin that trains your brain to be even more distracted. And while Google's geeky founders may truly believe in their stated objective "to organize the world's information and make it universally accessible and useful," Carr argues that Google is, "quite literally, in the business of distraction." After all, the more links you click on, the more money the company makes.

While Carr believes the Internet is a revolutionary tool for finding information, he also suggests that it may be a dangerously powerful impetus to groupthink. As evidence, he points to a study by the National Institute of Neurological Disorders & Stroke suggesting that multitasking makes people "more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought." And a University of Chicago study showed that academic papers began citing fewer sources, not more, after publications began going online.

Taken to its extreme, Carr's arguments suggest the Internet Age is less likely than previous eras to produce Einsteins, Edisons, and Tolstoys. Such extraordinary people were not forever distracted from their work by 140-character bursts or incessant YouTube videos. Nor were they tempted to throw their semi-finished work out on the Web, safe in the knowledge that they could easily update it later. Indeed, Carr argues, they owe their mastery, in part, to the difficulty of achieving it. Absorbing entire hard-to-find texts—rather than forever Googling random facts—may have been a key to their development.

Even though the book is only now hitting shelves, many Internet devotees will undoubtedly take its thesis as pure quackery. Presented with Carr's arguments, Theodore Gray, co-founder of software maker Wolfram Research, told me: "It's very easy to look back and point to Voltaire and Einstein and great literature and figure we're all just ignorant fools compared to the past. There are a lot of people who think deeply. Thanks to the Internet, they are able to think more deeply about more things." Ray Kurzweil, an author, entrepreneur, and futurist, also thinks Carr's argument is bunk. "We have many more people engaged in thinking and writing about issues than ever before. There are 200 million blogs in China alone—despite the censorship."

These critics certainly have a point. The best of us will benefit hugely from the Internet. As with any form of new technology, how you use it dictates its usefulness. Regardless, Carr seems to understand that his arguments will not slow down the Netification of society. Nowhere in the book does he bother to offer any actual prescriptions for the problem he sees.

Carr, however, fears the Internet will actually cause the brain to take its first step backward in centuries. Our cave-dwelling ancestors were consumed with immediate concerns—run from the lion, kill the mastodon, get out of the rain. Then various media provided an abstract way of thinking about the world. The map helped us explore other lands, establish trading routes, and draw up battle plans. The clock and calendar raised our productivity by enabling us to organize our time. Then came writing. Over time, especially after Gutenberg, the book turbocharged our ability to think conceptually and deeply about the world around us.

Americans now spend 8.5 hours a day frenetically interacting with their PCs, TVs, or, increasingly, the smartphones that follow them everywhere. In the process, writes Carr, we are reverting to our roots as data processors. "What we're experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization: We are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest."

Whether Carr is right or not isn't really the point. Other than obvious problems such as child porn and online fraud, there's been very little hesitation or contemplation about the side effects of the Net as we race to take advantage of its bounties. At the very least, Carr will have done an important service by making people think just a bit differently the next time they find themselves Twittering their hours away. It may be more than a waste of time. It may also waste our brains.

[Statistics sidebar: number of hours per day Americans spend interacting with a PC, TV, or smartphone; number of times per hour American office workers check their e-mail; average number of monthly texts sent and received by American teenagers, fourth quarter 2008. Data: Ball State University; The Shallows; Nielsen]
