Remember the year 2000, when all appeared to be smooth sailing in the global economy? It was a time of confident predictions of an epochal economic and political renaissance powered by information technology. Jack Welch—then the all-seeing chief executive officer of General Electric—pronounced the Internet “the single most important event in the U.S. economy since the Industrial Revolution.” The Group of Eight highly industrialized nations met in Okinawa in 2000 and declared, “IT is fast becoming a vital engine of growth for the world economy. … Enormous opportunities are there to be seized by us all.” In a 2000 report, the president’s Council of Economic Advisers said, “Many economists now posit that we are entering a new, digital economy that could inaugurate an unprecedented period of sustainable, rapid growth.”
It hasn’t quite worked out that way. The Internet has had a dramatic impact on people’s lives and how they spend their time. It sparks uprisings, makes shopping easier, helps people find their soul mates, and enables governments to collect troves of useful data on potential terrorists—and, apparently, on their own citizens. What the last decade demonstrates, however, is that the information revolution hasn’t generated economic prosperity. It’s tempting to believe an innovation can unlock the secret to high growth. But that way of thinking is almost certainly wrong.
A lot of studies at the turn of the century supported the high-growth scenario—but they often relied on shoddy data and dubious assumptions. In 1999 a Federal Reserve Bank of Cleveland report asserted that “the fraction of a country’s population that has access to the Internet is, at least, correlated with factors that help to explain average growth performance.” The problem is, this conclusion was reached by demonstrating a positive relationship between the number of Internet users in a country in 1999 and the country’s gross domestic product growth between 1974 and 1992—growth that occurred before most of those users had ever gone online. Whatever that correlation captures, it can’t be Internet access driving growth.
Economists at the World Bank recently tried to repeat the trick: In a study widely cited by broadband boosters, the authors estimated that a 10 percent increase in broadband penetration in a country was associated with a 1.4 percentage point increase in its growth rate. This was based on growth rates and broadband penetration from 1980 to 2006. But since most broadband deployment occurred well after the turn of the millennium, the most likely explanation is that countries that grew faster from 1980 to 2006 were the ones that could afford a more rapid rollout of broadband—not the other way around.
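The reverse-causality worry can be made concrete with a toy simulation. Suppose—as a deliberate fiction, not a claim about the World Bank’s actual data—that each country’s growth rate is determined by other factors entirely, and that faster-growing countries simply roll out more broadband afterward. Broadband has zero causal effect on growth by construction, yet a naive correlation between the two still comes out strongly positive. All the numbers below are made up for illustration:

```python
import random

random.seed(0)

# Hypothetical countries: growth is drawn independently of broadband,
# and broadband penetration is then driven BY prior growth (plus noise).
# By construction, broadband causes none of the growth.
n = 200
growth = [random.gauss(2.5, 1.0) for _ in range(n)]                  # % per year
broadband = [max(0.0, 5 * g + random.gauss(0, 3)) for g in growth]   # penetration, %

def corr(xs, ys):
    """Sample (Pearson) correlation coefficient."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Strongly positive, even though the causal arrow runs the other way.
print(round(corr(broadband, growth), 2))
```

A cross-country regression run on data like this would happily report that broadband “predicts” growth; only the timing of deployment gives the game away.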
Innovation in information technology has hardly dried up since 2000. There are now 130 million smartphones in the U.S., each with about the same computing power as a 2005 desktop. Meanwhile, according to the U.S. Department of Commerce, e-commerce as a percentage of total retail sales has continued to climb—online sales were more than 6 percent of the total by the fourth quarter of 2012, up from less than 2 percent in 2003.
Despite this continuing IT innovation, we’ve seen few signs of “an unprecedented period of sustainable, rapid growth.” U.S. GDP expansion in the 1990s was a little faster than in the 1980s—it climbed from an annual average of 3 percent to 3.2 percent. But GDP growth collapsed to 1.7 percent from 2000 to 2009. Northwestern University economist Robert Gordon notes that growth in U.S. labor productivity spiked briefly as well—it was 1.38 percent from 1972 to 1996 and 2.46 percent from 1996 to 2004—but fell to 1.33 percent from 2004 to 2012.
Part of the labor productivity spike around the turn of the century resulted from the rapidly increasing efficiency of IT production: You get a lot more computer for the same cost nowadays. Another part stemmed from considerable investments in computers and networks across the economy—what economists call capital deepening. But even during the boom years it was nearly impossible to see the impact of IT on “total factor productivity”—or the amount of output we were getting for a given input of capital and labor combined.
Within the U.S., investment in the uses of the Internet for business applications led to wage and employment growth in only 6 percent of counties—those that already had high incomes, large populations, high skills, and concentrated IT use before 1995—according to a recent analysis by Chris Forman and colleagues in the American Economic Review. Investments in computers and software did yield a return for most companies, but it wasn’t anything special. And although the Internet has made it easier for people to look for jobs, it hasn’t made much of a difference in reducing unemployment. Betsey Stevenson, the newest member of President Obama’s Council of Economic Advisers, has found that the use of the Internet in job searches may be a factor behind employed people switching jobs more often, but she sees little evidence that the Web helps those out of work find jobs faster.
So what happened to the promised Internet miracle? To understand why universal Web use doesn’t automatically lead to growth, think about television in the 1970s. It was broadcast to the home for free; all we had to pay for was the set and the electricity to run it. With only a small expenditure, we spent hours a day watching TV. Today, 209 million Americans spend an average of 29 hours a month online, according to Nielsen; the 145 million U.S. Facebook visitors spent an average of six hours last January on that site alone. And each month, YouTube users spend 6 billion hours watching videos—more than 600,000 times as long as it took Michelangelo to paint the Sistine Chapel. We pay for the computer and the Internet connection, but given all the hours we spend online, surfing the Web is a cheap form of entertainment.
One reason we may not have seen a huge impact of the Internet on productivity is that once we find a job, we spend quite a lot of time surfing at the office. Some of that time is used to look for a different job, apparently, but 90 percent of workers with a PC also say they visit recreational sites. Almost the same number say they send personal e-mails, and more than half report shopping. The reality may be worse: Tracking software suggests that 70 percent of employees visit retail sites, and more than one-third check out X-rated pages. Even if the Internet allows us to do more work in less time, we’re using those extra hours to check out pictures of Kate Upton modeling swimwear or cats playing the piano rather than to produce more widgets for the boss.
Robert Gordon offers another possible explanation for the limited productivity impact of IT: The big breakthroughs occurred long ago. He notes that telephone operators disappeared in the 1960s as the first robots were arriving in factories. Reservation systems, electronic calculators, and bar code scanners spread in the 1970s and 1980s. Business-to-business invoicing through electronic data interchange has been around since 1965—when Holland America Line started sending shipping manifests through telex messages that were automatically converted into computer data. Online shopping may be a comparatively recent phenomenon, but it isn’t a very important part of the overall productivity story.
The Internet has unquestionably transformed sectors of the economy. Some industries—not least print media, booksellers, and broadcast TV—will continue to see dramatic upheaval. But technology’s biggest impact has been to deliver a form of entertainment more addictive than watching reruns of Friends or talking to real friends in real life. If we’ve learned anything over the past 10 years, it’s that there’s no simple Web-based solution to an economy in the productivity doldrums.
That may be why Jack Welch has moved on to natural gas. Last year he announced that “the gas that we have found is in the first inning—it’s like the Internet in 1990. This is the first inning of a great American century.” Let’s hope he’s right this time.