The EU Doesn't Like Its Google Search Results
Google Inc., the search-engine colossus, indexes some 60 trillion Web pages to answer the hundreds of millions of queries it receives each day. Now, according to the European Union's highest court, users there have the right to ask Google to forget much of this data -- specifically, any bits about them that they would prefer the world not remember.
It's a provocative approach to a legitimate problem, and it certainly reflects differing European and American approaches to privacy rights. Still, the best response to this decision may be: Good luck with that, EU.
The ruling involves a Spanish man whose house had been put up for auction in 1998 to satisfy unpaid debts. The man asserted that the issue had been resolved and that Google's search engine should no longer bring up links to newspaper items mentioning the auction whenever someone looked up his name.
The European Court of Justice agreed. Under current data-protection laws, it said, Europeans already had a right to be forgotten, and consumers could demand that search engines remove links to information that is "inadequate, irrelevant or no longer relevant," except in cases of significant public interest. Notably, the newspaper in question won't have to take anything down: The court argued that in collating information from disparate sources, search engines are assembling what amounts to a personal profile that differs in kind from a news article.
So now more people will assert similar rights. Companies will object. And the EU will move ahead -- messily, spasmodically and perhaps insensibly -- with its grand experiment in protecting privacy.
There's no shortage of legitimate worries about this approach. It threatens free speech. Airbrushing history, even with the best of intentions, is almost always a very bad idea. It will impose an arbitrary and costly burden on search-engine companies. And such a sweeping new right is sure to have unintended consequences -- for starters, by potentially depriving the public of useful information.
Moreover, the administrative complexities -- where exactly does the ruling apply? To whom? How will disputes be arbitrated? -- are deeply confounding. The costs to companies and governments of making such a policy work are incalculable. The consequences of censoring search results could quickly become perverse. And so on.
All that said, this is a clarifying debate. More and more companies are growing rich from collecting and selling consumer data, even as they ignore the human consequences. A generation that grew up using the Internet is discovering that youthful indiscretions may live on much longer than expected -- and prospective employers care more than expected. And plenty of ordinary folks who have gone through tough times or public embarrassments simply want to move on, without the worst episodes of their lives remaining the top hit in a Google search. In other words, the European public is increasingly alarmed.
Divining a "right to be forgotten" isn't the smartest way to respond to such alarm. But make no mistake: The alarm is real. Maybe the EU's approach will yield lessons for the rest of the world. Its logical deficiencies will be aired and debated. So will its mistakes and reverses along the way. Someday, in fact, this approach may stand as an exemplar of what not to do -- and that's not such a bad thing for the world to have.
To contact the editor on this story:
David Shipley at firstname.lastname@example.org