Internet Companies Shouldn't Be Censors
PayPal blocked an account set up by Russian activists seeking to collect funds for the publication of a report on Russia's involvement in eastern Ukraine. The U.S. online payment company explained that it didn't "provide the option of using its system to receive donations to political parties or causes in Russia." It's just the latest example of an Internet giant cast in a role for which it is ill-equipped.
Companies such as Google, Twitter, Facebook and even PayPal are ruling on matters of free speech. The decisions are made by business executives, who are essentially performing the function of editors in traditional media. But their organizations aren't professional media: They are conduits for other people's content, products and money, and their ability to make value judgments should be as limited as a water company's power to shut off service on a whim.
The Internet companies' internal policies are often tested by the way they are used outside the U.S. When Facebook recently collected questions for a regular Q&A session with founder Mark Zuckerberg, a question from an Israel-based Ukrainian collected 45,000 "likes" -- the most ever for such a query, according to Zuckerberg himself. The user asked why many popular Ukrainian bloggers' posts criticizing Russia and Russians had been taken down: Were Facebook posts from the former Soviet Union moderated in Russia? Zuckerberg explained that the posts contravened Facebook's policy on hate speech and that the company didn't have a Russian office. "We try to have folks who speak that language review it," he said. "We have a European headquarters in Dublin where we have folks who speak a lot of different languages from around the world and who look at different content and that's what we did here."
He added that he'd personally reviewed some cases and stood by the decisions. There was just one mistake, Zuckerberg said: in one case, people whose posts had been taken down for hate speech received messages saying the reason was nudity. "A software glitch," the Facebook chief said.
There's no reason, however, for Ukrainians to trust Facebook employees -- or buggy software -- to make the right judgments: What if the Russian speakers at the Facebook office are all fans of President Vladimir Putin? And why is Zuckerberg himself qualified to judge posts in Russian and Ukrainian -- or does he trust Facebook's terrible automated translation?
Facebook account holders are the commodity the company sells to advertisers. In exchange, people expect to be able to post what they think. If they hate somebody else's posts and comments, they can block that person. By also taking on the function of judge and jury, Facebook opens its policies to abuse by, say, the Kremlin's paid trolls, who are known to file takedown requests that often succeed.
Twitter, too, has taken flak for selective censorship after it started actively removing content posted by Islamic State sympathizers last year. James Ball wrote in the Guardian on that occasion:
Whether you are an ardent First Amendment advocate or a passionate believer that networks must do more to police their backyards, the worst of all possible worlds for the flow of information is one in which we shift from the rule of democratic law to one governed by the arbitrary, inconsistent and perhaps kneejerk rulings of a tiny group of large companies.
The problem, however, is that sometimes laws require companies to take on the role of judges. Google's year-long European ordeal with the "right to be forgotten" is a case in point. Peter Fleischer, Google's global privacy counsel, was quoted in the Wall Street Journal this week saying the company was required "to play a role that we never asked to play—and don’t want to play."
Requests for the removal of sensitive information are first put to "a large team of lawyers, paralegals and engineers who decide the easy cases," the Journal reported. Then the hard ones go to a panel of Google managers, who hold discussions and sometimes even vote. These not-so-randomly-selected citizens decide, for example, whether to remove from Google's index sites that report an old sex-crime conviction. Such decisions can affect people's lives in numerous ways, and they are made outside of any established legal procedure by people who have no meaningful mandate to make them.
These judgments are more dangerous, and more final, than editorial ones. Information thrown out by the editor of one news organization can find a home and an audience at another. But the big social networks and search engines are near-monopolies precisely because they are Noah's Arks that let everyone in. If you're removed from them, you don't exist.
Now money transfer companies appear to be getting into the "editorial decision" business, too. PayPal judges an account "political" and fears it will complicate its relationship with the Russian government.
Courts of law exist to handle harassment, defamation, illegal money transfers and every known kind of abuse. Those who have a problem with any activity for which Internet companies are conduits should sue those who carry out that activity. It's not as easy as hitting a button, but then silencing someone shouldn't be easy. People who apply to Google to have links removed ought to deal directly with the offending sites, through the courts if necessary. So should those who complained about the anti-Russian posts on Facebook. And whoever complained about the PayPal account should simply slink away, because it's hard to imagine a court that wouldn't throw out such a complaint.
Internet companies are, essentially, utilities. It would make sense to regulate them so they are neither forced nor tempted to referee disputes.
This column does not necessarily reflect the opinion of Bloomberg View's editorial board or Bloomberg LP, its owners and investors.
To contact the author of this story:
Leonid Bershidsky at firstname.lastname@example.org
To contact the editor of this story:
Max Berley at email@example.com