Google Isn't Swaying Voters, But It Could
Long before artificial intelligence brings about the singularity, algorithms are already influencing our most important decisions, including which candidate to back in an election. The danger that this influence could go too far is real, and we probably need some well-considered regulatory intervention.
Earlier this week, the U.S. psychologist Robert Epstein published a harsh article about Google's alleged manipulation of its search suggestion feature. When you start typing in Google's search window, it suggests ways to autocomplete the request. Epstein showed, for example, that when a user entered "Hillary Clinton is...," Google suggested finishing the sentence with "is winning" or "is awesome." Other search engines, Bing and Yahoo, completed the sentence differently: "Hillary Clinton is a liar."
Epstein went on to give other examples of the purported bias and claimed that his research showed that the manipulation of search suggestions could "shift between 800,000 and 3.2 million votes" in the U.S. presidential election.
There are several reasons to question Epstein's claims. He has a history with Google: the company blocked his website in 2012 after detecting a malware infection, and the dispute escalated into a public fight. I asked Epstein why he became interested in Google's ability to manipulate search results, and he didn't mention the episode, claiming a purely scientific curiosity. Next, the article was published by Sputnik, a notorious Russian state-owned propaganda site. Epstein told me in an e-mail it was "an experiment, and it has worked out well so far: multiple simultaneous translations, indexed everywhere (including on Google News), and they didn't change a single word I wrote." Still, I suspect the choice of publication instantly hurt Epstein's credibility with the mainstream press: Among relatively popular outlets, only the U.K. tabloid Daily Mail, Fox News and Breitbart picked up the story.
Finally, the most striking examples in Epstein's piece are easy to refute. Autocomplete suggestions are a moving target. I entered "Hillary Clinton is" into the Google search box today and got "Hillary Clinton is dead" and "Hillary Clinton is toast" as the first results. The algorithm did suggest "awesome," too, but then the suggestions for "Donald Trump is" were similar: "Donald Trump is dead" and "Donald Trump is orange" -- but also "Donald Trump is going to win."
The other search engines are harsher on both. Bing's suggestions included completing the Trump request with "a lying racist" and "the antichrist," and the Clinton one with "a lying crook" and "a practicing witch."
Epstein, however, is definitely on to something. Google isn't hiding that its algorithm picks and chooses among its autocomplete suggestions.
On June 10, Tamar Yehoshua, who leads the company's mobile search team, explained on Google's official blog that the suggestion algorithm "is designed to avoid completing a search for a person’s name with terms that are offensive or disparaging" because results based merely on the frequency of real-life requests were "too often" of that nature.
Yehoshua made no apology for it -- instead, she asked users to let Google know when they ran across a suggestion they considered offensive. I suspect both presidential candidates have staff members savvy enough to contact Google. She also stressed that the suggestions didn't determine the search -- people could ignore them and look for whatever they wanted. That's beside the point: interesting autocomplete options often divert users from their original query.
I doubt anyone except the candidates themselves and their diehard supporters would want Google to suppress the insulting search options. I certainly don't, and neither should voters who are still making up their minds.
Epstein is right about something more profound than the accusations that Google, whose founders and top executives lean Democratic, might be messing with various bits of its search algorithms to skew results in favor of Clinton. It's hard to make such charges stick, and as someone who often uses the search engine for work, I can attest that I find everything I want about the contenders, both positive and negative.
What's more important is that Google has the ability to skew results without getting caught.
Last year, Epstein and a collaborator published a paper in the prestigious, peer-reviewed Proceedings of the National Academy of Sciences, showing that manipulating search rankings -- pushing certain results to the top and burying others -- had the power to swing votes. That paper didn't try to catch out Google: The researchers did the manipulating themselves. If Google did the same, no one might notice, as Epstein correctly points out.
That possibility is scarier than traditional media bias. Partisanship in journalism is usually noticeable, and a broad media palette ensures some balance of viewpoints.
"This is much worse," Epstein said of search engine manipulation, "not only because of the scale of the problem (Google controls 90 percent of search in most countries) but because the manipulations are almost entirely invisible to people. When sources of influence are invisible, people mistakenly believe they have made up their own minds."
All Google has been able to say in response is that it doesn't do such things. Amit Singhal, who retired as head of Google Search this year, wrote in a column for Politico a year ago:
Google has never ever re-ranked search results on any topic (including elections) to manipulate user sentiment. Moreover, we do not make any ranking tweaks that are specific to elections or political candidates. From the beginning, our approach to search has been to provide the most relevant answers and results to our users, and it would undermine people's trust in our results, and our company, if we were to change course.
Google and the social networks are the main conduits of information in today's world. They can help spread it, but they can also censor it, openly -- as Google does with the autocomplete algorithm -- or covertly.
It may seem that the latter is too risky for the company's reputation since it has competitors, and its search results can always be compared to theirs. Yet such comparisons could prove nothing except that the different search companies' algorithms are based on slightly different views of what's relevant to a majority of users. After all, they have different user bases.
Disclosing these algorithms to governments and submitting to regulation that would ban the unfair tweaking of search tools would be fraught with dangers: Governments are themselves tempted to look for ways of skewing public opinion. And yet it's wrong for society to depend so heavily on a handful of private companies for important information. The European Union has forced Google to make changes to its products; this interference hasn't always been constructive, but it can serve as a starting point for figuring out how to keep the major digital platforms impartial.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
To contact the author of this story:
Leonid Bershidsky at firstname.lastname@example.org
To contact the editor responsible for this story:
Max Berley at email@example.com