How Facebook Hopes to Influence the Election

The social network knows that increasing voter engagement is good for business.

If you log onto Facebook today—and more of us will, for better or for worse, than will pick up a newspaper—you’ll see a perky little red, white, and blue button bearing one word, in all caps: "VOTE." No, Mark Zuckerberg has not figured out a way to help users of his social network bypass the lines at polling stations. Instead, clicking the button lets you share the voting experience with your Facebook friends and routes you to a site where you can find polling locations and hours, as well as a sample ballot.

The project is a collaboration among Facebook, the Pew Charitable Trusts’ Voting Information Project, the Internet Association, Google, Amazon, AOL, Bing, Reddit, Foursquare, Lyft, Twitter, Tumblr, and others. It goes to show that Facebook has skin in the voting game—and, even more than skin, clout.

Since 2008, Facebook has tested and deployed a variety of tools—the widget you see today is what it calls a "voter megaphone"—to encourage its users to vote. In 2012, the company’s data scientists, together with a number of academics, published a paper in Nature titled "A 61-Million-Person Experiment in Social Influence and Political Mobilization." It showed that positive social pressure persuaded people to vote. Their conclusion: “It is possible that more of the 0.6 percent growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook."

Or as a company spokesperson put it today: “We have learned over the past few years that people are more likely to vote when they are reminded on Facebook and they see that their friends have voted. We’re proud of this. Our effort is neutral—while we encourage any and all candidates, groups, and voters to use our platform to engage on the elections, we as a company have not used our products in a way that attempts to influence how people vote."

Facebook reports that, between July 10 and yesterday, November 3, "28 million people on Facebook in the United States made 184.2 million interactions (likes, shares, comments) regarding the midterm elections." California, the nation's most populous state, has been the most engaged by conversation volume on the social network, while Florida Governor Rick Scott has been the most talked-about candidate.

Now that there is a massive audience for political chatter on Facebook, the biggest question for the company is how to manage it. Earlier this year, Facebook revealed that it had manipulated the content posted on 689,000 users’ home pages—their news feeds—to determine whether it could affect people’s moods. Not surprisingly, it could. Facebook, in science-fiction fashion, called the process “emotional contagion.” The study of positive and negative content concluded that "emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

A few days after its conclusions came out, the scientific journal that published the study—a joint undertaking by Facebook, Cornell, and the University of California—apologized: the research had manipulated users without their prior consent. But Facebook isn’t obligated to follow the established ethical principles of academic research. It’s a business, and it has only to keep to the terms of service it defines. Consent isn't its data scientists' chief concern; experiments like this are part of how the business runs all the time.

As a business strategy, Facebook seems to have figured out that increasing voter participation also drives engagement on the social network—a win-win, some would say, for the democratic process. A spokesman for Facebook told the Huffington Post, "We believe that encouraging civic participation is an important contribution we can make to the community." Of course, only now are we learning about the secret experiments Facebook conducted on past election days. If and when the company decides to make its contribution more partisan—or to push the platform of Zuckerberg's own lobbying group—we may not know that, either.
