Facebook is facing a repeat of its mobile crisis of 2012.
When the company went public more than four years ago, it was a smartphone disaster. It had only rudimentary mobile apps, and it was starting from scratch to rework its advertising business for people who browsed Facebook on their phones. The mobile challenge was the biggest reason Facebook lost half of its stock market value, or roughly $50 billion, in the first few months after its IPO.
Under Mark Zuckerberg's direction, Facebook turned itself inside out to crack smartphones. Within a year or so, Facebook's Marshall Plan for mobile was succeeding. And last quarter, the company generated 84 percent of its advertising revenue from people who viewed Facebook on smartphones and other mobile devices.
Facebook now needs a new Marshall Plan to tackle a trickier pair of crises: bogus information spreading like wildfire among its 1.8 billion monthly users, and the echo chamber within its digital walls.
The problems are obvious to most regular Facebook users. The computer models that assemble the Facebook news feed are tuned to show more of the information they predict users will like and click on. That means people can hang out on Facebook, as if the TV were tuned to MSNBC 24 hours a day, and never be confronted by ideas that challenge their own views. The long and bitter U.S. presidential campaign pushed this echo chamber phenomenon into the headlines, along with the related problem of fake news that circulates there.
Of course, the rest of the internet, television, news outlets and real-life social circles are also filled with misinformation and self-reinforcing news and opinion. Zuckerberg has made that point and has said Facebook exposes people to more diverse viewpoints. But none of those other sources is as powerful as Facebook, or as blind to its influence.
Put simply, the Facebook news feed is the most powerful distribution pipeline for information and news ever created. That's why advertisers are spending more than $20 billion a year for their messages to appear there. If Google is the doorway and foyer for the world's information, Facebook is the living room. Earlier this year, executives said users on average spend more than 50 minutes a day on Facebook plus the company-owned Instagram and Messenger apps. In all this time hanging out on Facebook, the company shapes what information we see and don't see in ways that we can't possibly understand.
I would feel better if Facebook recognized this reality and if executives said they understood that their great power comes with great responsibility. Unlike with the mobile crisis, however, Facebook doesn't acknowledge publicly that the information echo chamber and fake news are big problems.
But it turns out members of Facebook's rank and file feel differently. A string of recent news articles has reported that some Facebook employees are disturbed by the role their company played in creating a filter bubble during the U.S. presidential election and in spreading misinformation written by Macedonian teens chasing a buck or by political candidates vying for votes.
Facebook needs to clearly admit that it has a problem. And then Zuckerberg needs to focus the company, as he did in 2012, on tackling the newer problem of the quality and diversity of information, particularly around heated and high-visibility events such as the presidential election. Facebook said 109 million people in the U.S. wrote or interacted with election-related posts on Facebook in the first nine months of 2016. That is not far off the roughly 130 million people who voted in the election.
This is a trickier challenge than the mobile crisis of 2012. But it doesn’t require Facebook to fact-check the news or hire human editors. Facebook already knows how to clear its news feed of garbage. It changes its algorithms all the time to show more posts it believes are of higher quality.
Companies like Upworthy and Distractify were the web stars of 2012 and 2013 thanks to attention-grabbing but misleading headlines on Facebook like "Watch the First 54 Seconds ... You'll Be Hooked After That, I Swear." Facebook decided it wasn't good for the news feed to be clogged with these types of stories and videos, so it tweaked its computer models and those click-bait posts faded away.
Of course, playing catch-up in mobile was a strategy with a clearly defined solution for Facebook's engineers and business people. Solving the news and misinformation crisis is thornier and has a less obvious impact on user and revenue growth. But for motivation, Facebook can look to Twitter. The abuse of Twitter for harassment and hate speech reportedly spooked potential buyers of the company. Twitter couldn't or wouldn't tackle its troll problem, and that came back to bite the company in its pocketbook.
Solving its mobile threat made Facebook the sixth-most-valuable company in the world. It's time for Facebook to apply its considerable resources and resolve to this newer existential challenge.
Facebook researchers published an article last year that found the company's computer algorithms don't create an echo chamber but rather reflect the information bubble users already make for themselves. Zuckerberg has cited this finding repeatedly, although some independent researchers have disputed Facebook's conclusions.
Twitter, at least, acknowledges that abuse is a problem, and it is trying again to tackle it.
This column does not necessarily reflect the opinion of Bloomberg LP and its owners.
To contact the author of this story:
Shira Ovide in New York at email@example.com
To contact the editor responsible for this story:
Daniel Niemi at firstname.lastname@example.org