How Snapchat Has Kept Itself Free of Fake News
To get a sense of how rough things are for Snapchat, consider the Dancing Hot Dog. In June the disappearing-messages app began letting users overlay their videos with a squat-legged cartoon wiener. It quickly became an internet sensation. Not that investors cared.
On parent company Snap Inc.’s quarterly earnings call in August, Chief Executive Officer Evan Spiegel boasted that the break-dancing tube steak had been seen more than 1.5 billion times on Snapchat, calling it “likely the world’s first augmented-reality superstar.” This goofy glory didn’t do much to soften the bad news: The company had brought in less than $200 million in ad revenue and lost more than $400 million. After Spiegel answered a question about Snapchat’s slowing user growth, an analyst, who apparently didn’t realize his line was live, said, “I didn’t even understand his response,” sounding as if he were about to burst out in laughter. This was before the 27-year-old CEO said investors should have faith in Snap’s chances against much bigger rivals because “we’ve always been last to market.”
In the run-up to its initial public offering in March, Snap was considered one of the tech industry’s most promising unicorns. Now its share price is down almost 50 percent from a day-two peak, and on Oct. 20 the company said it was laying off 18 recruiting staffers, signaling that its 2,600 employees are likely to meet far fewer new co-workers. Analysts say the problem is the near-monopoly power of Google and Facebook Inc., which together control the most popular social networks and most of the mobile ad market. It’s also tough for Snapchat to add to its 173 million daily users when Facebook, with a daily audience almost eight times that size, relentlessly copies its best features.
For people who expected Snap to be the next Facebook, its stumbles appear disastrous. But while Snap hasn’t figured out how to be a profitable advertising business, it’s proving to be a much more competent media company than either Facebook or Google—and not just because it knows from dancing meats.
Since September, Facebook and Google have acknowledged selling political ad space to Kremlin-affiliated groups that spread false stories about the 2016 U.S. presidential election. While company representatives prepare to testify before the House and Senate intelligence committees on Nov. 1, Robert Mueller, special counsel for the U.S. Department of Justice, is reviewing the ad buys for evidence that Russian agents colluded with the Trump campaign. There are signs of Russian activity on almost every American social network of any consequence, including Twitter, Reddit, Tumblr, Pinterest, LinkedIn, and even the smartphone video game Pokémon Go. (Bloomberg LP is developing news programming for the Twitter service.) Snapchat, however, has found no evidence of political ad buys by anyone in Russia. In fact, Snapchat appears to have no fake news at all.
The secret? “Humans,” says Nick Bell, Snap’s vice president for content. “We only work with authoritative and credible media companies, and we unashamedly have a significant team of producers, creators, and journalists.”
Whereas Facebook deliberately blurs the line between personal status updates, news articles, and ads—sticking all three in its constantly updating, algorithm-driven News Feed—Snapchat has taken a more old-fashioned approach. The app’s news section, Discover, is limited to professionally edited content, including dozens of channels maintained by old-media outlets such as the Wall Street Journal, the Daily Mail, the Economist, and People. Snapchat’s coverage of college campuses is overseen by a group of student-run daily newspapers. Its three regular newscasts come from CNN, NBC, and E!. Peter Hamby, a reporter hired from CNN, anchors its weekly in-house political documentaries.
As with Facebook and YouTube, part of Snapchat’s appeal is watching videos unmistakably shot on cell phones by regular people. Sometimes those videos become newsworthy; when that happens, Snapchat includes them in Our Stories, short-form news updates that combine user-generated material with professional camerawork. But unlike newsy user-created videos on Facebook and YouTube, Snapchat’s are vetted before they can reach a wide audience. Staff reporters and producers edit Our Stories, check facts, and clear the stories with lawyers, like a traditional broadcast team. Much of Snap’s revenue comes from ads that appear in its professionally curated videos, and the company is betting that trustworthy content will ultimately prove more appealing to viewers and advertisers alike.
While Snapchat has embraced its role as a curator of news, Facebook has strenuously objected to suggestions by members of Congress—or anyone else—that it’s in the media business. “When you cut off speech for one person, you cut off speech for all people,” Facebook Chief Operating Officer Sheryl Sandberg told news site Axios in October. “The responsibility for an open platform is to let people express themselves. We don’t check the information people put on Facebook before they run it, and I don’t think anyone should want us to do that.” Translation: You, the consumer, bear the responsibility for distinguishing which items in the News Feed are actually news, and which are paid messages from whoever can afford them.
Sensing a rare chance to make Facebook look bad, Snap is playing up its commitment to traditional news values. If it can steal a few ad dollars from its competitors and rectify its earnings setbacks, so much the better, but the company says recent events are simply proof that it’s been on the right path all along. “From the very beginning,” says Chief Strategy Officer Imran Khan, “we’ve felt a responsibility to make sure our community knows where their news and information is coming from.”
Even when Snapchat was mostly known as a sexting app, Facebook appeared to be watching it closely, with more than a little jealousy. Since 2013, when Spiegel rejected Mark Zuckerberg’s $3 billion buyout offer, the joke in Silicon Valley has been that Snap runs Facebook’s R&D lab. Facebook’s four main products—its namesake service, Messenger, Instagram, and WhatsApp—all include features almost indistinguishable from ones that made Snapchat famous, such as self-destructing messages, a video-diary feature called Stories, and augmented-reality gimmicks along the same lines as the dancing hot dog. Instagram Stories, rolled out last year, attracts close to 50 percent more viewers than all of Snapchat. (Snap says those numbers don’t tell the whole story.)
Despite their products’ similarities, the corporate cultures of Facebook and Snap are very different. Zuckerberg, the earnest-sounding if awkward Silicon Valley geek, speaks in lofty terms about Facebook’s ambition to “bring the world closer together.” He makes policy proclamations in the mode of a world leader and regularly talks up Facebook’s voter registration efforts during the 2016 election. Spiegel is a creature of Los Angeles, more likely to crack wise than rhapsodize about the grand sweep of history. He calls Snap “a camera company” and doesn’t talk publicly about politics.
These differences in temperament carry over to the companies’ divergent philosophies about the media. Where Facebook treats news as just more crumbs for its enormous content maw, leveraging its audience size to compel press outlets to give up their product for free, Snap has aggressively courted conventional gatekeepers and tastemakers. “This is not social media,” a 2015 company blog post reads. “Social media companies tell us what to read based on what’s most recent or most popular. We see it differently.”
Every time you visit a Facebook property, the company’s algorithms crunch through an enormous trove of data about you and everyone you know, everything you’ve clicked on, and everything that people like you are likely to click on; then you’re shown the posts and ads you’re most likely to consume and share. This is how Russian agents were able to reach, by Facebook’s estimation, 10 million Americans with just $100,000 worth of ads.
Snap, on the other hand, is all about privacy. Instead of encouraging users to build big audiences by interacting in public view with people they don’t know in real life, Spiegel has said he wants you using Snapchat mainly to riff with “your very close friends.” This has produced a whiff of unsavoriness—Snapchat’s disappearing messages have been used by child pornographers and insider traders—but it has also largely insulated the service from fake news.
Snapchat Stories can’t be publicly shared unless you take a screenshot and post it on, say, Facebook or YouTube. On Snapchat itself, even the public Stories feature is designed to make individual clips hard to find unless you know exactly what you’re looking for. Snapchat doesn’t use algorithms to try to keep people clicking on new material; the only posts you see when glancing at the app have either come from your friends or been vetted for Our Stories. As a result, posts by individuals almost never reach more than a few hundred viewers. “Snap is first and foremost about your friends, not about building a large following,” says Bell, the content chief. “If an individual story gets hundreds of thousands of views, a team of our editors looks at it, including me.” Any post with more than a few thousand views is typically reviewed by at least one Snapchat journalist and, if necessary, fact-checked for inclusion in Our Stories.
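Snap hasn’t published its moderation logic, but the tiered review policy described above can be sketched in a few lines. Everything here is illustrative: the function name, the tier labels, and the specific threshold numbers are assumptions standing in for “more than a few thousand views” and “hundreds of thousands of views.”

```python
# Illustrative sketch only; Snap's actual system is not public.
# Models the policy described in the article: posts are friends-only by
# default, and public Stories that cross view thresholds are routed to
# progressively more senior human review before wider distribution.

REVIEW_THRESHOLD = 5_000        # stand-in for "more than a few thousand views"
EDITOR_ESCALATION = 100_000     # stand-in for "hundreds of thousands of views"

def route_post(view_count: int, is_public: bool) -> str:
    """Return the handling tier for a post under the described policy."""
    if not is_public:
        return "friends-only"            # never enters wide distribution
    if view_count >= EDITOR_ESCALATION:
        return "senior-editor-review"    # "a team of our editors looks at it"
    if view_count >= REVIEW_THRESHOLD:
        return "journalist-review"       # vetted, possibly fact-checked
    return "unreviewed-public"           # small audience, no amplification
```

The design choice the sketch captures is that virality itself is the trigger: a post earns human scrutiny precisely by gaining an audience, so nothing reaches mass distribution unreviewed.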
Snap has proven it can get a scoop without sacrificing reliability. During the white nationalist rally in Charlottesville, Va., in August, Bell’s team assigned a producer in New York to create dispatches for Our Stories. That meant scanning public Snapchat posts from within a few blocks of the protests, and gathering video and interviews from Snapchat-using journalists on the scene. Around 3 p.m. on Aug. 12, a short video clip posted on Snapchat appeared to show police arresting James Alex Fields Jr., the man who allegedly drove a car into a crowd of counterprotesters, killing 32-year-old Heather Heyer and injuring 19 other people. On Facebook, Twitter, or YouTube, the footage would have gone viral before it could be confirmed. In fact, screenshots of the Snapchat video appeared on other social networks almost immediately. But rather than post the clip widely, a Snapchat producer spent hours comparing its time and location data with other users’ footage of the attack, and repeatedly called and texted Charlottesville Police Department officers in an attempt to verify the arrest.
The clip appeared in Our Stories at about 7 p.m., after the Snapchat producer spoke to police. Even then, the producer replaced the user’s caption (“got em”) with a more cautious statement that the video “appears to show an arrest” of the suspected attacker. Snapchat anchor Hamby and the head of original content, Sean Mills, both signed off on the post. “It’s not just humans making judgment calls,” Hamby says. “We make phone calls.”
That this qualifies as a boast is a testament to how poorly other tech companies have acquitted themselves in presenting news. In the days after the Aug. 12 attack, Facebook, Google, and Twitter were flooded with evidence-free stories suggesting that Charlottesville was a “false flag” attack perpetrated by left-wing extremists, Jews, and/or extreme left-wing Jews. Later that week, Facebook began deleting links to an article published by the neo-Nazi website the Daily Stormer that circulated widely on the social network. The article called Heyer a “fat, childless … slut.”
In October, after the murder of 58 concertgoers at a Las Vegas country music festival, Google News featured a story that identified an innocent man as the shooter. The publisher of that story: 4chan, an anarchic online forum known for allowing racism, misogyny, conspiracy theories, and trolling. Google apologized, promising it would “make algorithmic improvements to prevent this from happening in the future.”
That kind of material wouldn’t make it far on Snapchat, Hamby says, because “we’re in essence a walled garden.” As an example, he says, if somebody tried to post a phony video of a shark swimming through the streets of a hurricane-devastated city, Snapchat’s editors would catch it and make sure it didn’t find a wider audience on the service. “You can’t introduce a shady article without hitting a layer of editors,” he says. The urban shark example wasn’t hypothetical. In September a video that purportedly showed sharks flagrantly violating Miami traffic laws after Hurricane Irma racked up thousands of mentions in Facebook’s News Feed, despite being repeatedly debunked by Snopes.com and others. It has been viewed more than half a million times on YouTube.
As they’ve tried to ward off Washington regulation, Silicon Valley executives have made Russia’s meddling in the 2016 election sound like a problem of almost unimaginable scale. Zuckerberg’s halting response to concerns about “election integrity,” as he put it in a Facebook Live video on Sept. 21, included a nine-point plan and a caveat about the scale of the task. “We are in a new world,” he said. “It is a new challenge for internet communities to deal with nation-states attempting to subvert elections.” Facebook’s own disclosures, however, suggest that the Russian tactics were hardly more sophisticated than those of 4chan trolls and could’ve been easy to spot had the company made a concerted effort.
Many of the Russia-linked ads and Facebook posts, which the company hasn’t released but which have been made public by the New York Times and others, included the sorts of grammatical errors and spelling mistakes you might associate with your ill-informed cousin or perhaps a non-native English speaker. In some cases the buyers paid in rubles.
Putting human eyes on user-generated content isn’t cheap, of course, but Snap has managed to do it on a reasonable budget, thanks to design decisions that limit the spread of propaganda. Its entire content operation is staffed by fewer than 100 people. Adopting similar measures would cost Facebook more, especially at its far larger scale, but the adjustments wouldn’t have to be extreme. For now the company has said it plans to add 1,000 employees to keep an eye on its ads. That’s a start, but Facebook could also easily create its own Snapchat-like news product, walled off from the rest of the service, overseen by editors, and populated exclusively by reputable news organizations.
Facebook had something like this, a human-curated sidebar called Trending Topics that lived next to its News Feed. But the company laid off the small team of a dozen or so Trending Topics editors in August 2016, after conservatives claimed the section was biased. In retrospect, the decision to fire editors months before an election in which foreign agents gamed its algorithms was, to put it gently, poorly timed. Facebook and its peers can use all the human help they can get.