Until recently, the big news in the world of news was that Facebook was retreating from journalism. After an unexpected dip in the personal sharing that is its core business, plus a mini-scandal involving allegations of political bias in how it displayed content from conservative websites, Facebook said it was updating its algorithm to prioritize wedding announcements and baby photos over postings by media companies. “Friends and family come first,” the company said in a June 29 blog post.
And when Chief Executive Officer Mark Zuckerberg announced the Facebook Live video function, he presented it as a platform for life’s small trials and triumphs. “You can feel like you’re really there with your friends,” he said on April 6, when the service launched. Among the videos he praised: a young man’s haircut as it happened, a woman skiing downhill with her kids, and a zoo camera trained on some baby birds. “Everyone is tuned in, watching these cute bald eagles, wondering what’s going to happen,” he said, with a wide grin. “It’s kind of a new thing.”
The sentiment suddenly feels quaint. On July 6, during what should have been a routine traffic stop, a police officer in suburban Minneapolis fired multiple shots at Philando Castile, a 32-year-old black man. Seconds later, as he slumped, bloody and gasping for air next to her in the car, Diamond Reynolds, his girlfriend, opened the Facebook app on her smartphone and pressed the Go Live button. She narrated calmly, panning from the gun pointed in her direction to her dying companion, and even kept the broadcast going as she was thrown to the ground, cuffed, and taken into custody. “It’s OK, Mommy,” her 4-year-old daughter could be heard saying in the back seat. “I’m right here with you.”
The next day, Facebook was used by witnesses in Dallas to broadcast live footage of the attack that left five police officers dead and seven others wounded at a Black Lives Matter protest organized in response to the shootings of Castile and Alton Sterling in Baton Rouge, La. In the aftermath of the violence, Facebook Live was inescapable, as public figures took to the platform to process in real time what had happened. “If you are a normal white American, the truth is you don’t understand being black in America, and you instinctively underestimate the level of discrimination and the level of additional risk,” Newt Gingrich told CNN commentator Van Jones in a Facebook Live interview.
Broadcasting video in real time on smartphones isn’t new. Twitter’s Periscope made headlines last year when it enabled users to stream parties from the South by Southwest festival in Austin and unauthorized coverage of the Oscars. But none of the companies that have rolled out live video have Facebook’s scale or technological know-how. With 1.65 billion users—more than half of whom log in every day—footage can quickly command an enormous audience. And live videos are archived, adding even more viewers. Reynolds’s video of Castile’s death drew more than 5 million views on Facebook within a day of the incident and was rebroadcast on several news channels.
The push for live video accelerated in February at an all-hands meeting at Facebook’s campus in Menlo Park, Calif., when Zuckerberg said the format would be central to the company’s future. The new feature represents a technical challenge, taxing cell phone networks and Facebook’s own servers—even Zuckerberg’s own videos have cut out at times. Converting the video right away to work on hundreds of different devices at once is anything but simple. When a user goes live, Facebook must ensure it can process the footage, regardless of the source, and transmit it instantly. “The infrastructure for live-streaming is hard,” Chief Product Officer Chris Cox said in a 2015 interview. “It’s something we’ve been working on for a long time.” Facebook’s custom-manufactured servers, set up around the world to handle any sudden demand for data streaming, helped Reynolds’s stream from the passenger seat of a car go viral almost instantly.
Live-streaming at such a speed and on such a scale raises legal and ethical questions. At least five people this year have been shot while broadcasting with Facebook Live. One, a man in Chicago, was killed. In Paris, another man, an apparent Islamic State sympathizer, streamed threats after he allegedly murdered a French police commander and his partner. In Milwaukee, two 14-year-olds and a 15-year-old filmed themselves having sex. (Facebook deleted the Paris and Milwaukee videos; the footage of the Chicago murder is still available.)
Videos are routed through a content-moderation system that's still a work in progress. If a live-stream or archived video is flagged as inappropriate by even a single Facebook user, it's sent to one of four call-center-like content-moderation operations, in Menlo Park, Austin, Dublin, and Hyderabad, India. Moderators are instructed to interrupt any live-stream that violates Facebook's community standards, which include bans on threats, self-harm, "dangerous organizations," bullying, criminal activity, "regulated goods," nudity, hate speech, and glorified violence. The gatekeepers weigh the public-interest value of a given video against these standards.
“Facebook is in a position of power,” says Jonathan Zittrain, the director of Harvard’s Berkman Klein Center for the Internet and Society. “At some point Facebook will be asked to shut down a live feed to make sure something doesn’t go viral,” he says. The company “needs to be upfront about the decisions it’s making and the pressures under which it’s making them.” The events of the past week have sparked more discussion at Facebook about the company’s role in such situations.
Facebook has said it hopes to use artificial intelligence to help make such split-second judgments, but the technology is a long way off. “You can have filters for certain words, but AI isn’t going to solve what happened in Minnesota,” says Blagica Bottigliero, a vice president at ModSquad, which uses a network of 10,000 contractors worldwide to moderate online content for the NFL and Warner Bros., among others. “You need the judgment of someone looking at the content and bringing in context, and even in those situations they can get it wrong,” Bottigliero says.
The aftermath of the Reynolds video illustrates the difficulty of curating newsworthy but violent content. Early on July 7, Facebook took down the video without explanation, then restored it an hour later with an apology and a disclaimer noting its explicit content. This led to news reports citing anonymous sources who claimed the police had deleted the video while Reynolds was in custody. Facebook spokeswoman Andrea Saul stands by the company's explanation that the takedown was a "technical glitch." A company statement described the incident as "one of the most sensitive situations," saying: "We've learned a lot over the past few months and will continue to make improvements to this experience wherever we can."
The same day, Zuckerberg addressed the shooting in a Facebook wall post. "The images we've seen this week are graphic and heartbreaking," he wrote. "I hope we never have to see another video like Diamond's." More such videos will almost certainly come, however, and Facebook will again be a news site, whether it wants to be or not.
The bottom line: Facebook has yet to figure out how to moderate the potentially explosive content its 1.65 billion users could live-stream.