Silicon Valley is pushing live video as the next big thing.
Facebook, Google and Twitter have in recent months created tools for people to broadcast directly from their smartphones, trumpeting them as a boon for democracy as well as plain entertainment. Just last week, members of the U.S. Congress used their phones to broadcast their sit-in over gun control measures after TV news feeds were cut off.
As ever on these free-to-use services, the goal is to sell more ads. Video ads are more profitable than search or banner ads.
But the Californian optimism of these companies is now meeting harsh reality: criminals and terrorists using these live tools to broadcast some very horrible stuff. Two weeks ago, for example, an Islamic State sympathizer posted a 13-minute video of himself murdering two police officers in their home in a Paris suburb.
For Facebook, Google and Twitter, preventing the abuse of live video isn't just a moral responsibility. There's a real commercial imperative to fix it -- but little sign the companies are doing enough.
These companies all need to attract advertisers. Spending on digital ads will overtake television globally next year, and the threshold has already been crossed in the U.K. and U.S. markets.
Key to Facebook, Google and Twitter's efforts is convincing marketers that they can build their brands online; such fuzzy, emotion-inspiring marketing has traditionally been done via 30-second television spots.
But, as one executive responsible for spending millions of dollars marketing a popular skin cream recently told me, there is no way he'd authorize any live video for his brands if it risks being associated with violence or crime. It poisons the environment, as he put it.
To some extent these aren't new problems. Advertisers have long had to navigate objectionable material as they follow audiences online, where people now spend more time than with print media or television. But live video -- a tool previously only in the hands of big broadcasters subject to regulation -- is different. It heightens the material's immediacy, and makes it much harder to control.
Tech companies currently rely on "community standards" -- rules that lay out what kinds of content are allowed -- as well as on their users to flag material that falls out of bounds. Teams of moderators are then responsible for removing the video. In an effort to prevent bullying, Twitter has invested in tools to monitor comments that people post on its Periscope app.
This all falls woefully short. It took 11 hours for Facebook to remove the recent murder video, according to Agence France-Presse.
More needs to be done, and there are few signs that the tech companies are prioritizing this by putting their armies of brilliant coders on the task. Google and Facebook are using automated tools to identify and remove some extremist content from their sites, Reuters reported last week.
Writing algorithms to scan content in real time won't be easy -- or certain of success -- but the companies must try to do even more. They can certainly afford it. Google and Facebook are sitting on cash piles of $80 billion and $21 billion respectively.
Failing a technological fix, more human oversight will be needed. The companies don't disclose how many people they employ to monitor and take down material. (It would be instructive to compare that figure with the size of their ad sales teams.)
Another option may be subjecting live video to a few seconds of delay, as many television broadcasters do as a condition of their broadcasting licenses. The tech companies' argument that delays would dent the free-wheeling appeal of live video is unconvincing.
When Mark Zuckerberg introduced Facebook Live in April, he proclaimed live video to be a "new, raw, personal, spontaneous way people can share". He couldn't have been more right -- sometimes awfully so.
This column does not necessarily reflect the opinion of Bloomberg LP and its owners.
To contact the author of this story:
Leila Abboud in Paris at email@example.com
To contact the editor responsible for this story:
Edward Evans at firstname.lastname@example.org