Twitter, Facebook Sued by French Jewish Group Over Hate Speech
Jewish group filed legal complaint also targeting YouTube
Lawsuit seeks clarity on how social networks are moderated
A French Jewish youth group sued Twitter Inc., Facebook Inc. and Google over how they monitor hate speech on the web, highlighting the challenge -- and potential costs -- for Internet platforms to regulate user-generated content.
The lawsuit, filed at a Paris court by the Jewish youth group UEJF, seeks more clarity on who moderates social network posts and how that moderation is carried out. Together with an anti-racism group and an anti-homophobia group, UEJF published a report earlier this month showing that Twitter, Facebook and Google’s YouTube deleted only a small number of posts flagged as hateful, threatening or promoting violence.
Over the course of about six weeks in April and May, members of French anti-discrimination groups flagged unambiguous hate speech that promoted racism, homophobia or anti-Semitism. More than 90 percent of the posts reported to Twitter and YouTube remained online an average of 15 days after removal was requested, according to the study by UEJF, SOS Racisme and SOS Homophobie. Facebook’s screening led to the deletion of a third of the posts.
“There needs to be more moderation, but it’s complicated and expensive,” Gilles Clavreul, the delegate to France’s prime minister on fighting racism and anti-Semitism, said this week on French radio, after the research was published. “There’s only one way for social networks to improve: By hiring more people.”
Governments from Israel to Germany are pressing Internet platforms to step up the fight against hate speech as those platforms become the stage for everything from political activism to the promotion of terror. Criticism has intensified since the terror attacks in Paris last year, with administrations from the U.K. to the U.S. calling on Silicon Valley companies for help in the fight against terrorism.
Representatives for Twitter and Facebook didn’t immediately respond to requests for comment. A representative for Google in Paris said the company had no comment.
According to the report, the content still online includes a comment on Facebook saying that “homosexuals are disgusting;” a YouTube video that uses a derogatory term for black people and said “go back home you apes;” and a Twitter post that applauded the Brussels attacks and stated “Death to the Jews.”
The platforms all have guidelines that ban users from promoting violence or threatening others on the basis of race, sexual orientation or religious affiliation. Deciding to delete content isn’t so simple, though -- in the end, a human has to review whether a post flagged by users is inappropriate.
While Facebook and YouTube, for instance, have moderators operating globally in multiple languages, 24 hours a day, they face a challenge from the sheer size of their user bases of more than 1 billion each. Twitter has 310 million monthly active users, according to its website.
“We want answers on how much is invested into moderation and how it’s done -- that’s why we’re going forward with this civil procedure,” said Stephane Lilti, UEJF’s lawyer on this case, who confirmed the complaint had been filed Thursday. “I think there isn’t enough money put in and the moderators aren’t necessarily prepared appropriately for the challenge.”