Facebook Shares Rules for Censoring Violence, Sex, Guardian Says
- Social network outlines policies for removing certain material
- Moderators review potentially offensive images on Facebook
Facebook Inc. created a rule book for moderators to use when censoring the posts of its nearly 2 billion users, responding to global criticism for failing to prevent the circulation of images of violence, sex, hate speech and other controversial material, The Guardian reported.
Facebook relies on thousands of human moderators to review potentially offensive posts, including videos of death, violence, sexual material, abuse and threatening speech. The Guardian said it obtained copies of thousands of slides and pictures that Facebook shared with moderators last year as guidelines. Many moderators, the newspaper reported, feel overwhelmed by the volume of posts they must review and confused by apparent contradictions in Facebook’s policies.