The Law Makes It Easier to Traffic Teens for Sex
Can a judicial decision be both tragic and correct? Yes, as the U.S. Court of Appeals for the First Circuit showed yesterday when it upheld the dismissal of claims by underage girls who were victims of sex trafficking facilitated by the website Backpage.com.
The court acknowledged that the young women had made “a persuasive case” that the company “tailored its website to make sex trafficking easier.” Yet it applied the federal Communications Decency Act, which essentially shields apps and websites from liability for third-party material published on their platforms. The court concluded that the suit against Backpage couldn’t continue.
The law indeed blocks the suit, and therein lies the tragedy. The core of the young women’s claim was that Backpage intentionally set up its website to enable illegal sex trafficking after Craigslist closed its “adult” advertising section in 2010 out of concern that it might be used for exactly that purpose. In particular, the plaintiffs alleged that the website adopted specific design features so that phone numbers could be masked and photos stripped of metadata to make them harder to trace.
Backpage allegedly adopted weak filters that enabled traffickers to signal that teenagers were available for sale. It charged only for adult ads, but provided the option of a paid “sponsored ad” that would include an untraceable photo posted by the advertiser. Photos of the plaintiffs were shown in the sponsored ads, and they were repeatedly trafficked and raped as a consequence.
Technically, no court has ruled on the accuracy of the plaintiffs' claims, and Backpage.com denies the allegations. But the First Circuit made it pretty clear that it found the allegations credible. Maybe Backpage wasn’t actively seeking to advertise illegal activity; but it looks like it was happy to have the income from it.
In the Trafficking Victims Protection Reauthorization Act of 2008, Congress established civil liability for anyone who “knowingly benefits . . . from participation in a venture which that person knew or should have known has engaged in” sex trafficking. That provision would ordinarily cover anyone who knowingly facilitated sex trafficking as the young women allege Backpage did.
The catch is the Communications Decency Act of 1996, which says that “no provider or user of an interactive computer service” can be “treated as the publisher or speaker of any information provided by another information content provider.” Put plainly, the website can’t be held responsible for content posted by a third party. That goes a long way to protect Backpage from liability for ads posted on its site.
The court ruled that the plaintiffs’ claim treated Backpage as the publisher of the ads, and it’s hard to argue with the First Circuit’s conclusion. The essence of the plaintiffs’ claim is to hold Backpage responsible for designing its site to enable certain content to be posted there without risk. The design and facilitation of the posting of ads seem like publication, but the content is nonetheless provided by third parties.
It’s highly unlikely that the Supreme Court would grant review of the ruling. Such a request would inevitably be an application by the plaintiffs to correct what they consider the error of the First Circuit; the Supreme Court doesn’t like to do error correction.
But the tragic result shows that congressional action might be appropriate. Assume for the sake of argument that Backpage or a similar website knowingly and intentionally designed its site to profit from the niche business of sex trafficking. Assume further that this could be proven by internal company documents, not just by a circumstantial case.
Such bad-faith efforts to use privacy protections to enable illegal conduct shouldn’t enjoy the shield Congress created. Congress could amend the Communications Decency Act to deny the safe harbor to bad actors who intentionally set out to facilitate horrific conduct.
Admittedly, the point of a safe harbor provision is to give extra protection. It wouldn’t be a good idea to impose liability on firms whose neutral, privacy-protecting platforms are exploited by criminals. It also wouldn’t be good to subject good actors to potentially costly lawsuits every time someone falls victim to abuse of the system.
One solution would be to set an extremely high burden of proof of knowing bad faith. It might even be a good idea to require such proof before a lawsuit can go forward, instead of allowing a circumstantial showing to be enough for plaintiffs to force defendants to provide information. That’s a much higher burden than exists for most lawsuits.
But some remedy is surely needed. Under the First Circuit’s interpretation of the law, a website could set out to enable criminal conduct and still get the protection of the safe harbor. When the crime facilitated is so serious, that’s a bad result.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
To contact the author of this story: Noah Feldman
To contact the editor responsible for this story: Francis Wilkinson