Twitter moves to dismiss child porn lawsuit citing Section 230 immunity

'Given that Twitter's alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone'

Citing its immunity under Section 230 of the Communications Decency Act, Twitter has filed a motion to dismiss a lawsuit from a minor who claims the social media platform refused to remove child porn that featured him and another 13-year-old.

"Congress recognized the inherent challenges of large-scale, global content moderation for platforms, including the potential for liability based on a platform's alleged 'knowledge' of offensive content if it chose to try to screen out that material but was unable to root out all of it," Twitter's legal team states in a motion to dismiss filed Wednesday with the U.S. District Court for the Northern District of California. "Hoping to encourage platforms to engage in moderation of offensive content without risking incurring potentially ruinous legal costs, in 1996 Congress enacted Section 230 of the Communications Decency Act ('CDA § 230'), granting platforms like Twitter broad immunity from legal claims arising out of failure to remove content."

"Given that Twitter's alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone," Twitter argues.

The 17-year-old plaintiff, a sex-trafficking victim identified as "John Doe" because he is a minor, alleges in a lawsuit filed in January that Twitter refused to take down pornographic images and videos of him and another teen that surfaced in 2019 because the company "didn't find a violation of our policies."

The plaintiff's mother, Jane Doe, allegedly had to contact a U.S. Department of Homeland Security agent who reached out to Twitter and convinced the company to take down the images and video.

The lawsuit accuses Twitter of benefiting from child sex trafficking, failing to report known child sexual abuse material, knowingly distributing child pornography, intentionally distributing non-consensually shared pornography, and possessing child pornography, among other claims.

In the motion to dismiss, Twitter said the plaintiff "appears to have suffered appallingly" at the hands of his sex traffickers, but denies culpability for the images and videos shared on its platform:

This case ultimately does not seek to hold those Perpetrators accountable for the suffering they inflicted on Plaintiff. Rather, this case seeks to hold Twitter liable because a compilation of that explicit video content (the "Videos") was — years later — posted by others on Twitter's platform and although Twitter did remove the content, it allegedly did not act quickly enough. Twitter recognizes that, regrettably, Plaintiff is not alone in suffering this kind of exploitation by such perpetrators on the Internet. For this reason, Twitter is deeply committed to combating child sexual exploitation ("CSE") content on its platform. And while Twitter strives to prevent the proliferation of CSE, it is not infallible.

But, mistakes or delays do not make Twitter a knowing participant in a sex trafficking venture as Plaintiff here has alleged. Plaintiff does not (and cannot) allege, as he must, that Twitter ever had any actual connection to these Perpetrators or took any part in their crimes. Thus, even accepting all of Plaintiff's allegations as true, there is no legal basis for holding Twitter liable for the Perpetrators' despicable acts.
