Twitter Tries Using Section 230 to Kill Child Porn Lawsuit


Twitter recently came under scrutiny after a lawsuit alleged that the social media company refused to remove child pornography from its website. Now, the platform appears to be invoking Section 230 to avoid being held accountable for its alleged actions.


The Blaze reported, “Twitter has filed a motion to dismiss a lawsuit from a minor who claims that the social media platform refused to remove child porn that featured him and another 13-year-old, citing its immunity under Section 230 of the Communications Decency Act.”

In the motion, Twitter’s lawyers argue, “Congress recognized the inherent challenges of large-scale, global content moderation for platforms, including the potential for liability based on a platform’s alleged ‘knowledge’ of offensive content if it chose to try to screen out that material but was unable to root out all of it.”

The motion continued: “Hoping to encourage platforms to engage in moderation of offensive content without risking incurring potentially ruinous legal costs, in 1996 Congress enacted Section 230 of the Communications Decency Act (‘CDA § 230’), granting platforms like Twitter broad immunity from legal claims arising out of failure to remove content.”

“Given that Twitter’s alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone,” Twitter asserted.

RedState’s Nick Arama initially reported that the plaintiff, a 17-year-old referred to as “John Doe” because he is a minor, is suing the social media company for refusing to take down a sexually explicit video in which he was molested at the age of 13. Arama wrote: “Sex traffickers tricked a 13 year old into providing them with sexually explicit pictures and videos including him performing sex acts. The videos then subsequently showed up on Twitter in 2019 under two accounts that shared child sex material.”


She also noted, “The child became aware of the videos in January as did his classmates, and he became the subject of teasing, harassment, vicious bullying and led him to become suicidal.”

The suit alleges that Twitter, despite being informed that John Doe was only 13 years old, refused to remove the images and videos showing him and another teen because it “didn’t find a violation of our policies.”

The lawsuit also explains that John Doe’s mother had to contact the Department of Homeland Security (DHS), which then contacted the company and persuaded it to remove the video and images. According to The Blaze, “the lawsuit accuses Twitter of benefitting from child sex trafficking, failing to report known child sex abuse material, knowingly distributing child pornography, intentionally distributing non-consensually shared pornography, and possessing child pornography, among other complaints.”

In the motion, Twitter’s lawyers argue that:

This case ultimately does not seek to hold those Perpetrators accountable for the suffering they inflicted on Plaintiff. Rather, this case seeks to hold Twitter liable because a compilation of that explicit video content (the “Videos”) was, years later, posted by others on Twitter’s platform, and although Twitter did remove the content, it allegedly did not act quickly enough. Twitter recognizes that, regrettably, Plaintiff is not alone in suffering this kind of exploitation by such perpetrators on the Internet. For this reason, Twitter is deeply committed to combating child sexual exploitation (“CSE”) content on its platform. And while Twitter strives to prevent the proliferation of CSE, it is not infallible.

But, mistakes or delays do not make Twitter a knowing participant in a sex trafficking venture as Plaintiff here has alleged. Plaintiff does not (and cannot) allege, as he must, that Twitter ever had any actual connection to these Perpetrators or took any part in their crimes. Thus, even accepting all of Plaintiff’s allegations as true, there is no legal basis for holding Twitter liable for the Perpetrators’ despicable acts.


Without knowing the ins and outs of the federal law at issue in this case, it is hard to say whether a judge would grant Twitter’s motion. The company, however, appears to be arguing that its failure to identify and remove the video is protected under Section 230.

Still, it is worth noting that the plaintiff’s main point of contention does not seem to be that the company failed to find and delete the videos and images on its own. Rather, John Doe’s complaint centers on the allegation that once the pornographic material was brought to Twitter’s attention, the company refused to remove it.

Either way, if the allegations are true, Twitter’s reputation, already on shaky ground, could take a serious hit. Many may come away with the perception that the company is more concerned with suppressing conservative opinions on its platform than with rooting out genuinely harmful content.

Even the activist media would not be able to protect the company if it is proven that Twitter allowed child pornography to remain on its site after being informed of its existence. Regardless of what Twitter might want us to think, this issue isn’t going away anytime soon.
