Left-leaning Twitter rival Mastodon has child sexual abuse material problem, researchers find

Mastodon, a Twitter rival, has a massive child sexual abuse material (CSAM) problem, according to a July study released by Stanford researchers.

The "decentralized" social media platform has recently been growing in popularity among left-leaning individuals looking to ditch Elon Musk's Twitter, now called "X." According to TechCrunch, Mastodon's active users have increased to 2.1 million over the past couple of months.

After Musk took over Twitter, Mastodon's monthly active users initially shot up to 2.5 million but fell back to 1.7 million shortly afterward.

On Friday, Musk shared Twitter's monthly active user count, noting that it reached "a new high" of over 540 million.

Researchers at the Stanford Internet Observatory conducted a two-day test of Mastodon's content and found over 600 pieces of known or suspected child abuse material. The matches were reported to the National Center for Missing and Exploited Children.

Just five minutes into their review of 325,000 posts, the researchers found the first instance of known CSAM. The team also discovered 2,000 uses of CSAM-related hashtags. Researchers reported that most of the child abuse material came from networks in Japan, which has "significantly more lax laws" related to the content.

Stanford's report, "Child Safety on Federated Social Media," explained that decentralized platforms such as Mastodon "offer new challenges for trust and safety" because they lack a "central moderation team" dedicated to "removing imagery of violence or self-harm, child abuse, hate speech, terrorist propaganda or disinformation."

"While Mastodon allows user reports and has moderator tools to review them, it has no built-in mechanism to report CSAM to the relevant child safety organizations," the study stated. "Bad actors tend to go to the platform with the most lax moderation and enforcement policies."

The Washington Post reported that David Thiel, one of the study's researchers, said, "We got more PhotoDNA hits in a two-day period than we've probably had in the entire history of our organization of doing any kind of social media analysis, and it's not even close."

"A lot of it is just a result of what seems to be a lack of tooling that centralized social media platforms use to address child safety concerns," Thiel added.

Musk, who has prioritized removing CSAM from Twitter, responded to the study's findings by stating, "We kicked them off this platform, so they went elsewhere."

Mastodon did not reply to a request for comment, the Washington Post reported.


Candace Hathaway

Candace Hathaway is a staff writer for Blaze News.
@candace_phx