Photo by Leon Neal/Getty Images
© 2024 Blaze Media LLC. All rights reserved.
Elon Musk's social media platform, X, formerly Twitter, plans to hire 100 content moderators as part of its effort to crack down on online child sexual abuse material, an X executive said Saturday, the New York Post reported.
A Friday blog post from X provided an update on the company's efforts to eradicate CSAM from its platform.
"At X, we have zero tolerance for Child Sexual Exploitation (CSE), and we are determined to make X inhospitable for actors who seek to exploit minors. In 2023, we made clear that our top priority was tackling CSE online," the blog post read.
X announced plans to build a "Trust and Safety center of excellence" in Austin, Texas, where it will hire more "in-house" content moderators to ensure CSAM is removed from the platform.
Joe Benarroch, X's head of business operations, stated that the center's "team is currently being built," the Post reported.
CEO Linda Yaccarino wrote in a recent post on X, "We will always do everything we can to keep children and minors safe. @X is an entirely new company, and over the past 14 months we've strengthened all our policies and enforcement to prevent bad actors from distributing or engaging with child sexual exploitation content."
The company aims to finish building the center and hiring its staff by the end of the year.
According to X's latest reporting, the platform suspended 12.4 million accounts last year "for violating our CSE policies."
"This is up from 2.3 million accounts in 2022," the company noted. "Along with taking action under our rules, we also work closely with [National Center for Missing & Exploited Children]. In 2023, X sent 850,000 reports to NCMEC, including our first ever fully-automated report, over eight times more than Twitter sent in 2022."
Before X introduced fully automated NCMEC CyberTipline reports, employees had to create and review each report manually.
In December, X ramped up its effort to combat CSAM by scanning all uploaded videos and GIFs, Blaze News previously reported.
"Not only are we detecting more bad actors faster, we're also building new defenses that proactively reduce the discoverability of posts that contain this type of content," a press release from the company read. "One such measure that we have recently implemented has reduced the number of successful searches for known Child Sexual Abuse Material (CSAM) patterns by over 99% since December 2022."
The United States Senate Judiciary Committee is slated to hold a hearing on January 31, where lawmakers will discuss social media companies' failure to combat online child sexual exploitation material. Senators will hear from Yaccarino, Meta CEO Mark Zuckerberg, TikTok CEO Shou Zi Chew, Snap CEO Evan Spiegel, and Discord CEO Jason Citron.
Candace Hathaway is a staff writer for Blaze News.