Pedophiles are trying to use AI to create child sex abuse material on the dark web

An internet watchdog is speaking out about the alarming trend of pedophiles working together online to use open-source artificial intelligence to create child sexual abuse material.

Dan Sexton, the chief technology officer at the Internet Watch Foundation, said that there is a "technical community within the offender space, particularly dark web forums, where they are discussing this technology," adding that these individuals are "sharing imagery, they're sharing [AI] models. They're sharing guides and tips."

The Guardian reported earlier this month that the warning came after Ian Hogarth, chair of the U.K. government's AI task force, expressed concerns about child sex abuse material and how open-source models have been used to make "some of the most heinous things out there."

Open-source technology can be downloaded and modified by users. This is not the case with OpenAI's DALL-E or Google's Imagen, which cannot be accessed or changed by members of the general public, according to the report.

Sexton has suggested that pedophiles who are interested in child sex abuse material have taken to the dark web to create and distribute realistic imagery.

"The content that we’ve seen, we believe is actually being generated using open-source software, which has been downloaded and run locally on people’s computers and then modified. And that is a much harder problem to fix," Sexton said. "It’s been taught what child sexual abuse material is, and it’s been taught how to create it."

He noted that the discussions taking place in the dark corners of the internet involve publicly available images of children, including images of celebrity children. In some cases, images of victims of child sexual abuse are even used as the foundation to create new content, according to Fox News.

"All of these ideas are concerns, and we have seen discussions about them," Sexton added.

A major concern about the creation of AI child sex abuse material is that it could expose a larger group of people to the imagery. Christopher Alexander, the chief analytics officer of Pioneer Development Group, said that AI could also be used to help find missing people, potentially by generating "age progressions and other factors that could help locate trafficked children."

"So, generative AI is a problem, AI and machine learning is a tool to combat it, even just by doing detection," Alexander said.

There have been calls for the government to clamp down on the development of such technology before it gets out of hand.

Jonathan D. Askonas, an assistant professor of politics and a fellow at the Center for the Study of Statesmanship at the Catholic University of America, said that "lawmakers need to act now to bolster laws against the production, distribution, and possession of AI-based CSAM [child sex abuse material], and to close loopholes from the previous era."

While the IWF actively searches the web for child sex abuse material and aids in its removal, it could quickly find itself overwhelmed as tips about AI-generated material come in. Sexton said such material is already prevalent across the digital world.

"Child sexual abuse online is already, as we believe, a public health epidemic," Sexton said. "So, this is not going to make the problem any better. It’s only going to potentially make it worse."

Video: IWF CTO Dan Sexton on how Social Impact Funding from Nominet allows the foundation to develop new tech (www.youtube.com)
