Over 20 million online child sexual abuse material incidents reported on Facebook, by far the most of all platforms

Facebook unveils new tools to combat child sexual abuse material

A new report from the National Center for Missing and Exploited Children said that the vast majority of reported online child sexual abuse material came from Facebook.

The NCMEC's annual report for 2020 said the organization's CyberTipline received more than 21.7 million reports of online child exploitation, 21.4 million of which came from electronic service providers. There were 20,307,216 reported incidents related to child pornography or trafficking on Facebook, including Instagram and WhatsApp, which the social media behemoth owns.

For comparison, Google reported 546,704 incidents of CSAM, Snapchat found 144,095, Microsoft had 96,776, Twitter cited 65,062, TikTok had 22,692, and Reddit reported 2,233 instances of apparent child sexual abuse material.

MindGeek, the Canada-based parent company of several adult content websites, reported far fewer incidents. MindGeek, which owns Pornhub, YouPorn, RedTube, and Brazzers, reported 13,229 instances of child sexual abuse material last year.

The Internet Watch Foundation, which helps "victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse," claimed it found 118 incidents of videos containing child sexual abuse or rape on Pornhub between 2017 and 2019.

In December, Pornhub faced scrutiny after the New York Times published multiple allegations of sexual exploitation on the adult content website.

The streaming behemoth, which netted 3.5 billion visits per month in 2019, introduced new guidelines in December to protect against underage porn being uploaded on the site.

"Going forward, we will only allow properly identified users to upload content," Pornhub said in a statement. "We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit organizations."

Pornhub also noted that it voluntarily registered as an electronic service provider for the National Center for Missing and Exploited Children's data collection.

Regarding Facebook accounting for the overwhelming majority of alleged CSAM incidents, the National Center for Missing and Exploited Children stated:

Higher numbers of reports can be indicative of a variety of things including larger numbers of users on a platform or how robust an ESP's efforts are to identify and remove abusive content. NCMEC applauds ESPs that make identifying and reporting this content a priority and encourages all companies to increase their reporting to NCMEC. These reports are critical to helping remove children from harmful situations and to stopping further victimization.

As of April 2020, Facebook was the most popular social media platform with nearly 2.5 billion active users.

The NCMEC said reports to the CyberTipline increased by 28% from 2019.

"The 21.7 million reports of child sexual exploitation made to the CyberTipline in 2020 included 65.4 million images, videos and other files," the NCMEC said. "These materials contained suspected child sexual abuse material (CSAM) and other incident related content."

Reports to the CyberTipline by the public more than doubled in 2020.

The numbers from NCMEC are reported instances and are not confirmed cases of abuse.

The NCMEC's CyberTipline is a "centralized reporting system for the online exploitation of children," where the public and ESPs "can make reports of suspected online enticement of children for sexual acts, extra-familial child sexual molestation, child pornography, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet."

Ahead of the NCMEC's report, Facebook announced on Tuesday that it was introducing new measures to prevent "people from sharing content that victimizes children," as well as improvements to its detection and reporting of inappropriate content.

"To understand how and why people share child exploitative content on Facebook and Instagram, we conducted an in-depth analysis of the illegal child exploitative content we reported to the National Center for Missing and Exploited Children (NCMEC) in October and November of 2020," Facebook said in a statement.

"We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period," the social media network stated. "While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many."

Facebook reported 150 accounts to the NCMEC for "uploading child exploitative content in July and August of 2020 and January 2021," and found that over 75% of these users "did not exhibit malicious intent." "Instead, they appeared to share for other reasons, such as outrage or in poor humor," the company said.

Facebook will now display a "pop-up" whenever users search for terms associated with child exploitation. There will also be a "safety alert that informs people who have shared viral, meme child exploitative content about the harm it can cause and warns that it is against our policies and there are legal consequences for sharing this material."

Facebook said accounts that share and promote CSAM would be removed.

"Using our apps to harm children is abhorrent and unacceptable," Facebook's news release read. "Our industry-leading efforts to combat child exploitation focus on preventing abuse, detecting and reporting content that violates our policies, and working with experts and authorities to keep children safe."
