© 2024 Blaze Media LLC. All rights reserved.
Apple will remotely install software to scan all US phones for child sex abuse images
Photo by Justin Sullivan/Getty Images

U.S. tech giant Apple plans to roll out a remote update that will scan Americans' iPhones for child sexual abuse images, Insider reported.

If such images are found, Insider's Heather Schlitz wrote, citing a Thursday report from the Financial Times, "Human reviewers would then alert law enforcement if they think the images are illegal."

What are the details?

The Steve Jobs-co-founded technology company plans to roll out software later this year to detect child pornography images in an effort to tamp down child sex abuse.

"The software, reportedly called neuralMatch, is designed to look through images that have been stored on iPhones and uploaded to iCloud storage," Schlitz wrote. "According to the Financial Times, if the software detects child sexual abuse in a photo, it will then pass the material on to human reviewers who will alert law enforcement if they think the images are illegal."

Schlitz added, "Researchers told the Financial Times that Apple's decision could pressure other companies into implementing similar kinds of monitoring and could later expand into monitoring of images beyond child sexual abuse, like anti-government signs held at protests."

The Associated Press reported Thursday that the software will reportedly flag the potential match, prompting Apple to disable the user's account. The content — and user — will then reportedly be turned over to the National Center for Missing and Exploited Children.
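The reports describe a hash-matching workflow: an image's fingerprint is compared against a database of known material, and a match only flags the content for human review rather than triggering an automatic report. A minimal sketch of that idea follows; note that the names here are illustrative, and the reported neuralMatch system would use perceptual hashes (robust to resizing and re-encoding) rather than the simple cryptographic hash shown:

```python
import hashlib

# Hypothetical database of digests of known images (in the reported
# system, such a list would come from an organization like NCMEC).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def scan_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest matches a known entry.

    A True result only flags the image for human review; it does not
    by itself constitute a report to law enforcement.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

flagged = scan_image(b"example-known-image-bytes")  # matches the database
clean = scan_image(b"some-unrelated-photo-bytes")   # no match
```

The design choice the experts quoted below object to is not the matching step itself but who controls the hash list: the same mechanism works unchanged for any set of digests a government might supply.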

"Separately," the AP report continued, "Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates."

The AP reported that Apple said the latest changes "will roll out this year as part of updates to its operating software for iPhones, Macs, and Apple Watches."

What else?

Security experts, however, have warned that such a move could "open the floodgates to extensive surveillance."

Matthew Green, a cryptographer at Johns Hopkins University, said that the technology could be abused.

"Regardless of what Apple's long term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content," Green said, according to the BBC. "Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone."

According to the Associated Press, Green added, "What happens when the Chinese government says, 'Here is a list of files that we want you to scan for.' Does Apple say no? I hope they say no, but their technology won't say no."

In a statement, the online civil liberties group Electronic Frontier Foundation called Apple's apparent pivot on privacy protections a "shocking about-face for users who have relied on the company's leadership in privacy and security."

John Clark, president and CEO of the National Center for Missing and Exploited Children, told The Associated Press that the company's plan is a "game changer" in protecting children.

"With so many people using Apple products, these new safety measures have a lifesaving potential for children," Clark added.
