
EU considers privacy-busting regulations

The European Union (EU) is considering a regulation that would force all digital communication platforms to scan for child sexual abuse material (CSAM). Many large tech companies, such as Apple, already conduct CSAM scanning voluntarily (and controversially), but the EU would make it mandatory for every service operating within its borders.

In a blow to user privacy, the new rule would also cover services that currently employ end-to-end encryption (E2EE). E2EE, sometimes marketed as zero-knowledge encryption, prevents anyone other than the intended participants from deciphering the contents of their data. This technology keeps not only the companies running a service but also law enforcement from viewing users' communications and files. Mandatory CSAM scanning would effectively outlaw this form of encryption.
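
To illustrate the property at stake, here is a minimal sketch of public-key E2EE using the PyNaCl library. The key names and message are hypothetical, and real messaging apps layer key exchange, ratcheting, and identity verification on top of primitives like these; the point is simply that the relay server never holds a key capable of decryption.

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair; the private half never leaves their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# Any server relaying `ciphertext` sees only random-looking bytes.
# Only Bob (or Alice) can recover the plaintext, because the shared
# secret is derived from private keys the server never sees.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

Mandatory scanning is incompatible with this design: to inspect content, the provider must either hold a decryption key or examine messages on the device before encryption, and either option amounts to a backdoor.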

No privacy for the innocent

I want to be clear that I absolutely abhor any form of child abuse, sexual or otherwise, and strongly believe that anyone trafficking in related material should be tracked down and brought to justice. However, the reality of CSAM scanning is that it subjects the broader populace to a form of mass surveillance ripe for abuse while proving largely ineffective at rooting out perpetrators.

Cybercriminals, particularly those trading in CSAM, are careful to keep their illegal activity off mainstream, unencrypted platforms. If zero-knowledge encryption is outlawed by CSAM laws, innocent users will be stripped of this privacy protection, while cybercriminals, having already demonstrated their disregard for the law, will continue to operate on anonymous encrypted platforms.

CSAM scanning is also imperfect, producing false positives that require manual review. This review process exposes people's private, and sometimes intimate, images to company employees and members of law enforcement.

To make matters worse, the manual review process isn't infallible either. Last August, the New York Times highlighted two separate cases in which fathers were permanently locked out of their Google accounts after taking pictures of infections on their toddlers' genitals at the request of medical personnel. Even after police reviewed the images and determined they were not CSAM, Google refused to reinstate the accounts, disrupting the lives of the men and their families, who had depended heavily on them.

Apple’s CSAM controversy

In August 2021, Apple introduced a plan to scan users' iCloud Photos data for CSAM. The Cupertino-based company, which publicly prides itself on respecting user privacy, designed its system to match perceptual hashes of users' images (generated by its NeuralHash algorithm) against a database of hashes of known CSAM, rather than directly analyzing image content. But the plan was too clever by half: hash matching still amounts to a scan of users' private photos, and the same machinery could later be pointed at any other content that the company or law enforcement deems unacceptable, for whatever reason.
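
The basic mechanism is easy to sketch. The toy example below uses an ordinary cryptographic hash (SHA-256) and an in-memory blocklist for clarity; Apple's actual system used a perceptual hash so that resized or re-encoded copies of an image would still match, plus cryptographic blinding so matches were only revealed server-side. The paths, function names, and digests here are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical digests of known illegal images, shipped by the provider.
# A real list would hold full 64-character hex digests.
KNOWN_BAD_HASHES: set[str] = {"<placeholder digest>"}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(photo_dir: Path) -> list[Path]:
    """Flag every file whose digest appears in the known-bad set."""
    return [
        p for p in sorted(photo_dir.iterdir())
        if p.is_file() and file_digest(p) in KNOWN_BAD_HASHES
    ]
```

The objection survives the cryptographic details: whoever controls the hash list controls what gets flagged, and nothing in the design restricts that list to CSAM.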

After significant pushback, Apple delayed the iCloud Photos rollout before finally killing the plan altogether late last year. Nonetheless, despite this win for user privacy and data sovereignty, CSAM scanning continues apace on other platforms, including Apple's iCloud Mail service, as well as messaging and file storage services offered by Google, Microsoft, and Meta.

When Apple announced in December that it would implement E2EE for iCloud Backup and Photos data, an FBI spokeswoman denounced "end-to-end and user-only-access encryption" as a threat, stating that it "hinders [the FBI's] ability to protect the American people from criminal acts ranging from cyberattacks and violence against children to drug trafficking, organized crime and terrorism."

The FBI instead advocates for "lawful access by design," which is a way of saying the FBI wants backdoors into communication and file-sharing services so it can monitor people's activity. If the proposed EU regulation goes into effect, it would require backdoors to be built into all digital communications and cloud storage platforms operating in Europe to facilitate CSAM scanning. These backdoors likely wouldn't remain limited to special European versions of the affected services either.

A potential threat to open-source software

Beyond the imposition of CSAM scanning on all communication platforms, the proposed EU regulation includes a requirement (Article 5) that all providers of software application stores identify users' ages to prevent children from installing software that is deemed at risk of facilitating child sexual abuse. This requirement has led to concern that the regulation would effectively outlaw open-source software repositories and app stores, which exist to freely serve software to users without requiring any form of user identification.

The proposed regulation refers to Article 2 of another regulation for a definition of "software application stores." The listed definition is "a type of online intermediation services, which is focused on software applications as the intermediated product or service." This seems sufficiently broad to include open-source software repositories and app stores, such as Flathub and F-Droid.

However, the age verification requirement may not apply to most open-source repositories, as EU law defines "services" as those "normally provided for remuneration." Since these repositories provide software for free, one hopes they would not be considered services subject to the proposed regulation's requirements. That said, it isn't clear whether the requirements would apply to open-source code hosting platforms like GitHub and GitLab, which offer subscription plans for premium features. Many open-source app stores also pull from these code hosting platforms, so would age verification requirements extend downstream to those app stores as well?

Regardless of whether the proposed regulation proves a definite nail in the coffin for open-source software repositories and app stores, in its current form it sows confusion. Who will be subject to its requirements, and how they will be enforced, remains murky, which doesn't bode well when the stakes are this high. Meanwhile, the regulation makes clear that all digital communications and cloud storage services will be required to open backdoors for CSAM scanning. Once those backdoors are in place, governments may be unable to resist using them for further surveillance.
