Earlier in August, Apple unveiled a controversial plan to scan user photos for child abuse images. Now, the Electronic Frontier Foundation is fighting back with a petition addressed to Apple.

The update will see user images scanned for Child Sexual Abuse Material (CSAM) on-device, with each photo’s hash matched against a database of known CSAM image hashes.
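In very rough terms, that matching step is a fingerprint lookup. The Swift sketch below illustrates the idea only: Apple’s actual system uses a perceptual “NeuralHash” and a blinded on-device database rather than a plain cryptographic hash, and every name here is hypothetical.

```swift
import Foundation
import CryptoKit

// Simplified illustration only. Apple's real pipeline uses a perceptual
// NeuralHash (tolerant of resizing and recompression) plus cryptographic
// blinding, so the device never sees the raw hash list. A plain SHA-256
// set lookup is shown purely to convey the shape of the check.
// `knownImageHashes` is a hypothetical stand-in for the on-device database.
let knownImageHashes: Set<String> = []

func imageFingerprint(_ imageData: Data) -> String {
    // Stand-in for the perceptual hashing step.
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownImageHashes.contains(imageFingerprint(imageData))
}
```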

If a match is found, Apple will create a cryptographic safety voucher and upload it to the user’s iCloud account alongside the image. Once an account crosses a threshold of matching vouchers, Apple can review the flagged images; confirmed matches result in the account being suspended and a report to the National Center for Missing and Exploited Children (NCMEC), which can then alert US law enforcement agencies. 
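The voucher mechanism is the part that gates Apple’s access. Below is a heavily simplified Swift sketch of the threshold logic; `SafetyVoucher` and both of its fields are invented names, and in the real design threshold secret sharing means Apple cannot read any individual match result until the threshold (reported to be around 30 matches) is crossed.

```swift
import Foundation

// Heavily simplified sketch: a plain counter stands in for threshold
// secret sharing. In Apple's actual design the match bit is encrypted
// inside the voucher and is mathematically unreadable below the threshold.
struct SafetyVoucher {
    let isMatch: Bool             // hypothetical; hidden from Apple in practice
    let encryptedDerivative: Data // visual derivative, reviewable only past the threshold
}

func accountCrossesThreshold(_ vouchers: [SafetyVoucher], threshold: Int) -> Bool {
    // Only once this returns true can the flagged images be human-reviewed.
    vouchers.filter(\.isMatch).count >= threshold
}
```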

Apple is also rolling out safety tools in iMessage that will detect if a sexually explicit image has been sent to a child. iMessage will then blur the image and warn the child before asking if they still want to view it. 

If a parent opts into certain parental settings, they’ll also be alerted if the child chooses to view the image. The same process applies if a child attempts to send an explicit image. 
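Put together, the iMessage flow amounts to: classify on-device, blur, confirm with the child, and alert an opted-in parent. The Swift sketch below is a guess at the shape of that logic; Apple has published no API for this feature, so every function name here is invented.

```swift
import Foundation

// All names below are hypothetical stubs; Apple has not published this API.
func looksExplicit(_ image: Data) -> Bool { false }  // stand-in for the on-device classifier
func show(_ image: Data) {}                          // display the image normally
func showBlurred(_ image: Data) {}                   // display the image blurred
func confirmWithChild(_ message: String, completion: (Bool) -> Void) { completion(false) }
func notifyParent() {}

func handleIncomingImage(_ image: Data,
                         recipientIsChild: Bool,
                         parentAlertsEnabled: Bool) {
    guard recipientIsChild, looksExplicit(image) else {
        show(image)
        return
    }
    // Blur first, then ask the child whether they still want to view it.
    showBlurred(image)
    confirmWithChild("This photo may be sensitive. View anyway?") { wantsToView in
        guard wantsToView else { return }
        show(image)
        // Parents are alerted only if they opted into the setting.
        if parentAlertsEnabled { notifyParent() }
    }
}
```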

The update has been met with criticism from privacy advocates and rivals alike, with WhatsApp head Will Cathcart calling it an “Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.” 

Now, the Electronic Frontier Foundation (EFF) – a non-profit organisation dedicated to defending civil liberties in the digital world – has started a petition urging Apple not to scan phones. 


“Apple has abandoned its once-famous commitment to security and privacy,” writes EFF in the description of the petition. “The next version of iOS will contain software that scans users’ photos and messages. Under pressure from U.S. law enforcement, Apple has put a backdoor into their encryption system.”

EFF also warns that Apple could be pressured into expanding the system to search for additional types of content. 

“The system will endanger children, not protect them—especially LGBTQ kids and children in abusive homes. Countries around the world would love to scan for and report matches with their own database of censored material, which could lead to disastrous results, especially for regimes that already track activists and censor online content.”

Trusted Reviews has reached out to both EFF and Apple for comment.