Apple plans to monitor iCloud images for illegal content

Apple is in the spotlight for its announcement that it will scan photos being uploaded to iCloud for instances of child abuse imagery. But what will it actually do? Is it the start of a new era in monitoring Apple accounts for security reasons? And what does it mean for how you use your iPhone? Here’s a quick guide on the most important bits.

What is Apple doing?

It has put in place a system to check whether photos being uploaded to iCloud from iPhones, iPads or Macs contain illegal child abuse imagery.

Does this mean it is scanning my iPhone?

No. Apple says that it only affects photo uploads to iCloud.

How does it work?

It looks for exact matches to known child abuse imagery that has already been registered as such by US authorities. It is not looking for new or potential instances of child abuse imagery, just for material that is already on file with the US National Center for Missing & Exploited Children. Apple has published documentation on its website describing the technical process involved.
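
For readers curious about the mechanics, the sketch below is a deliberately simplified illustration, in Swift, of what “matching against known imagery” means: compute a hash of a photo and check it against a database of hashes of already-identified images. Apple’s actual system uses its own perceptual “NeuralHash” and on-device cryptographic matching rather than a plain SHA-256 lookup, and the hash value and sample data here are invented for illustration only.

```swift
import CryptoKit
import Foundation

// Hypothetical database of hashes of already-known illegal images.
// In the real system this list is derived from NCMEC's database;
// the value below is a made-up placeholder.
let knownHashes: Set<String> = [
    "0f343b0931126a20f133d67c2b018a3b2c1e9f0a8f5d6e7c8b9a0d1e2f3a4b5c"
]

// Returns true only when the photo's hash exactly matches a known entry.
func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)                     // hash the raw image bytes
    let hex = digest.map { String(format: "%02x", $0) }.joined()  // digest as a hex string
    return knownHashes.contains(hex)
}

// Example: an ordinary photo that is not in the database produces no match.
let samplePhoto = Data("example image bytes".utf8)
print(matchesKnownImage(samplePhoto)) // false
```

The point of the exact-match approach is that only photos identical (by hash) to images already catalogued as abuse material are flagged; an ordinary family photo produces a different hash and is never matched.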

Do I need to give my consent for this to happen?

No. This is a feature that Apple has activated for all users.

Does it affect any other upload or photo services from iPhones or Macs?

No, just iCloud. It can be avoided by turning off iCloud photo uploads.

Why is Apple not scanning the iPhone as well as iCloud uploads?

Apple says that while it could build a system to scan your entire iPhone, it won’t. Doing that, it said, would cross a much more serious privacy line. By contrast, it justifies scanning iCloud uploads by saying that hosting illegal imagery on its own servers is a far more immediate responsibility for the company than selling a phone that its owner may use to take or store such an image.

What do privacy groups say about this?

They’re not happy. “Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products,” said a letter signed by thousands of US-based cryptographers, security experts and privacy advocates. Others, such as the US Electronic Frontier Foundation, worry that it sets a precedent that opens the door for governments to require Apple to give access to people’s phones and online accounts for security reasons.

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” said the US Electronic Frontier Foundation. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses. That’s not a slippery slope, that’s a fully built system just waiting for external pressure to make the slightest change.”

What do child protection groups say about this?

They’re much more supportive. “We welcome the commitment by Apple to scan all photos uploaded to the iCloud in a bid to combat the collection and distribution of child sexual abuse material,” said Alex Cooney, chief executive of CyberSafeKids, one of the state’s largest child protection organisations. “As a society, we are going to have to find a balance between safety and privacy and this move by Apple seems to be really trying to get that balance right.”

Is this the start of Apple scanning your iPhone for things the government wants it to look for?

Apple says it has no intention of doing this, pointing out that it has previously resisted serious attempts by bodies such as the FBI and the US and UK governments to force it to give ‘back door’ access to iPhones on security grounds.

But won’t countries like China now order Apple to give it access by law?

That’s a possibility, although Apple points out that the scanning service is initially rolling out in the US, with no plans yet announced to expand it to China or other countries. Some IT security experts argue that the Chinese authorities already have significant access to data attached to iPhones and other online accounts because of the cloud controls they already exert in that country.

Is there a chance that Apple might make a mistake during this process, wrongly accusing someone of harbouring child abuse imagery?

The company says that its systems have an error rate of around one in a trillion. Apple will also employ human reviewers to double-check any image that is flagged.

Will those human reviewers be staff or contractors and will any of them be located in Cork?

Apple hasn’t yet given any detail about that.

When does this start rolling out?

Apple says that while it will be part of iOS 15, it may not be introduced immediately and could instead arrive in a later update.
