There are two questions the average person will have about Apple’s new move to scan iPhones for illegal images.
Is this the start of a new Big Brother era? And is Apple out of step with mainstream public and civic opinion?
1. The Big Brother threat
The main fear around this Apple move is that our phones will now start scanning our stuff to make sure we’re good citizens.
While this seems a little far-fetched, you can understand why the question arises. What Apple has done here is fundamentally different to other kinds of safety scanning from the likes of Google and Facebook. Instead of checking material that has been posted online, as sometimes happens, this method reaches down into the phone itself. Sure, there is strong encryption to protect against your general content being parsed. But it seems like a new line has been crossed, and it raises questions about where it might lead in future. Will it soon do the same for terrorist content? Will different countries now demand different security checks be performed on phones?
To be clear, this system only kicks in if you’re using iCloud to upload photos. So Apple won’t activate it on your phone if you turn this upload feature off. And it’s a US-only roll-out for now.
But as many have pointed out over the last week, these are merely policy decisions on Apple’s part. It is showing the world that it can and will scan your iPhone for the right security reason, even if its encryption standards are substantial.
Is it likely that Apple, over time, will expand the reasons why it will scan your iPhone for unlawful activity?
This is difficult to assess because it relates closely to evolving public, political and social mores.
Apple has spent the last week talking at length to journalists and policymakers about its overall commitment to privacy and why it is antithetical to its ethos to trade away or give up its customers' private information. It has fairly pointed out that it alone resisted threats and political campaigns from the FBI and the US and UK governments, which wanted access to terrorists' iPhones. Tim Cook has even gone as far as to suggest that Apple would exit a market if it were forced to provide authorities with a backdoor into systems such as iMessage. So the company is saying that this is not the start of some new era of snooping, or of doing the police's work for them.
Can we believe this? That’s not for me to say, other than to point out that the company, in general, has a decent track record on the issue in western countries (though not as staunchly in China, where it has complied with a Chinese government request to have data hosted on Chinese servers).
And there’s another element: Apple cannot predict what society will demand of it in years to come. It’s not hard to imagine that a spate of particularly abhorrent terrorist attacks, for example, would lead to more pressure on companies like Apple to find some way, perhaps using its new-found encrypted scanning method, to root out images, files or other kinds of truly harmful content.
2. Is Apple going against mainstream public or political societal opinion?
No. This question seems a little easier to answer. Despite the important privacy and Big Brother issues raised above, Apple’s move seems very unlikely to lead to any meaningful pushback from its own customers, regulators or anyone else.
Yes, a protest letter was signed by thousands of privacy and security professionals criticising this move. Yes, privacy organisations are concerned. And as I’ve already argued, there are real reasons to keep a close eye on how this develops.
But child abuse imagery is an issue that transcends a lot of the usual debates around safety versus privacy. I’m not saying Apple’s move to reach onto your device for the purpose of (encrypted) scans isn’t concerning. Merely that the majority of the public, and of civic leadership in western countries, is unlikely to think this is a step too far in the context of what it is trying to remedy.
Ultimately, Apple is not separate from society. Regardless of what its principles on privacy or security are, it will eventually adjust them to meet the mainstream majority view. This move probably reflects that reality.
Tech companies face societal demands to give up more information for safety and security reasons all the time. In most cases, privacy considerations stop them from complying. (In some instances, it is actually regulators or laws in the EU that do so.)
Apple thinks it has found a way to steer a middle ground. For now, the public probably isn’t too upset about it. Time will tell whether this remains the case.