Apple hasn’t answered the most important question about its AI features

Apple Intelligence features. (Image: Apple)

During the debut of Apple Intelligence at WWDC 2024 yesterday, Senior Vice President of Software Engineering Craig Federighi repeatedly touted the new features’ security and careful handling of sensitive user data. To protect user privacy, Apple Intelligence performs many of its generative operations on-device, and for requests that exceed its onboard capabilities, the system hands the work off to the company’s newly developed Private Cloud Compute (PCC).

However, as Dr. Matthew Green, associate professor of Computer Science at Johns Hopkins University in Baltimore, asked in a thread Monday, Apple’s AI cloud may be secure, but is it trustworthy?

Apple Intelligence promises to empower your iPhone, iPad, and Mac with cutting-edge generative models that can create images, edit your writing, and perform actions on your behalf across any number of apps. Apple devices have already performed machine learning tasks on-device for a number of years; take the camera roll’s search function and optical character recognition (OCR), for example. However, as Green points out, even the latest generation of Apple processors isn’t yet powerful enough to handle some of the more complex and resource-intensive AI operations coming online, necessitating the use of cloud compute servers.
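To make that split concrete, here is a minimal, purely illustrative Swift sketch of how a request might be routed between the local model and PCC. None of these types exist in Apple’s SDKs; ExecutionTarget, IntelligenceRequest, and the token budget are assumptions invented for the example.

import Foundation

// Hypothetical router: keep a request on-device when the local model can
// handle it, hand it off to Private Cloud Compute (PCC) otherwise.
enum ExecutionTarget {
    case onDevice       // data never leaves the device
    case privateCloud   // offloaded to a PCC node for larger models
}

struct IntelligenceRequest {
    let prompt: String
    let estimatedTokens: Int
    let needsLargeModel: Bool   // e.g. long-document summarization, complex image generation
}

func route(_ request: IntelligenceRequest, onDeviceTokenLimit: Int = 2_048) -> ExecutionTarget {
    // Stay on-device whenever the request fits the local model's budget.
    if !request.needsLargeModel && request.estimatedTokens <= onDeviceTokenLimit {
        return .onDevice
    }
    // Otherwise the work exceeds onboard capability and goes to PCC.
    return .privateCloud
}

// A short rewrite stays local; a long document summary is offloaded.
let rewrite = IntelligenceRequest(prompt: "Make this friendlier", estimatedTokens: 120, needsLargeModel: false)
let summary = IntelligenceRequest(prompt: "Summarize this 40-page report", estimatedTokens: 30_000, needsLargeModel: true)
print(route(rewrite))   // onDevice
print(route(summary))   // privateCloud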

The problem is that these complex models also need unencrypted access to the user’s data in order to perform inference, and transmitting that data to a public cloud leaves it at risk of being hacked, leaked, or outright stolen. Apple’s solution was to build its own standalone, hardened data centers specifically for processing Apple Intelligence data: the PCC. This “groundbreaking cloud intelligence system,” according to a recent Apple Security Blog post, “extends the industry-leading security and privacy of Apple devices into the cloud, making sure that personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple.”
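That privacy claim hinges on the request being unreadable to everyone except the node that processes it. A rough way to picture the idea is end-to-end encryption to a per-node key, sketched below in Swift using Apple’s CryptoKit; the key-agreement scheme, the HKDF labels, and the assumption that the device already holds a verified node public key are all illustrative choices, not Apple’s actual PCC protocol.

import CryptoKit
import Foundation

// Illustrative only: the device seals a request so that only the specific PCC
// node holding the matching private key can open it. Anyone in between,
// including the cloud operator, sees only ciphertext.
struct EncryptedRequest {
    let ephemeralPublicKey: Data
    let ciphertext: Data
}

func seal(_ plaintext: Data, for nodePublicKey: Curve25519.KeyAgreement.PublicKey) throws -> EncryptedRequest {
    // Fresh ephemeral key per request, so recorded traffic can't be decrypted later.
    let ephemeral = Curve25519.KeyAgreement.PrivateKey()
    let sharedSecret = try ephemeral.sharedSecretFromKeyAgreement(with: nodePublicKey)

    // Derive a symmetric key from the shared secret (HKDF parameters are made up for the example).
    let symmetricKey = sharedSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self,
        salt: Data(),
        sharedInfo: Data("pcc-request-example".utf8),
        outputByteCount: 32
    )

    let sealedBox = try ChaChaPoly.seal(plaintext, using: symmetricKey)
    return EncryptedRequest(
        ephemeralPublicKey: ephemeral.publicKey.rawRepresentation,
        ciphertext: sealedBox.combined
    )
}

In a scheme like this, an attacker who intercepts the request on the way to the data center learns nothing useful; only the node holding the matching private key can recover the user’s prompt and data.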

But ensuring that privacy is much harder in practice. “Building trustworthy computers is literally the hardest problem in computer security,” Green wrote. “Honestly, it’s almost the only problem in computer security.” He commended the company for applying many of the same security features built into its mobile and desktop devices to its new servers, including Secure Boot, “stateless” software, and a Secure Enclave Processor (SEP), as well as “throwing all kinds of processes at the server hardware to make sure the hardware isn’t tampered with.”

Apple has also gone to great lengths to ensure that the software running on its servers is legitimate and that user data doesn’t linger: each PCC node automatically wipes all user data as soon as a request has been completed, and its operating system can “attest” to exactly which software image it’s running.
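That attestation step matters because it gives the client something concrete to check before releasing any data. The Swift sketch below shows the gist under heavy assumptions: the NodeAttestation type, the single software-measurement hash, and the source of the published measurements are all invented for illustration, not Apple’s actual mechanism.

import CryptoKit
import Foundation

// Illustrative only: refuse to send data unless the node claims to run a
// software build whose measurement appears in a public record.
struct NodeAttestation {
    let softwareMeasurement: Data   // hash of the OS image the node claims to run
}

func shouldSendData(to attestation: NodeAttestation, publishedMeasurements: Set<Data>) -> Bool {
    // Unknown builds, including silently modified images, are rejected.
    return publishedMeasurements.contains(attestation.softwareMeasurement)
}

// A node running a published build is accepted; an unknown build is not.
let knownBuild = Data(SHA256.hash(data: Data("pcc-release-example".utf8)))
let unknownBuild = Data(SHA256.hash(data: Data("tampered-image-example".utf8)))
let published: Set<Data> = [knownBuild]
print(shouldSendData(to: NodeAttestation(softwareMeasurement: knownBuild), publishedMeasurements: published))   // true
print(shouldSendData(to: NodeAttestation(softwareMeasurement: unknownBuild), publishedMeasurements: published)) // false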

“If you gave an excellent team a huge pile of money and told them to build the best ‘private’ cloud in the world, it would probably look like this,” Green wrote. “But now the tough questions. Is it a good idea? Is it as secure as what Apple does today?” And can users opt out? It doesn’t appear that users will even be asked to opt in to the new service. “You won’t necessarily even be told it’s happening,” he continued. “It will just happen. Magically. I don’t love that part.”

Green goes on to argue that hackers don’t even pose the biggest threat to user data: it’s the hardware and software companies themselves. As such, “this PCC system represents a real commitment by Apple not to ‘peek’ at your data,” Green concluded. “That’s a big deal.”

Apple Intelligence will reportedly begin rolling out later this summer. And in the coming weeks, Apple plans to invite security researchers for a first look at PCC software and the virtual research environment.

Andrew Tarantola