There’s a new divide coming for our smartphones, one that could completely change how we view the iPhone versus Android debate, a rivalry that has built a global duopoly, notwithstanding Huawei’s current push to add a third seat to that table.
Google and Samsung have powered ahead of Apple when it comes to smartphone AI, with the latest Pixel 9 pitched as almost AI in a box and Galaxy AI dominating Samsung’s marketing messages and device updates. But there’s a serious security and privacy issue that has not yet garnered the attention it warrants; it will. This looks set to become the next iteration of “what happens on your iPhone stays on your iPhone,” adapted for a new era in which that is no longer true at all.
Apple is now heralding “groundbreaking privacy and security protections” as it launches Apple Intelligence, which, just like its Android equivalents, will run out of on-device juice and need to push some of the processing to the cloud. The answer it has come up with is Private Cloud Compute (PCC). This ensures, Apple says, that “personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple… we believe PCC is the most advanced security architecture ever deployed for cloud AI compute at scale.”
That isn’t to say that Google’s and Samsung’s cloud AI is inherently insecure. But a hybrid model that restricts sensitive processing to the device is not the same as extending the device’s secure enclave into the cloud, relying on Apple’s own silicon on both sides to ensure the integrity of the setup.
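To make that architectural distinction concrete, here is a minimal conceptual sketch in Swift. This is not Apple’s published API; every type and threshold here (AIRequest, NodeAttestation, Route, onDeviceBudget) is a hypothetical illustration of the routing-and-attestation gate a PCC-style design implies.

```swift
import Foundation

// Hypothetical request: a prompt plus a rough estimate of the compute it needs.
struct AIRequest {
    let prompt: String
    let estimatedCost: Int
}

// Hypothetical attestation a cloud node presents before receiving any data.
struct NodeAttestation {
    let measurementsMatchPublishedBuild: Bool  // the "verifiable transparency" claim
    let runsOnAppleSilicon: Bool               // Apple silicon on both sides
}

enum Route {
    case onDevice
    case privateCloud(encryptedPayload: Data)
    case refused
}

// Illustrative ceiling for what the local model can handle.
let onDeviceBudget = 100

func route(_ request: AIRequest,
           attestation: NodeAttestation,
           encrypt: (String) -> Data) -> Route {
    // 1. Prefer on-device processing whenever the local model suffices.
    if request.estimatedCost <= onDeviceBudget {
        return .onDevice
    }
    // 2. Overflow to the cloud only if the node proves it is running a
    //    publicly inspectable build on trusted hardware.
    guard attestation.measurementsMatchPublishedBuild,
          attestation.runsOnAppleSilicon else {
        return .refused  // fail closed: never send data to an unverified node
    }
    // 3. The payload is encrypted to that attested node, so the operator
    //    cannot read it in transit or at rest.
    return .privateCloud(encryptedPayload: encrypt(request.prompt))
}
```

The design choice worth noticing is the fail-closed guard: in this model an overflow request that cannot verify the receiving node is refused outright, rather than quietly sent to a generic server, which is the gap the hybrid device-only approach leaves open.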
Apple promised from the start that it would enable independent verification of its claims on an ongoing basis, and it has now done exactly that. “Today we’re making resources publicly available to invite all security and privacy researchers — or anyone with interest and a technical curiosity — to learn more about PCC and perform their own independent verification of our claims.”
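Why does researcher access matter architecturally, and not just as public relations? Because in a transparency scheme, devices can refuse any node whose software build is absent from the same public log researchers audit. The sketch below, again in Swift, shows that membership check in its simplest hypothetical form; the TransparencyLog type and measurement strings are illustrative, not Apple’s actual formats.

```swift
// Hypothetical public log of measurements (hashes) of released server builds.
struct TransparencyLog {
    let publishedMeasurements: Set<String>
}

// A node running unlogged software is rejected, so the operator cannot
// quietly deploy a build that outside researchers have never seen.
func shouldTrust(nodeMeasurement: String, log: TransparencyLog) -> Bool {
    log.publishedMeasurements.contains(nodeMeasurement)
}

let log = TransparencyLog(publishedMeasurements: ["a1b2c3"])
print(shouldTrust(nodeMeasurement: "a1b2c3", log: log))   // true: logged build
print(shouldTrust(nodeMeasurement: "deadbeef", log: log)) // false: unknown build
```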
And Apple is backing this up financially. “We’re excited to announce that we’re expanding Apple Security Bounty to include PCC, with significant rewards for reports of issues with our security or privacy claims.” Those significant rewards equate to $1 million for “arbitrary code execution with arbitrary entitlements,” with lower-level bounties for compromises of user data or requests.
As I said when PCC was first announced, “if this works as billed, it could redefine smartphone AI and erect hurdles for [Apple’s] rivals that could be almost impossible to leap. A closed ecosystem of device and cloud silicon, with an almost end-to-end encrypted philosophy applied to any AI queries or data leaving a user’s device, such that it is quasi-anonymized and enclaved and assured to such an extent that an external researcher could provide third-party accreditation.”
What happens next will be fascinating and will define this new space for years to come. Apple argues that “verifiable transparency [is] a unique property that sets it apart from other server-based AI approaches.” As I said when PCC was announced, “Samsung has no answer to this—its hybrid AI approach seems crude and underwhelming… PCC, in theory at least, redefines the space.”
Now Samsung needs an answer to PCC. As with its clampdown on sideloading and its deployment of the Knox ecosystem to compete with Apple’s equivalent, this demands the same recognition: the table stakes for security and privacy have expanded hugely in recent years. Just look to Android 15, which is primarily a security and privacy update and which, ironically, Samsung has delayed for Galaxy devices.
Samsung is by far the dominant Android OEM, and it now has a chance to respond to PCC. But to do so it needs to determine how much of its device AI will be its own and how much will be Google’s. I fear that Google’s cloud-centric AI philosophy, notwithstanding Gemini Nano, will make this tricky to navigate. Apple, meanwhile, may buy itself the time it needs to catch up on the AI features themselves.