Apple has delayed the launch of its Apple Intelligence AI features in the EU, citing security and privacy concerns about complying with the Digital Markets Act (DMA).
But Apple’s latest AI move has raised multiple questions. Why has the iPhone maker delayed the launch of Apple Intelligence in such a large market? Are its privacy and security concerns valid, and what else may have prompted its decision?
What Happened?
On June 21, Bloomberg reported that the iPhone maker would delay its iPhone Mirroring, SharePlay Screen Sharing enhancements and Apple Intelligence AI features in the EU until 2025. The reason, according to Apple’s statement, is that the DMA’s interoperability requirements would compromise iPhone security and privacy.
Available on the iPhone 15 Pro and new devices set to launch in a few months, the AI features will still launch in the US this fall. “We are concerned that the interoperability requirements of the DMA could force us to compromise the integrity of our products in ways that risk user privacy and data security,” Apple said.
Wait, This Sounds Familiar…
Sound familiar? It is. Earlier this year, Apple complained that the EU’s DMA rules, which stipulated it had to open up its App Store to sideloading for the first time, were a risk to privacy and security. The iPhone maker also disabled progressive web apps (PWAs) in the EU, a move developers complained about. In the end, Apple reversed the decision, and the EU said the way it handled PWAs did not break DMA rules.
Why Did Apple Pull Its AI Features in the EU?
Apple says interoperability with other products would put iPhone users at risk. Apple’s concerns about interoperability rules imposed by the DMA “seem to be legitimate,” security researcher Tommy Mysk says. “It is clear that Apple Intelligence is going to favor Apple products. It’s expected to be more integrated into Apple Music, for example.”
There is a risk that the DMA might force Apple to allow similar integrations with other music services, such as Spotify, Mysk says. This could introduce risk, because Apple loses control over security when it hands responsibility to another company.
And the same applies to iPhone Mirroring. “The DMA might force Apple to support Windows—it could say iPhone users who use Windows should also be able to use iPhone Mirroring on their PCs,” says Mysk.
But What About Data Protection Regulation?
Other tech giants, including Meta and Google, have delayed the launch of their AI offerings in the EU, saying the bloc’s data protection regulation, the GDPR, does not allow the level of functionality they need. AI models need huge amounts of data to operate, which could put them in breach of the EU’s stringent data protection rules. Would Apple’s AI product come under similar scrutiny?
The iPhone maker has emphasized the privacy and security credentials of its AI offering, saying its Private Cloud Compute (PCC) data processing model is one of a kind.
But even so, AI tools are “tricky” when it comes to data privacy, Mysk says. “Google and X delayed the launch of their AI tools in the EU, so it was expected that Apple would also be cautious,” he says. “Generative AI technologies require data for training as well as acquiring more data for training from users. The GDPR will have many questions about this, specifically, the right to be forgotten.”
And of course, Apple has partnered with OpenAI to include ChatGPT features on iPhones, which has raised more concerns about data privacy.
I contacted Apple for comment and will update this article if the iPhone maker responds.
Many people will be disappointed by the delay, especially EU users who want to upgrade their iPhones to take advantage of the new features. Apple will no doubt make sure they can access these features eventually, but the iPhone maker needs to ensure its AI offering is a cut above its rivals on security and privacy. After all, that is what Apple says is its promise and unique selling point in AI.