As we’ve waited for our first official glimpse at iOS 18, which should emerge next week when Apple previews the new iPhone software at WWDC 2024, one rumor in particular has stood out to me. It’s not the laundry list of AI features that appear to be coming to your iPhone with this software upgrade. What piqued my interest was the claim that Apple intended to run most of those AI features on your iPhone itself rather than lean on cloud servers for extra processing power. That struck me as intriguing for both performance and privacy reasons.
And some capabilities will indeed run entirely on your iPhone. But no matter how much Apple wants to keep things on-device, others will have to be offloaded to the cloud. Depending on which iPhone you use, the number of on-device features may be severely limited.
Bloomberg’s Mark Gurman has the most up-to-date information on Apple’s AI plans for iOS 18, particularly the revamped Siri personal assistant that will power many of these new capabilities. But the bit that stuck with me was this comment from Gurman: “As part of the [iOS 18] rollout, more basic AI tasks will be conducted on devices themselves, while more advanced capabilities will be handled via cloud computing.”
That differs from earlier reports about on-device capabilities. And while it doesn’t make iOS 18 any less exciting, it does make me wonder how consistent the experience will be across different iPhone models.
Which iPhones can run on-device AI?
On-device AI appeals for a variety of reasons. For starters, if something stays on your phone, only you can see it; not even Apple would know how you’re using AI. That’s reassuring in an age when it’s impossible not to feel like you’re constantly being watched. On-device AI should also complete tasks faster than if your requests had to be routed to the cloud, processed, and then returned to you. In short, on-device AI is preferable, provided your device has the necessary hardware.
According to Gurman, iOS 18 will be able to distinguish between simple tasks that can run on the device and those that require cloud processing. For older phones, that second category could cover a lot: the report suggests many of iOS 18’s on-device features will require an A17 chip or newer.
To put it another way, you’ll need an iPhone 15 Pro or iPhone 15 Pro Max to take full advantage of on-device AI in iOS 18. Even last year’s iPhone 15 and iPhone 15 Plus, which run on the A16 Bionic chip, apparently can’t meet the processing requirements. It’s also safe to say the iPhone 16 series won’t have any trouble with on-device AI, since all of those phones are expected to feature new A18 chipsets.
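If it helps to picture the split Gurman describes, here’s a minimal Swift sketch of how that kind of routing might work in principle. To be clear, this is purely illustrative: the AITask, DeviceCapabilities, and route names are placeholders of my own, not Apple APIs, and the A17 cutoff simply mirrors the rumor.

```swift
import Foundation

// Illustrative sketch only: none of these types are Apple APIs.
enum AITask {
    case summarizeText(String)   // a "basic" task in Gurman's framing
    case generateImage(String)   // a "more advanced" capability
}

enum ExecutionTarget {
    case onDevice
    case cloud
}

struct DeviceCapabilities {
    /// Stand-in for the rumored hardware cutoff (A17-class silicon or newer).
    let hasA17OrNewerChip: Bool
}

func route(_ task: AITask, on device: DeviceCapabilities) -> ExecutionTarget {
    // Older hardware falls back to the cloud for everything.
    guard device.hasA17OrNewerChip else { return .cloud }

    switch task {
    case .summarizeText:
        return .onDevice   // basic tasks stay local: faster, more private
    case .generateImage:
        return .cloud      // heavier work still gets offloaded
    }
}

// Example: an iPhone 15 Pro-class device keeps the simple task local.
let target = route(.summarizeText("Meeting notes"),
                   on: DeviceCapabilities(hasA17OrNewerChip: true))
print(target)  // onDevice
```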
It’s not uncommon for older phones to miss out on certain iOS features. For example, iOS 17 introduced gesture-based visual effects for FaceTime calls, but that feature requires at least an iPhone 12. With on-device AI in iOS 18, though, it looks like many phones will be on the outside looking in, especially if the iOS 18 compatibility rumors are true and this year’s update runs on the same devices as iOS 17. That list would include phones dating back to the iPhone XR, iPhone XS, and iPhone XS Max, all released in 2018.
What we need to hear from Apple
A cynic might argue that this is Apple using an iOS update to boost iPhone sales at a time when people are increasingly reluctant to upgrade from their existing handsets. Even if that were the main goal here (and I don’t believe it is), I don’t blame Apple for trying to sell more iPhones; it has a multibillion-dollar business to run. Still, some people in the WWDC audience will be disappointed to learn what is and isn’t available to them if they stick with the phone they have.
Disappointment isn’t a deal breaker, though, because it appears that AI features will still reach older iPhones; it’s just that many of them will need the cloud to function. So it will be up to Apple to spell out what that means. In the past, when new iOS features didn’t work on older devices, Apple consigned that information to the fine print of its preview pages. That won’t fly this time: users should understand what runs on-device and what gets offloaded to the cloud, and Apple should explain how that affects performance.
Apple will also want to be clear about the privacy implications of cloud-based AI features and what it plans to do to protect user data. To that end, Gurman’s reporting says Apple intends to rely on the Secure Enclave in the Mac chips powering its cloud servers to safeguard privacy. According to sources, Apple has no plans to create user accounts and will instead produce reports describing how it protects AI-generated data.
It all sounds promising, but I’m the type of person who needs Tim Cook standing in front of a Keynote deck to fully understand how something is supposed to play out. The good news is that we’ll soon know exactly how AI features will work on old and new iPhones alike.