After nearly two years of waiting, Apple has finally jumped on the generative artificial intelligence (AI) bandwagon, with the big news announced by the company’s CEO, Tim Cook, at the Worldwide Developers Conference (WWDC). Kicking off the keynote, Cook announced: “I’m excited that we’ll introduce profound new intelligence capabilities that we hope will inspire developers, delight users and make our platforms even smarter and more useful than ever.”
However, the details that Apple later provided at the keynote and on its website qualified the scope of this technological leap: it will only be available to current iPhone users who own one of the two most powerful models, the 15 Pro and 15 Pro Max, and in 2024 it will be limited to users who speak and write in U.S. English.
The new Apple Intelligence system and the expected deep revamp of Siri, arriving this fall in beta with the new iOS 18 operating system, will sideline well over 90% of current iPhone users unless they buy a new smartphone. In the absence of official data from Apple, estimates indicate that there are about 1.5 billion active iPhone users worldwide, while various analysts estimate that the iPhone 15 Pro and Pro Max, the world’s best-selling phones, could account for close to 100 million units. That means Apple Intelligence would not even reach 7% of Apple’s total cell phone users. What’s more, users outside the U.S. will, for now, also be excluded from that percentage.
Of the 24 models compatible with the new iOS 18, only the iPhone 15 Pro and Pro Max will be able to run Apple Intelligence. The system includes features that other tech giants already offer, such as summarizing documents, notes, audio recordings and phone calls, suggesting replies to messages, and proofreading text to correct style and tone, as well as new ones, such as creating emojis and images from scratch from a few simple prompts. Nor will the other iPhones get the revamped Siri digital assistant, which will understand requests much better, keep the thread of a conversation with the user and turn to ChatGPT for questions it cannot answer.
Beyond phones, Apple Intelligence and the new Siri will also be available on tablets and computers with Apple processors (M1 or later). In the case of iPads, that covers five of the 15 models compatible with the iPadOS 18 operating system, along with 13 of the 18 computer models compatible with macOS 15 Sequoia. Apple began selling Macs with its own processors in 2020, iPad Pro models in 2021 and iPad Air models in 2022. No iPad mini, and no other iPad without one of these processors, will get the new smart features.
Nor will Apple’s leap into generative AI reach the Vision Pro mixed reality headset, even though it is one of the company’s most modern and sophisticated devices and is equipped with an M2 processor, which offers more than enough capacity.
Promising functions, but not in the short term
The Siri revamp was also long awaited on the Apple Watch, where the digital assistant works worse than on iPhones, and on the HomePod, which can only be controlled by voice with “Hey, Siri.” Even so, the new Siri will not be coming to smartwatches, smart speakers or the Apple TV streaming device for the time being.
Apple has not said whether the current models on the various platforms left outside the new AI system will be able to access some of its functions in the future, either through their processors or by tapping into the same private cloud computing network, unveiled on June 10, that will handle the most sophisticated generative AI requests. There are also no details on when these features will be available in other languages, such as Spanish. The tech giant merely states that Apple Intelligence “will be available in beta as part of iOS 18, iPadOS 18, and macOS Sequoia this fall in U.S. English. Some features, software platforms, and additional languages will come over the course of the next year.”
The fact that the revamped Siri will not be available on smartwatches or smart speakers has disappointed tech analysts. Before the keynote, former Macworld editor Jason Snell posted on Six Colors: “When I’m running or walking my dog, I generally use just AirPods and my Apple Watch. Theoretically, I’m covered by Siri, but I generally avoid talking to it because it’s unreliable.”
Bloomberg’s Mark Gurman, who received a detailed leak of all the keynote announcements days in advance, predicted that Apple’s AI would have the advantage of being available on all Apple devices. Now he warns that the company’s push into AI will take years to pay off and complete, arguing that the new features “probably won’t do much to supercharge iPhone sales this year.”