There’s more to this year’s smartphones than AI

Apple revealed its iPhone 16 lineup on Monday, and the big selling point was Apple Intelligence. Apple’s on-device AI system offers splashy features like email rewriting, custom emoji generation, and a significantly upgraded Siri. But underneath it all, AI is delivering one other big change to the iPhone: more RAM.

Although Apple never talks about RAM in its smartphones, MacRumors discovered that every iPhone 16 model now has 8GB of RAM, up from 6GB in the base models from last year. And it’s not just Apple making changes like that. Last month, Google made a similar change to its AI-heavy Pixel 9 lineup; both the standard and Pro models saw an increase in RAM, making 12GB the least you can get this year.

The impetus behind these RAM bumps appears to be artificial intelligence. AI is the year’s new must-have feature, and it’s also incredibly RAM-hungry. Smartphone makers are now bumping memory because they need to — whether they’re saying that out loud or not.

AI models need to be quick to respond when users call on them, and the best way to make that happen is to keep them perpetually loaded in memory. RAM responds far more quickly than a device’s long-term storage; it would be annoying if you had to wait for an AI model to load before you could grab a quick email summary. But AI models are also fairly large. Even a “small” one, like Microsoft’s Phi-3 Mini, takes up 1.8GB of space, and that means taking memory away from other smartphone functions that were previously making use of it.
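To get a rough sense of why even a “small” model crowds a phone’s memory budget, the footprint of the weights kept resident scales with parameter count and numeric precision. The Python sketch below is a back-of-the-envelope illustration, not any vendor’s actual accounting: the helper name and the precision figures are assumptions, and it ignores runtime overhead like the KV cache.

```python
# Back-of-the-envelope estimate of how much RAM a model's weights occupy when
# kept resident. Parameter counts and precisions here are illustrative.
def model_footprint_gb(num_params: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB (ignores KV cache and runtime overhead)."""
    return num_params * bits_per_weight / 8 / 1e9

examples = {
    "3.8B params @ 4-bit (Phi-3 Mini-class)": model_footprint_gb(3.8e9, 4),
    "3.8B params @ 16-bit": model_footprint_gb(3.8e9, 16),
    "8B params @ 4-bit": model_footprint_gb(8e9, 4),
}

for label, gb in examples.items():
    print(f"{label}: ~{gb:.1f} GB")
```

Under those assumptions, a 3.8-billion-parameter model quantized to 4 bits per weight works out to roughly 1.9GB, which lines up with the Phi-3 Mini figure above; at full 16-bit precision the same model would need several times that.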

The Google Pixel 9 lineup gets 12GB or 16GB of RAM, depending on the model, to handle its AI features.
Photo: Allison Johnson / The Verge

You can see how this played out very directly on Pixel phones. Last year, Google didn’t enable local AI features on the standard model Pixel 8 due to “hardware limitations.” Spoiler: it was the RAM. Android VP and general manager Seang Chau said in March that the Pixel 8 Pro could better handle Gemini Nano, the company’s small AI model, because it has 12GB of RAM, 4GB more than the Pixel 8. The model needed to stay loaded in memory at all times, and the implication was that the Pixel 8 would have lost too much memory in supporting the feature by default.

“It wasn’t as easy a call to just say, alright, we’re just gonna enable it on the Pixel 8 as well,” Chau said. Google eventually allowed Gemini Nano onto the Pixel 8, but only for people willing to run their phones in Developer Mode — people who Chau said “understand the potential impact to the user experience.” 

Those tradeoffs are why Google decided to boost RAM across the board with the Pixel 9. “We don’t want the rest of the phone experiences to slow to accommodate the large model, hence growing the total RAM instead of squeezing into the existing budget,” Google group product manager Stephanie Scott said in an email exchange with The Verge.

Microsoft bumped the Surface Pro 11’s base RAM to 16GB, the minimum for a Copilot Plus PC.
Photo by Chris Welch / The Verge

So, is all of that extra RAM going just to AI, or will users see improved performance across the board? It’s going to depend a lot on the implementation and just how large those models are. Google, which added 4GB to support local AI features, says you’ll see improvements to both. “Speaking only to our latest Pixel phones,” Scott wrote, “you can expect both better performance and improved AI experiences from their additional RAM.” She added that Pixel 9 phones “will be able to keep up with future AI advances.” But if those advances mean larger models, that could easily mean they’ll be eating up more RAM.

The same RAM-boosting trend is playing out in the laptop world, too. Microsoft dictated earlier this year that only machines with at least 16GB of memory can be considered a Copilot Plus PC — that is, a laptop capable of running local Windows AI features. It’s rumored that Apple is planning to add more RAM to its next generation of laptops, too, after years of offering 8GB of RAM by default. 

The iPhone 16, now with 8GB of RAM, is Apple Intelligence-ready.
Photo: Allison Johnson / The Verge

That extra memory will be needed, especially if laptop makers want to keep even larger models loaded locally. “I think most OSes will keep a LLM always-loaded,” Hugging Face CTO Julien Chaumond told me in an email, “so 6-8GB RAM is the sweet spot that will unlock that in parallel to the other things the OS is already doing.” Chaumond added that a system can then load or unload “a small model on top of it to change some properties,” such as a style for image generation or domain-specific knowledge for an LLM. (Apple describes its approach similarly.)
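As a minimal conceptual sketch of that “always-resident base model plus small add-ons” pattern: the class, method names, and byte counts below are hypothetical stand-ins, not Apple’s or Hugging Face’s actual implementation, and real weights are tensors rather than byte tallies.

```python
# Conceptual sketch of keeping one base model resident while swapping small
# specializations in and out. All names and numbers are hypothetical.

class OnDeviceAssistant:
    BASE_MODEL_BYTES = int(1.8e9)  # a "small" base model, echoing the Phi-3 Mini figure

    def __init__(self):
        # The base model is loaded once and kept resident so responses stay fast.
        self.resident_bytes = self.BASE_MODEL_BYTES
        self.active_adapter = None

    def swap_adapter(self, name, adapter_bytes):
        # Small add-ons (a writing tone, an image style, domain knowledge) can be
        # loaded and unloaded on demand without reloading the base model.
        self.active_adapter = name
        self.resident_bytes = self.BASE_MODEL_BYTES + adapter_bytes

    def footprint_gb(self):
        return self.resident_bytes / 1e9


assistant = OnDeviceAssistant()
assistant.swap_adapter("summarization", adapter_bytes=int(50e6))
print(f"Resident model memory: ~{assistant.footprint_gb():.2f} GB")
```

The point of the pattern is that only the small add-on changes per task; the multi-gigabyte base model never leaves RAM, which is exactly why the baseline memory floor keeps creeping up.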

Apple hasn’t explicitly said how much RAM is necessary to run Apple Intelligence. But every Apple device that runs it, going back to the 2020 M1 MacBook Air, has at least 8GB of RAM. Notably, last year’s iPhone 15 Pro, with 8GB of memory, can run Apple Intelligence, while the standard iPhone 15 with 6GB of RAM cannot.

Apple AI boss John Giannandrea said in a June interview with Daring Fireball’s John Gruber that limitations like “bandwidth in the device” and the neural engine’s size would make AI features too slow to be useful on the iPhone 15. Apple VP of software engineering Craig Federighi said during the same appearance that “RAM is one of the pieces of the total.”

The iPhone 16’s 2GB RAM bump ultimately isn’t a lot, but Apple has long been slow to expand baseline RAM across its devices. Any increase here feels like a win for usability, even if the company is starting small.

We still don’t know how useful Apple Intelligence will be or whether a slight jump in memory will be enough for today’s iPhones to run tomorrow’s AI features. One thing seems certain, though: we’ll be seeing more of these sorts of hardware bumps as AI proliferates across the industry.