
Here's how your old iPhone might get on-device generative AI

On-device gen AI requires loads of RAM, but many iPhones lag behind in this area. So here's how Apple could do it.
Published on December 22, 2023

(Image: Apple iPhone 14 logo. Credit: Robert Triggs / Android Authority)
TL;DR
  • A recent Apple research paper details the company’s solution for bringing on-device generative AI to iPhones.
  • The approach cuts how much RAM is required, suggesting it could enable on-device AI on older iPhones.

We’re expecting loads of Android flagship phones to offer on-device generative AI capabilities in 2024, but Apple has been pretty quiet in this regard. Now, a new research paper published by the company reveals how it could get AI running locally on its iPhones.

The Financial Times spotted an Apple research paper that details a solution for running large language models (LLMs) on devices with limited RAM. The paper reveals how Apple could keep the “model parameters” in storage and bring chunks of them to the device’s RAM as needed, as opposed to loading the entire model into RAM.

“These methods collectively enable running models up to twice the size of the available DRAM, with a 4-5x and 20-25x increase in inference speed compared to naive loading approaches in CPU and GPU, respectively,” reads an excerpt of the paper.
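The paper’s actual techniques go further than this, but the basic idea of leaving weights in flash storage and paging chunks of them into RAM only when a layer needs them can be sketched in a few lines of Python. Everything below (the file name, layer sizes, and cache budget) is illustrative rather than Apple’s real implementation:

```python
import numpy as np

# Hypothetical flash-resident weight file; names and sizes are made up.
PARAMS_FILE = "model_params.bin"
LAYER_SIZE = 16 * 1024 * 1024   # float32 elements per layer (example)
NUM_LAYERS = 32
CACHE_BUDGET = 4                # how many layers we allow in DRAM at once

# Memory-map the file: the OS pages data in from storage on access,
# so the full model never has to sit in RAM at the same time.
params = np.memmap(PARAMS_FILE, dtype=np.float32, mode="r")

cache: dict[int, np.ndarray] = {}   # layer index -> DRAM-resident copy
lru: list[int] = []                 # least-recently-used eviction order

def load_layer(i: int) -> np.ndarray:
    """Return layer i's weights, pulling them in from flash if needed."""
    if i in cache:
        lru.remove(i)
        lru.append(i)
        return cache[i]
    if len(cache) >= CACHE_BUDGET:   # evict the coldest layer from DRAM
        del cache[lru.pop(0)]
    start = i * LAYER_SIZE
    chunk = np.array(params[start:start + LAYER_SIZE])  # copy into DRAM
    cache[i] = chunk
    lru.append(i)
    return chunk

# Inference walks the layers in order, so only a small window of weights
# is ever resident in RAM rather than the whole model.
for layer in range(NUM_LAYERS):
    weights = load_layer(layer)
    # ... run this layer's computation with `weights` ...
```

The speedups Apple quotes come from doing much better than this kind of naive paging, by optimizing which chunks get fetched and how they’re read from flash.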

A pathway for old iPhones to get on-device AI?

On-device generative AI benefits from plenty of RAM because RAM offers much faster read/write speeds than the storage found in even premium phones. That speed is paramount for on-device AI, as quicker inference means users aren’t left waiting tens of seconds (or more) for a response or finished result. In practical terms, that means an on-device AI assistant that could run at conversational speeds, much faster image/text generation, quicker article summaries, and more. But Apple’s solution means you don’t necessarily need plenty of RAM for responsive on-device AI.

Apple’s approach could enable old and new iPhones alike to offer on-device generative AI features, since its handsets generally ship with less RAM than many premium Android phones. The iPhone 11 series, for example, offers just 4GB of RAM, while even the vanilla iPhone 15 offers only 6GB.

Apple isn’t the only mobile player working to reduce the footprint of LLMs. Both Qualcomm’s and MediaTek’s recent flagship processors support INT4 precision to shrink these models (a simplified sketch of the idea follows below). Either way, we’re sure the industry will continue to find new ways to reduce system requirements for on-device AI, potentially enabling even low-end phones to offer these features.
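For a sense of what INT4 support means in practice, here’s a simplified sketch of 4-bit weight quantization in Python. The symmetric, per-tensor scheme below is an illustration only, not Qualcomm’s or MediaTek’s actual implementation; real INT4 pipelines typically use finer-grained scales and pack two 4-bit values into each byte:

```python
import numpy as np

def quantize_int4(weights: np.ndarray):
    """Symmetric 4-bit quantization: map floats to integers in [-8, 7]."""
    scale = np.abs(weights).max() / 7.0     # one scale for the whole tensor
    q = np.clip(np.round(weights / scale), -8, 7)
    # int8 is used here for clarity; real INT4 packs two values per byte.
    return q.astype(np.int8), scale

def dequantize_int4(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights at inference time."""
    return q.astype(np.float32) * scale

# Toy weight matrix standing in for one layer of an LLM.
w = np.random.randn(4096, 4096).astype(np.float32)
q, scale = quantize_int4(w)
w_hat = dequantize_int4(q, scale)
print("max reconstruction error:", np.abs(w - w_hat).max())
```

Storing each weight in four bits instead of sixteen cuts a model’s memory footprint to roughly a quarter of its FP16 size, at some cost in accuracy, which is why these schemes lean on per-group scales in practice.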
