Qualcomm and Meta will bring on-device AI to flagship phones in 2024

This opens the door for advanced virtual assistants, content generation, and more without requiring an internet connection.
Published on July 19, 2023

Image: Smartphone for Snapdragon Insiders logo (Robert Triggs / Android Authority)
TL;DR
  • Qualcomm has announced that Meta’s Llama 2 AI model is coming to flagship Snapdragon phones in 2024.
  • The chipmaker says you won’t need an internet connection to run Llama 2 on these phones.
  • Meta also confirmed that it’s open-sourcing the AI model.

Qualcomm has already demonstrated AI tech like Stable Diffusion running locally on a Snapdragon 8 Gen 2 smartphone, without the need for an internet connection. Now, the company has announced that next year’s high-end phones will indeed gain on-device AI support.

The chipmaker announced that it will bring on-device AI capabilities to 2024’s flagship phones and PCs, courtesy of Meta’s Llama 2 large language model (LLM). Qualcomm notes that this support will enable a variety of use cases without the need for an internet connection. These mooted use cases include smart virtual assistants, productivity applications, content creation tools, entertainment, and more.
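To make the idea of local, offline LLM inference concrete, here is a minimal sketch using the open-source llama-cpp-python bindings to run a quantized Llama 2 model entirely from files on the device. This is an illustration of the general technique only, not Qualcomm's Snapdragon implementation, and the model file path is a placeholder.

```python
# Illustration only: generic local inference with a quantized Llama 2 model.
# This is NOT Qualcomm's on-device stack; it uses the open-source
# llama-cpp-python bindings, and the model file path is a placeholder.
from llama_cpp import Llama

# Load a locally stored, quantized Llama 2 model (no network access needed).
llm = Llama(model_path="./llama-2-7b-chat.q4.bin", n_ctx=2048)

# Run a prompt fully on-device.
output = llm(
    "Summarize today's calendar events in one sentence.",
    max_tokens=128,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```

The point of the sketch is simply that once the model weights are stored locally, prompting and generation never leave the device, which is what enables assistant and content-creation features to work without a connection.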

“Qualcomm Technologies is scheduled to make available Llama 2-based AI implementation on devices powered by Snapdragon starting from 2024 onwards,” the company added in its press release.

There’s no word on whether Meta itself will launch any Llama 2-based apps with local inference on Snapdragon phones next year, but third-party Android app developers will certainly have the tools to release their own efforts.

More Llama 2 news to share

This wasn’t the only Llama 2-related announcement, as Meta revealed that it has open-sourced the LLM too. Meta claims it decided to go open source with Llama 2 to give businesses, startups, entrepreneurs, and researchers access to more tools. These tools would open up “opportunities for them to experiment, innovate in exciting ways, and ultimately benefit from economically and socially.”

According to its press release, Meta believes that opening up access to its AI makes it safer. It notes that developers and researchers will be able to stress test the LLM, which will help in identifying and solving problems faster.

The company further explains that Llama 2 has been “red-teamed” — tested and fine-tuned for safety by having internal and external teams “generate adversarial prompts.” Meta adds that it will “continue to invest in safety through fine-tuning and benchmarking” the model.

Finally, Microsoft and Meta also announced an expanded partnership that will see Microsoft becoming a preferred partner for Llama 2. The Redmond company added that the new LLM will be supported on Azure and Windows. Llama 2 is available starting today in the Azure AI model catalog and is optimized to work on Windows locally. However, it will also be available through Amazon Web Services (AWS), Hugging Face, and other providers.
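For developers who want to try the model through one of those hosted channels, a brief sketch of loading it from Hugging Face with the transformers library is shown below. It assumes access has been granted to Meta's gated meta-llama/Llama-2-7b-chat-hf checkpoint and that the machine has enough memory for the 7B model.

```python
# Sketch: loading Llama 2 from the Hugging Face Hub with transformers.
# Assumes access to the gated meta-llama repository has been approved and
# that sufficient memory is available for the 7B chat checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Tokenize a prompt and generate a short completion.
inputs = tokenizer(
    "Explain what a large language model is in one sentence.",
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```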
