Meta just beat Google and Apple in the race to put powerful AI on phones

Meta Platforms has created smaller versions of its Llama artificial intelligence models that can run on smartphones and tablets, opening up new possibilities for AI beyond data centers.

The company today announced compressed versions of its Llama 3.2 1B and 3B models that run up to four times faster while using less than half the memory of previous versions. According to Meta’s testing, these smaller models perform almost as well as their larger counterparts.

The advance relies on a compression technique called quantization, which simplifies the mathematical calculations that power AI models. Meta combined two methods: quantization-aware training with LoRA adapters (QLoRA) to maintain accuracy, and SpinQuant to improve portability.
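The core idea of quantization can be illustrated with a minimal sketch: the hypothetical `quantize` helper below maps floating-point weights to small signed integers that share one scale factor. This is a simplification for illustration; production schemes like QLoRA and SpinQuant quantize per-channel and fold the rounding step into training or rotation transforms.

```python
def quantize(values, bits=4):
    """Symmetric per-tensor quantization: map floats to small signed ints."""
    qmax = 2 ** (bits - 1) - 1                 # 7 for signed 4-bit
    scale = max(abs(v) for v in values) / qmax or 1.0
    q = [max(-qmax - 1, min(qmax, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the integers and the scale."""
    return [v * scale for v in q]

weights = [0.12, -0.83, 0.47, 0.05]
q, scale = quantize(weights)
approx = dequantize(q, scale)
# q holds integers that fit in 4 bits each instead of 32-bit floats;
# approx stays within half a quantization step of the original weights.
```

Storing 4-bit integers plus one scale in place of 32-bit floats is where the memory savings in Meta's reported figures come from; the speedup comes from doing arithmetic on those smaller integers.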

This technical achievement solves a key problem: running advanced AI without massive computing power, which until now required data centers and specialized hardware.

Testing on OnePlus 12 Android phones showed that the compressed models were 56% smaller and used 41% less memory, while processing text more than twice as fast. The models can process texts of up to 8,000 characters, enough for most mobile apps.

Meta’s compressed AI models (SpinQuant and QLoRA) show dramatic improvements in speed and efficiency compared to standard versions when tested on Android phones. The smaller models work up to four times faster and use half the memory. (Credit: Meta)

Tech giants are racing to define the mobile future of AI

Meta’s release intensifies a strategic battle among tech giants to control how AI works on mobile devices. While Google and Apple take a careful, controlled approach to mobile AI, keeping it tightly integrated with their operating systems, Meta’s strategy is clearly different.

By open-sourcing these compressed models and collaborating with chip makers Qualcomm and MediaTek, Meta bypasses traditional platform gatekeepers. Developers can build AI applications without waiting for Google’s Android updates or Apple’s iOS features. The move echoes the early days of mobile apps, when open platforms dramatically accelerated innovation.

The partnerships with Qualcomm and MediaTek are particularly important. These companies power most of the world’s Android phones, including devices in emerging markets where Meta sees growth potential. By optimizing its models for these widely used processors, Meta ensures that its AI can work efficiently on phones across price ranges, not just on premium devices.

The decision to distribute through both Meta’s Llama website and Hugging Face, the increasingly influential AI model hub, demonstrates Meta’s commitment to reaching developers where they already work. This dual distribution strategy could see Meta’s compressed models become the de facto standard for mobile AI development, much as TensorFlow and PyTorch became standards for machine learning.

The future of AI in your pocket

Meta’s announcement today points to a larger shift in artificial intelligence: the transition from centralized to personal computing. While cloud-based AI will continue to perform complex tasks, these new models suggest a future where phones can process sensitive information privately and quickly.

The timing is significant. Technology companies are facing increasing pressure around data collection and AI transparency. Meta’s approach, opening up these tools and running them directly on phones, addresses both issues. Your phone, not a remote server, will soon be able to perform tasks such as document summarization, text analysis, and creative writing.

This reflects other crucial shifts in the computing world. Just as processing power shifted from mainframes to personal computers, and computers shifted from desktops to smartphones, AI appears poised for its own transition to personal devices. Meta’s bet is that developers will embrace this change and create applications that combine the convenience of mobile apps with the intelligence of AI.

Success is not guaranteed. These models still need powerful phones to work properly. Developers must weigh the benefits of privacy against the raw power of cloud computing. And Meta’s competitors, especially Apple and Google, have their own vision for the future of AI on phones.

But one thing is clear: AI is breaking free from the data center, one phone at a time.