
Liquid AI is revolutionizing LLMs to work on edge devices like smartphones with new ‘Hyena Edge’ model


Liquid AI, the Boston-based foundation model startup spun out of the Massachusetts Institute of Technology (MIT), is seeking to move the tech industry beyond its reliance on the Transformer architecture underpinning most popular large language models (LLMs) such as OpenAI’s GPT series and Google’s Gemini family.

Yesterday, the company announced “Hyena Edge,” a new convolution-based, multi-hybrid model designed for smartphones and other edge devices, ahead of the International Conference on Learning Representations (ICLR) 2025.

The conference, one of the premier events for machine learning research, is taking place this year in Singapore.

New convolution-based model promises faster, more memory-efficient AI at the edge

Hyena Edge is engineered to outperform strong Transformer baselines on both computational efficiency and language model quality.
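The article does not disclose Hyena Edge’s internals, but Hyena-family architectures are generally built around gated long convolutions that replace most self-attention layers. The sketch below is a minimal, hypothetical illustration in plain PyTorch (all class and parameter names are my own, not Liquid AI’s): an FFT-based convolution mixes the sequence in roughly O(L log L) time and constant state, versus the O(L²) cost of attention, which is the kind of saving that matters for latency and memory on edge hardware.

```python
# Minimal sketch of a gated long-convolution block in the spirit of
# Hyena-family operators. This is NOT Liquid AI's Hyena Edge implementation,
# only an assumed illustration of why convolution-based mixing can be cheaper
# than attention: the FFT convolution below scales O(L log L) in sequence
# length L rather than O(L^2).
import torch
import torch.nn as nn


class GatedLongConv(nn.Module):
    """Toy gated convolution mixer: y = out(gate(x) * conv(value(x)))."""

    def __init__(self, dim: int, seq_len: int):
        super().__init__()
        self.gate = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)
        self.out = nn.Linear(dim, dim)
        # One learnable filter per channel, spanning the whole sequence.
        self.filter = nn.Parameter(torch.randn(dim, seq_len) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        B, L, D = x.shape
        v = self.value(x).transpose(1, 2)             # (B, D, L)
        # Causal linear convolution via FFT: pad to 2L to avoid wrap-around.
        n = 2 * L
        v_f = torch.fft.rfft(v, n=n)                  # (B, D, n//2 + 1)
        k_f = torch.fft.rfft(self.filter, n=n)        # (D, n//2 + 1)
        y = torch.fft.irfft(v_f * k_f, n=n)[..., :L]  # (B, D, L)
        y = y.transpose(1, 2)                         # (B, L, D)
        # Elementwise gating, then output projection.
        return self.out(torch.sigmoid(self.gate(x)) * y)


if __name__ == "__main__":
    block = GatedLongConv(dim=64, seq_len=256)
    tokens = torch.randn(2, 256, 64)
    print(block(tokens).shape)  # torch.Size([2, 256, 64])
```

A “multi-hybrid” model, as the announcement describes it, would interleave operators like this with a smaller number of attention layers; the exact mix in Hyena Edge is not specified in the excerpt above.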

In real-world tests on a Samsung Galaxy S24 Ultra smartphone, the model delivered lower latency, a smaller memory footprint, and better benchmark results compared to a parameter-matched ...


Copyright of this story belongs solely to VentureBeat. The full text is available on VentureBeat's site.