Large language models are powerful, but generally they require vast computing resources, which means they typically have to run on stacks of high-end GPUs in data centers. Now, startup Multiverse Computing has created models it says are comparable in size to the brains of chickens and flies—allowing the company to shrink powerful LLMs so that they can run on home appliances, smartphones, or cars.
Multiverse, based in Donostia, Spain, is working at the intersection of two of technology’s most in-vogue fields—AI and quantum computing. The company’s flagship product is a software platform called Singularity, designed to allow nonexperts to work with quantum algorithms, but it has also developed compression technology called CompactifAI for shrinking neural networks.
The software relies on tensor networks—mathematical tools originally developed to simulate quantum systems on classical hardware. But their ability to distill complex multidimensional systems into something more compact and easier to work with also makes them a promising avenue for compressing large AI models.
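To give a flavor of the underlying math, the sketch below uses a generic tensor-train construction, not CompactifAI's proprietary pipeline: a toy weight matrix is reshaped into a higher-order tensor and factored into a chain of small cores via truncated SVDs, trading a little accuracy for a much smaller parameter count. The shapes, rank cap, and function names are illustrative choices only.

```python
# Illustrative sketch: tensor-train compression of a dense weight matrix
# via truncated SVDs. A generic textbook construction, NOT Multiverse's
# CompactifAI method; all dimensions and ranks here are arbitrary.
import numpy as np

def tensor_train(weights, shape, max_rank):
    """Factor `weights` (reshaped to `shape`) into a list of tensor-train cores."""
    tensor = weights.reshape(shape)
    cores, rank = [], 1
    for dim in shape[:-1]:
        # Unfold with the current bond rank and one physical index on the left.
        mat = tensor.reshape(rank * dim, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = min(max_rank, len(s))          # truncate small singular values
        cores.append(u[:, :new_rank].reshape(rank, dim, new_rank))
        tensor = np.diag(s[:new_rank]) @ vt[:new_rank]
        rank = new_rank
    cores.append(tensor.reshape(rank, shape[-1], 1))
    return cores

def reconstruct(cores):
    """Contract the cores back into a full tensor to measure the error."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))               # stand-in for one layer's weights
cores = tensor_train(W, (16, 16, 16, 16), max_rank=8)
approx = reconstruct(cores).reshape(256, 256)
kept = sum(c.size for c in cores)
print(f"parameters: {W.size} -> {kept}  "
      f"(relative error {np.linalg.norm(W - approx) / np.linalg.norm(W):.2f})")
```

A random matrix like this one compresses poorly; the approach pays off because trained weight matrices carry far more redundant structure, and compression is typically followed by a short retraining pass to recover most of the remaining accuracy.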
Multiverse’s Nano Models Shrink AI
Multiverse has now used CompactifAI to create a new family of “nano models” that it calls Model Zoo, with each one named after the animal whose brain (theoretically) has a comparable amount of processing power. The first two releases are a compressed version of Meta’s Llama 3.1 model dubbed ChickenBrain, which can bring reasoning capabilities to a Raspberry Pi, and a version of the open-source model SmolLM2 135M small enough to run on a smartphone, dubbed SuperFly.
“SuperFly is a 94-million-parameter model, which is tiny. It’s definitely one of the smallest LLMs out there,” says Sam Mugel, Multiverse’s chief technology officer. “Any device that’s expensive enough that you could justify putting a Raspberry Pi in would be able to host an LLM like SuperFly.” That means appliances such as washing machines and refrigerators could gain AI capabilities they would otherwise be unable to incorporate.
The company says this could bring AI capabilities to a wide range of appliances, and in particular the ability to control devices using natural language. Being able to run LLMs locally rather than via the cloud has a host of benefits, says Mugel, including significantly reduced latency and fewer security and privacy risks due to data being processed on-device.
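For a sense of scale, the open SmolLM2-135M base model that SuperFly was compressed from can already run locally with stock tooling on CPU-only hardware. The snippet below is a minimal sketch using the Hugging Face Transformers library; SuperFly itself is not assumed to be distributed this way, and the prompt is just an illustrative appliance-style example.

```python
# Minimal sketch of CPU-only local inference with the open SmolLM2-135M base model.
# SuperFly (the compressed 94M-parameter variant) is assumed NOT to be available here;
# this only illustrates the class of model that fits on Raspberry Pi-scale hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM2-135M"          # open base model named in the article
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # defaults to CPU, fp32

prompt = "The smart washing machine replied:"    # illustrative appliance-style prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```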
They could be particularly useful for applications where Internet connections may be unreliable, Mugel says. SuperFly is small enough to be directly embedded in a car’s dashboard, which could allow uninterrupted natural-language control even while driving through tunnels or in areas with poor network coverage.
Compressing models has become standard practice, driven by growing concerns about the energy and hardware footprints of the largest models. Neural networks are surprisingly inefficient learners and contain a lot of redundant information, says Mugel…

The post “Edge AI Powers Tiny Models for Smart Devices” by Edd Gent was published on 09/03/2025 by spectrum.ieee.org