Multiverse Unveils Record-Breaking Tiny AI Models for Edge Devices

European AI startup Multiverse Computing has unveiled two ultra-compact AI models that it describes as among the smallest high-performing models ever produced, with the aim of changing how artificial intelligence is deployed on everyday devices. The models, SuperFly and ChickBrain, are named for their minuscule size and are tailored for edge computing, allowing smart features to run locally on devices such as smartphones, wearables, and Internet of Things (IoT) appliances.

Why Tiny AI Models Matter

Traditional AI models are powerful but often require significant computing resources and cloud connectivity. Multiverse Computing’s breakthrough enables advanced AI capabilities—such as chat, speech recognition, and even reasoning—on devices with limited memory and processing power. This could lead to more secure, responsive, and private AI experiences, as data never needs to leave the device.

The Technology Behind the Models

The startup’s core innovation is a quantum-inspired compression algorithm called CompactifAI. According to Multiverse co-founder Román Orús, the technology can shrink AI models dramatically without sacrificing performance. The approach differs from standard compression methods, such as quantization and pruning, because it draws on insights from quantum physics, enabling more precise and efficient reductions in model size.
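Multiverse has not published the details of CompactifAI, but the general family of ideas can be illustrated with a much simpler technique. The Python sketch below uses a truncated low-rank factorization as a stand-in for tensor-network compression; the matrix sizes, the synthetic spectrum, and the chosen rank are all invented for illustration and are not Multiverse's method.

```python
# Illustrative sketch only: CompactifAI itself is proprietary and, per Multiverse,
# draws on tensor-network techniques from quantum many-body physics. This toy
# example shows the broader idea behind such methods: replace a dense weight
# matrix with a truncated factorization, trading a little accuracy for a large
# cut in parameter count. All sizes and values here are made up.
import numpy as np

def compress_weight(W: np.ndarray, rank: int):
    """Factor W (m x n) into thin matrices A (m x rank) and B (rank x n)."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # absorb the kept singular values into A
    B = Vt[:rank, :]
    return A, B

rng = np.random.default_rng(0)

# Synthetic weight matrix with a decaying spectrum, loosely mimicking the
# structure trained layers tend to have (purely random matrices compress badly).
m = n = 1024
U0, _ = np.linalg.qr(rng.standard_normal((m, m)))
V0, _ = np.linalg.qr(rng.standard_normal((n, n)))
W = (U0 * (1.0 / (1.0 + np.arange(n)))) @ V0.T

A, B = compress_weight(W, rank=64)
kept = (A.size + B.size) / W.size
err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"parameters kept: {kept:.1%}, relative reconstruction error: {err:.3f}")
```

Real tensor-network approaches factor weights into chains of smaller tensors rather than a single pair of matrices, but the trade-off shown here, fewer parameters in exchange for a bounded reconstruction error, is the same.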

Meet the Models: SuperFly and ChickBrain

  • SuperFly: Based on Hugging Face’s open-source SmolLM2-135M model, SuperFly reduces the original 135 million parameters to just 94 million, roughly the size of a fly’s brain. Designed for simple on-device tasks, SuperFly can power features like voice-activated commands on home appliances. Multiverse demonstrated it on an Arduino-powered device, enabling spoken instructions such as “start quick wash” for a washing machine.
  • ChickBrain: A compressed version of Meta’s Llama 3.1 8B model, ChickBrain features 3.2 billion parameters but is compact enough to run efficiently on a MacBook, fully offline. Remarkably, ChickBrain slightly outperforms the original model on several key benchmarks, including language, mathematics, and general-knowledge assessments such as MMLU-Pro, Math 500, GSM8K, and GPQA Diamond. A minimal sketch of this kind of local, offline inference appears after the benchmark figure below.
[Figure: Multiverse ChickBrain benchmark comparison. Image credit: Multiverse Computing]
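To make the fully offline, on-device claim concrete, here is a minimal local-inference sketch using the Hugging Face transformers library. It loads the open SmolLM2-135M base model named above as a stand-in; the compressed SuperFly and ChickBrain checkpoints are distributed by Multiverse, and their repository names and loading details are not assumed here.

```python
# A minimal sketch of fully local inference with a small language model using
# the Hugging Face transformers library. For illustration it loads the open
# SmolLM2-135M base model that SuperFly is derived from; the compressed
# SuperFly/ChickBrain checkpoints themselves are distributed by Multiverse and
# are not assumed to be available under this repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM2-135M"   # base model cited in the article
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Start quick wash:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Once the weights are cached locally, generation runs entirely on the device with no network access, which is the deployment pattern described for appliances and laptops above.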

Industry Impact and Use Cases

These ultra-compact models are ideal for embedding intelligence directly into consumer electronics and industrial devices, enabling offline AI functions and reducing reliance on cloud infrastructure. Multiverse reports active discussions with major manufacturers including Apple, Samsung, Sony, and HP—the latter also participated in their recent funding round.

Multiverse’s technology is already used by clients such as BASF, Ally, Moody’s, and Bosch for a variety of machine learning tasks, with compressed AI models available via API on AWS for broader developer access.
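For developers, consuming a hosted compressed model usually amounts to an ordinary authenticated HTTP request. The sketch below is generic and hedged: the endpoint URL, model identifier, request schema, and environment variable are placeholders for illustration, not Multiverse's or AWS's documented interface.

```python
# A hedged sketch of calling a hosted compressed model over HTTP. The endpoint
# URL, model name, request schema, and auth header below are placeholders;
# consult the provider's documentation for the real API.
import os
import requests

API_URL = "https://example.invalid/v1/chat/completions"   # placeholder endpoint
API_KEY = os.environ.get("COMPACTIFAI_API_KEY", "")        # placeholder env var

payload = {
    "model": "compressed-llama-3.1-8b",   # placeholder model identifier
    "messages": [
        {"role": "user", "content": "Summarize why on-device AI matters."}
    ],
    "max_tokens": 128,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```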

Funding and Growth

Driven by its unique compression technology, Multiverse Computing raised €189 million (about $215 million) in June 2025, bringing total funding to around $250 million since its founding in 2019. The company’s success underscores the growing demand for efficient, deployable AI that can operate securely and independently across diverse environments.
