In an AI sector under financial strain, a Spanish startup is making a quiet but significant push. Multiverse Computing, known for compressing large AI models from labs like OpenAI and Meta, is stepping out of the shadows with a dual launch aimed at developers and businesses.
The company has released CompactifAI, a consumer-facing chat app powered by its quantum-inspired compression. The app’s standout feature is a tiny model called Gilda, which can run locally on a device without an internet connection, keeping data private. However, if a phone lacks sufficient memory, the app automatically switches to a cloud model, forfeiting that privacy benefit. With fewer than 5,000 downloads last month, the app appears more a proof-of-concept than a mass-market product.
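The local-first behavior described above amounts to a simple routing decision. The sketch below illustrates it; the function name, memory threshold, and backend labels are all hypothetical, assumed only for illustration, not taken from Multiverse's actual app.

```python
# Hypothetical sketch of the app's routing logic as described in the article:
# prefer the on-device Gilda model when memory allows, otherwise fall back to
# a cloud model (forfeiting the privacy benefit). The threshold is assumed.

GILDA_MIN_RAM_MB = 2048  # assumed memory footprint of the local model

def choose_backend(available_ram_mb: int, online: bool) -> str:
    """Return which backend would serve a chat request."""
    if available_ram_mb >= GILDA_MIN_RAM_MB:
        return "local"        # private: data never leaves the device
    if online:
        return "cloud"        # fallback: privacy benefit is lost
    return "unavailable"      # too little memory and no connection

print(choose_backend(4096, online=False))  # → local
print(choose_backend(1024, online=True))   # → cloud
```

The design choice worth noting is that the fallback is automatic, so a user on a low-memory phone may be routed to the cloud without taking any action.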
The real news is the simultaneous launch of a self-serve API portal. This gives enterprises direct access to Multiverse’s roster of compressed models, bypassing major cloud marketplaces. For companies, the appeal is clear: lower compute costs and the potential to deploy AI in isolated or remote environments—think drones or satellites—where connectivity is unreliable.
CEO Enrique Lizaso emphasized that the portal offers the transparency and control needed for production use, including real-time monitoring. The business case is strengthening as the performance gap narrows: Multiverse claims its latest compressed model, HyperNova 60B 2602, responds faster and at lower cost than the larger OpenAI model it was derived from, particularly on automated coding tasks.
With over 100 clients like Bosch and the Bank of Canada, Multiverse is building a case for efficient, portable AI. After a $215 million raise last year, rumors suggest it's now seeking €500 million in new funding, betting that in 2026, smaller might just be smarter.
Source: TechCrunch