Multiverse Computing Secures $215M to Slash AI Costs With New Tech

Summary
– Multiverse Computing raised €189 million in Series B funding for its quantum-inspired compression technology, CompactifAI, which reduces LLM sizes by up to 95% without performance loss.
– The company offers compressed versions of open-source LLMs like Llama and Mistral, with plans to release DeepSeek R1, but does not support proprietary models like OpenAI’s.
– Multiverse’s “slim” models are available via AWS or on-premise licensing, offering 4x-12x speed improvements and 50%-80% cost reductions compared to uncompressed versions.
– Some of Multiverse’s models are small and efficient enough to run on devices like PCs, phones, drones, and Raspberry Pi, enabling broader accessibility.
– The startup, co-founded by quantum computing expert Román Orús and former banker Enrique Lizaso Olmos, has 160 patents, 100 customers, and has raised $250M to date.
Spanish quantum computing startup Multiverse Computing has secured €189 million ($215 million) in Series B funding to advance its groundbreaking AI compression technology. The company’s proprietary CompactifAI system dramatically shrinks large language models while maintaining performance, potentially revolutionizing how businesses deploy AI solutions.
At the core of Multiverse’s innovation is a quantum-inspired compression technique that reduces LLM sizes by up to 95% without sacrificing accuracy. The company currently offers optimized versions of popular open-source models including various Llama iterations and Mistral Small. While proprietary models like those from OpenAI aren’t supported, Multiverse plans to expand its offerings with additional open-source and reasoning models in coming months.
These compressed “slim” models deliver 4x to 12x faster performance compared to standard versions, translating to 50-80% lower inference costs. For example, its compressed Llama 4 Scout model processes a million tokens for just 10 cents on AWS, versus 14 cents for the uncompressed version. The technology’s efficiency enables deployment on everyday devices from smartphones to Raspberry Pi systems, opening possibilities for edge computing applications.
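As a quick sanity check on the Llama 4 Scout figures above, the per-token savings can be worked out directly. This is a back-of-the-envelope sketch using only the prices reported in the article; the `monthly_tokens` workload is a hypothetical assumption for illustration, and actual AWS pricing varies.

```python
# Per-million-token prices reported for Llama 4 Scout on AWS.
uncompressed_cost = 0.14  # USD per million tokens, standard version
compressed_cost = 0.10    # USD per million tokens, compressed "slim" version

# Fractional savings per million tokens.
savings = 1 - compressed_cost / uncompressed_cost

# Hypothetical monthly workload (in millions of tokens) -- an assumption,
# not a figure from the article.
monthly_tokens = 500

print(f"Per-million-token savings: {savings:.1%}")
print(f"Monthly cost: ${monthly_tokens * compressed_cost:.2f} "
      f"vs ${monthly_tokens * uncompressed_cost:.2f}")
```

Note that this particular example works out to roughly a 29% saving; the broader 50-80% range cited applies across the company's model lineup and deployment scenarios.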
The technical foundation comes from co-founder Román Orús, a physics professor specializing in tensor networks, mathematical structures that simulate quantum systems on classical hardware. His research forms the basis for Multiverse’s model compression approach. CEO Enrique Lizaso Olmos, a mathematician and former banking executive, brings commercial expertise to scale the technology.
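CompactifAI’s exact tensor-network method is proprietary, but the general idea of shrinking a model by factorizing its weight matrices can be illustrated with a related low-rank technique, truncated SVD. The sketch below is an analogy only (the matrix, sizes, and rank `k` are all illustrative assumptions), not Multiverse’s actual algorithm.

```python
import numpy as np

# A hypothetical 512x512 weight matrix standing in for one layer of an LLM.
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512))

# Factorize the matrix and keep only the top-k singular components.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
k = 64  # retained rank -- an illustrative choice
W_approx = U[:, :k] * s[:k] @ Vt[:k, :]

# Storing the three truncated factors replaces storing W itself.
original_params = W.size
compressed_params = U[:, :k].size + s[:k].size + Vt[:k, :].size
print(f"Parameter reduction: {1 - compressed_params / original_params:.1%}")
```

For this toy matrix the factored form holds about 75% fewer parameters; real compression methods choose the factorization so that model accuracy is preserved, which a random matrix like this one does not demonstrate.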
Investor confidence appears strong, with the funding round led by Bullhound Capital, known for backing major tech successes like Spotify and Revolut. Other participants include HP Tech Ventures, Santander Climate VC, and Toshiba. The company reports 160 patents and 100 global customers including energy giant Iberdrola and automotive leader Bosch.
With total funding now reaching $250 million, Multiverse is positioned to expand its compression technology across industries. The ability to run sophisticated AI models on everyday hardware could significantly lower barriers to adoption while reducing the environmental impact of large-scale AI deployments. As businesses increasingly seek cost-effective AI solutions, Multiverse’s technology offers a compelling alternative to traditional resource-intensive approaches.
(Source: TechCrunch)