Nature-inspired algorithms cut AI energy requirements by over 80% as data centre power demands become a global concern
CALIFORNIA, USA, 9 APR 2026 - The world's biggest technology companies are set to spend nearly $700 billion this year building data centres to power artificial intelligence. Global data centre energy consumption is expected to double by 2028, with AI workloads driving the majority of that growth.
Refiant AI - a startup that uses nature-inspired algorithms to radically compress AI models, slashing the hardware and energy required to run them - today announced a $5 million seed round led by VoLo Earth Ventures, a top-decile climate technology fund with $225 million invested across 35 portfolio companies. The company has already demonstrated that it can compress a 120 billion parameter AI model to run on a standard laptop, reducing energy requirements by over 80% while preserving near-identical quality.
"AI's growing energy footprint is one of the most urgent and underappreciated challenges in the climate space," said Sid Gutta, Co-Founder. "The industry's default answer is to build more data centres and consume more power. Ours is to make the AI itself dramatically more efficient."
Radical efficiency
Modern AI models are enormous and growing every quarter, with parameter sizes running into the trillions. Running them requires banks of GPUs, cooling systems, and vast amounts of electricity. For most organisations, using AI means sending data to power-hungry cloud infrastructure operated by a handful of tech giants - with the associated energy cost, carbon footprint and loss of data control.
Refiant's founders believe the more sustainable path is to make current AI radically more efficient. The startup recently compressed a 120 billion parameter AI model - one of the most powerful open-source models available at the time - to run on a MacBook Pro with just 12GB of RAM. The same model would normally require hardware with at least 80GB of memory. The compressed model retained 95-99% of its fidelity, ran alongside a second AI model on the same machine, and the entire process took four hours with no cloud computing required.
Energy consumption was measured in a Faraday cage to eliminate external electromagnetic interference and ensure accurate readings. Under these conditions, the compressed model achieved approximately 3,000 tokens per kilowatt-hour - up to 100 times more energy efficient than running the same model on conventional data centre hardware. The energy required to process a single AI prompt on standard infrastructure could power roughly 100 equivalent prompts using Refiant's approach.
The implications are significant. Replace racks of GPU servers drawing thousands of watts with standard laptops, multiply that across thousands of organisations, and the energy savings become material at grid level. For countries with limited data centre infrastructure, it means the ability to run powerful AI locally rather than exporting compute to Silicon Valley.
Matching Google breakthroughs
When Google unveiled its TurboQuant compression algorithm in late March - reducing AI memory requirements by 6x - it validated what Refiant had already been demonstrating. The seed-stage startup has independently achieved comparable compression ratios with a fraction of the resources, by focusing on model weights and retraining.
"The AI industry is spending hundreds of billions scaling infrastructure when the real breakthrough is the ability to do more with radically less," said Dr Viroshan Naicker, Co-Founder and a mathematician with published research in networks and quantum systems. "Nature doesn't build by brute force. Evolution optimises. We've applied that principle to AI - and the results speak for themselves."
AI and climate: not a trade-off
Businesses today are caught between two competing imperatives: reducing their carbon footprint, and adopting AI to stay competitive. For Refiant, the commercial opportunity and the environmental one are inseparable.
"Those two mandates don't have to be in tension," said Mathew Haswell, Co-Founder. "AI adoption and sustainability commitments can coexist - but only if the technology itself becomes more efficient. Organisations shouldn't have to choose between deploying AI and meeting their energy targets - and they shouldn't have to send their data halfway around the world to do it."
That convergence of efficiency and sustainability played a major role in attracting VoLo Earth Ventures, whose portfolio includes companies building geothermal power plants with Meta and turning fallen trees into carbon-negative construction materials.
"AI's biggest constraint isn't demand - it's energy," said Joseph Goodman, Managing Partner, VoLo Earth. "What's been missing is a fundamentally more efficient way to compute. Refiant's architecture replaces brute-force scaling with a far more efficient, nature-inspired approach that lowers energy use while increasing capability. That's the kind of breakthrough needed to make AI sustainable on a global scale."
Looking ahead
The funding will be used to scale Refiant's team - which already includes a former Google Cloud architect, a Cambridge PhD researcher, and an engineer with NASA experience - to build out a platform and to accelerate enterprise partnerships. The company is in active conversations with several multinational technology firms exploring how Refiant's approach could reduce their AI compute costs while maintaining data and energy sovereignty.
The current breakthrough results were achieved at the end of last year; since then, the team has been preparing to exceed them with further compression, longer context windows and model traceability.
Media Contact: team@refiant.ai