Tech Breakthrough: Running LLMs on Your Computer Uses 80% Less Energy
April 30, 2026
Interview

Artificial intelligence and LLM usage are driving a massive surge in global energy demand, but what if there were another way? In this interview, Mathew Haswell, Co-Founder of Refiant AI, explains a breakthrough that could change everything: running powerful AI models locally on your laptop or desktop computer with dramatically lower energy use. Instead of relying on massive, energy-hungry data centers, this new algorithmic breakthrough reduces compute and memory requirements by up to 80% while maintaining near-full performance. That means faster, cheaper, and more efficient AI/ML that is accessible to businesses and individuals alike.

In the interview we explore:

  • How AI energy demand became a global bottleneck
  • The shift from cloud to edge (local) computing, and why it could disrupt data centers and GPUs
  • What it means for small and medium-sized businesses
  • The future of efficient, scalable AI
  • Why this could be a turning point for the economics and environmental impact of AI
