Researchers built AI models that use less power than a light bulb

  • 📰 DigitalTrends


Recent research out of UC Santa Cruz shows that modern LLMs running billions of parameters can operate on just 13 watts of power without a loss in performance.

The large language models that power today’s chatbots like ChatGPT, Gemini, and Claude are immensely powerful generative AI systems, and immensely power-hungry ones to boot.

“We got the same performance at way less cost — all we had to do was fundamentally change how neural networks work,” said Jason Eshraghian, lead author of the paper. “Then we took it a step further and built custom hardware.” The team achieved this by doing away with the neural network’s multiplication matrices.

These matrices are stored across hundreds of physically separate GPUs and fetched with each new query or operation. Shuttling the data to be multiplied among this multitude of matrices costs a significant amount of electrical power, and therefore money. “From a circuit designer standpoint, you don’t need the overhead of multiplication, which carries a whole heap of cost,” Eshraghian said. While the team implemented its new network on custom FPGA hardware, the researchers remain confident that many of the efficiency improvements can be retrofitted to existing models using open-source software and minor hardware tweaks. Even on standard GPUs, the team saw a tenfold reduction in memory consumption while improving operational speed by 25%.
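The article doesn't detail how the multiplication was eliminated, but matrix-multiplication-free networks of this kind typically constrain weights to the values {-1, 0, +1}, so every dot product reduces to pure addition and subtraction. A minimal NumPy sketch of that idea (the function name and tiny shapes are illustrative, not taken from the paper):

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product where W contains only -1, 0, or +1.

    Because every weight is -1, 0, or +1, each output element is just
    a sum of selected inputs minus a sum of others: no multiplications.
    """
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        out[i] = x[W[i] == 1].sum() - x[W[i] == -1].sum()
    return out

# Usage: the addition-only result matches an ordinary matmul.
W = np.array([[1, 0, -1],
              [0, 1, 1]])
x = np.array([2.0, 3.0, 5.0])
print(ternary_matvec(W, x))  # [-3.  8.]
print(W @ x)                 # [-3.  8.]
```

On dedicated hardware, dropping the multiplier circuits in favor of adders is where the power savings come from, which is consistent with Eshraghian's remark about the "overhead of multiplication."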

 
