Researchers upend AI status quo by eliminating matrix multiplication in LLMs

Posted under: AI technologies
Date: 2024-06-27

Researchers claim to have developed a way to run AI language models more efficiently by eliminating matrix multiplication from the process, fundamentally redesigning the neural network operations that GPUs currently accelerate. The findings, detailed in a recent preprint paper from researchers at the University of California Santa Cruz, UC Davis, and LuxiTech, could have significant implications for the environmental impact and operational costs of AI systems.
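As a rough intuition for how matrix multiplication can be avoided: if a layer's weights are restricted to the values {-1, 0, +1} (the ternary-weight approach used in this line of work), every output element becomes a sum and difference of inputs, so no multiplications are required. The sketch below is an illustration of that idea only, not the researchers' implementation; the function name and shapes are hypothetical.

```python
import numpy as np

def ternary_matvec(W, x):
    """Compute W @ x using only additions and subtractions.

    Assumes W is a 2-D array whose entries are restricted to {-1, 0, +1},
    so each output element is (sum of inputs where W == +1)
    minus (sum of inputs where W == -1) -- no multiplies needed.
    """
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        out[i] = x[W[i] == 1].sum() - x[W[i] == -1].sum()
    return out

W = np.array([[1, 0, -1],
              [0, 1, 1]])          # ternary weight matrix (illustrative)
x = np.array([2.0, 3.0, 5.0])     # input activations
print(ternary_matvec(W, x))       # matches W @ x: [-3.  8.]
```

In practice this is the kind of restructuring that lets such models run on simpler, more power-efficient hardware than GPUs, since accumulation is far cheaper in silicon than multiply-accumulate.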

Read more at: arstechnica.com