Research · IEEE Spectrum AI
Better Hardware Could Turn Zeros into AI Heroes
Sparse computing enables leaner, faster AI.
Key facts
- In the article's example, compressed sparse storage takes 13 memory spaces, as opposed to 16 for the dense, uncompressed version (see the sketch after this list)
- Multiplying a vector by a 4×4 matrix traditionally takes 16 multiplication steps and 16 addition steps (or four accumulations, one per output element)
- Two years ago, a team at Cerebras showed that 70 to 80 percent of the parameters in an LLM can be set to zero without losing any accuracy
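To make the 13-versus-16 arithmetic concrete, here is a minimal Python sketch of sparse matrix-vector multiplication. It assumes the compressed layout is the standard compressed sparse row (CSR) format, and the matrix values are illustrative rather than taken from the article: a 4×4 matrix with 4 nonzeros needs 4 values + 4 column indices + 5 row pointers = 13 memory spaces in CSR, versus 16 for the dense array, and the sparse product performs 4 multiplications instead of 16.

```python
# A sketch of dense vs. sparse matrix-vector multiplication, assuming a
# CSR-style compressed layout; matrix values here are illustrative only.

def dense_matvec(A, x):
    """Dense product: every entry is multiplied, zeros included.
    For a 4x4 matrix that is 16 multiplications and 16 accumulate steps."""
    n = len(A)
    y = [0.0] * n
    for i in range(n):
        for j in range(n):
            y[i] += A[i][j] * x[j]  # multiply-accumulate, even when A[i][j] == 0
    return y

def csr_matvec(values, col_idx, row_ptr, x):
    """Sparse product: only the stored (nonzero) entries are touched."""
    n = len(row_ptr) - 1
    y = [0.0] * n
    for i in range(n):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]  # one multiply per nonzero
    return y

# Illustrative 4x4 matrix with 4 nonzeros (75 percent of entries are zero).
A = [
    [2.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 3.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 5.0],
]

# CSR storage: 4 values + 4 column indices + 5 row pointers = 13 memory
# spaces, versus 16 for the dense array.
values  = [2.0, 3.0, 1.0, 5.0]
col_idx = [0, 2, 1, 3]
row_ptr = [0, 1, 2, 3, 4]

x = [1.0, 2.0, 3.0, 4.0]
assert dense_matvec(A, x) == csr_matvec(values, col_idx, row_ptr, x)
```

The same products fall out either way; the sparse version simply skips every multiplication by zero, which is where the speed and energy savings come from.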
Summary
When it comes to AI models, size matters. Even though some artificial-intelligence experts warn that scaling up large language models (LLMs) is hitting diminishing performance returns, companies are still coming out with ever larger AI tools. As models grow in size, their capabilities increase. But there is another path that may retain a staggeringly large model’s high performance while reducing both the time it takes to run and its energy footprint.