New Algorithm Slashes AI Energy Use By 95%

While artificial intelligence (AI) is expanding at an unprecedented rate, its energy requirements are becoming a sustainability risk. To address this, BitEnergy AI has created a new algorithm that can cut AI’s energy usage by up to 95%. By making energy-hungry AI systems far more efficient, the development could reduce the industry’s growing reliance on dedicated power sources such as nuclear plants.

AI systems typically consume large amounts of energy, and companies are exploring advanced energy sources such as fusion and nuclear power to meet these needs. Microsoft, for instance, recently signed a deal to restart a reactor at Three Mile Island to help power its AI infrastructure. Studies estimate that ChatGPT alone consumed an average of 564 MWh of electricity per day in early 2023, roughly the daily energy use of 18,000 U.S. households.

BitEnergy AI’s new algorithm, Linear-Complexity Multiplication (L-Mul), streamlines AI’s computation by reducing the complexity of its core mathematical operations. Most AI models depend heavily on floating-point multiplications, which are precise but energy-intensive. L-Mul tackles this by approximating those multiplications with much simpler integer additions, significantly lowering the energy cost of computation.
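
To make the idea concrete, here is a minimal Python sketch of the general technique: a floating-point product is approximated by adding the two numbers’ exponents and mantissa fractions, with a small constant standing in for the dropped mantissa-by-mantissa product. This is an illustration of the approach described above, not BitEnergy AI’s implementation; the `correction` offset in particular is an assumed stand-in for the paper’s per-format constant.

```python
import math

def lmul_approx(x: float, y: float, correction: float = 2.0 ** -4) -> float:
    """Approximate x * y by adding exponents and mantissa fractions
    instead of multiplying mantissas (illustrative sketch only)."""
    if x == 0.0 or y == 0.0:
        return 0.0
    sign = -1.0 if (x < 0.0) != (y < 0.0) else 1.0
    fx, ex = math.frexp(abs(x))          # |x| = fx * 2**ex, with fx in [0.5, 1)
    fy, ey = math.frexp(abs(y))
    mx = 2.0 * fx - 1.0                  # mantissa fraction in [0, 1), as in IEEE-754
    my = 2.0 * fy - 1.0
    # Exact significand product would be (1+mx)*(1+my) = 1 + mx + my + mx*my.
    # The approximation drops the mx*my multiplication and replaces it with a
    # constant offset, so only additions remain.
    significand = 1.0 + mx + my + correction
    return sign * math.ldexp(significand, (ex - 1) + (ey - 1))

if __name__ == "__main__":
    a, b = 3.7, -2.2
    print(a * b)              # exact product: -8.14
    print(lmul_approx(a, b))  # addition-only approximation, close to the exact value
```

In hardware, the same trick operates directly on the integer bit fields of the floating-point format, which is why the energy savings are so large: integer adders are far cheaper than floating-point multipliers.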

The impact of L-Mul on tensor processing hardware is profound, slashing energy use for element-wise floating-point tensor multiplications by 95% and dot products by 80%. Since integer additions require much less energy than floating-point multiplications, AI computations become faster and more efficient.
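
A rough sketch of how this carries over to a dot product is shown below: each elementwise multiply is replaced by an addition-only approximation, while the accumulation, which was already plain addition, is unchanged. The `lmul_approx` helper is a hypothetical simplification for illustration, not the actual kernel, and the 2**-4 correction term is an assumption.

```python
import math

def lmul_approx(x: float, y: float) -> float:
    """Addition-only approximate multiply (hypothetical helper, as sketched above)."""
    if x == 0.0 or y == 0.0:
        return 0.0
    sign = -1.0 if (x < 0.0) != (y < 0.0) else 1.0
    fx, ex = math.frexp(abs(x))   # |x| = fx * 2**ex, with fx in [0.5, 1)
    fy, ey = math.frexp(abs(y))
    # Significands add instead of multiply; 2**-4 is an assumed correction offset.
    return sign * math.ldexp((2.0 * fx - 1.0) + (2.0 * fy - 1.0) + 1.0 + 2.0 ** -4,
                             (ex - 1) + (ey - 1))

def approx_dot(u: list[float], v: list[float]) -> float:
    """Dot product where only the multiplies are approximated;
    the accumulation is ordinary addition and stays as-is."""
    return sum(lmul_approx(a, b) for a, b in zip(u, v))

u = [0.5, -1.25, 3.0]
v = [2.0, 0.75, -0.5]
print(sum(a * b for a, b in zip(u, v)))  # exact dot product: -1.4375
print(approx_dot(u, v))                  # addition-only approximation, close to exact
```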

Remarkably, L-Mul maintains model accuracy and is compatible with transformer-based models such as those behind ChatGPT. It has demonstrated strong results on language, vision, and mathematics benchmarks, matching or outperforming 8-bit floating-point baselines at lower computational cost and with no loss of accuracy.

Despite its promise, L-Mul faces challenges in adoption due to its need for specialized hardware, which is not widely available yet. BitEnergy AI is actively working to develop this hardware and create APIs to integrate L-Mul into existing AI systems. This innovation marks a significant step toward sustainable AI, reducing energy use without compromising performance.
