Wonderful Engineering

This New AI System Uses 100x Less Power And Still Outperforms Today’s Models

Researchers have developed a new artificial intelligence approach that could significantly reduce energy consumption while improving performance on complex tasks. The system, based on neuro-symbolic AI, combines traditional neural networks with rule-based reasoning, offering a more efficient alternative to current AI models.

The work comes as concerns grow over the energy demands of AI systems, particularly those used in large-scale data centers. Facilities supporting advanced models, including those operated by Microsoft and OpenAI, consume substantial amounts of electricity. In 2024, AI and data centers used approximately 415 terawatt hours of power in the United States, with demand expected to increase further, according to a report by ScienceDaily.

The newly developed system is based on a hybrid framework known as neuro-symbolic AI. Unlike conventional models that rely primarily on statistical learning from large datasets, this approach integrates symbolic reasoning, allowing the system to apply structured rules and logic when solving problems. The research was led by Matthias Scheutz and is scheduled to be presented at the International Conference on Robotics and Automation in Vienna.

The system focuses on visual-language-action models, which are used in robotics to interpret visual input, process language instructions, and execute physical tasks. These models typically require extensive trial-and-error learning, which increases both training time and energy consumption. Errors can also arise from misinterpreting visual data, leading to inefficient or incorrect actions.

By incorporating symbolic reasoning, the new model reduces reliance on trial-and-error processes. Instead of learning solely from patterns, it applies predefined rules related to object properties and task structure. This allows it to arrive at solutions more efficiently and with fewer mistakes.
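The interplay described above can be illustrated with a toy sketch: a (stubbed) neural component scores candidate actions, and a symbolic layer applies hard rules about object properties to discard invalid ones before the robot acts. All names and rules here are hypothetical illustrations of the general neuro-symbolic pattern, not the researchers' actual framework.

```python
# Toy neuro-symbolic pipeline (hypothetical illustration, not the paper's code):
# a stand-in "neural" scorer proposes pick-up actions, then predefined
# symbolic rules about object properties veto proposals that violate them.

def neural_propose(scene):
    """Stand-in for a learned model: assign confidence scores to
    candidate pick-up targets."""
    return {obj: 1.0 / (i + 1) for i, obj in enumerate(scene["objects"])}

def symbolic_filter(scene, proposals):
    """Apply predefined rules: never pick an object that is fixed in
    place or has something stacked on top of it."""
    valid = {}
    for obj, score in proposals.items():
        props = scene["properties"][obj]
        if props.get("fixed") or props.get("covered"):
            continue  # rule violated: discard regardless of neural score
        valid[obj] = score
    return valid

scene = {
    "objects": ["table", "red_block", "blue_block"],
    "properties": {
        "table": {"fixed": True},
        "red_block": {"covered": True},  # blue_block sits on top of it
        "blue_block": {},
    },
}

proposals = neural_propose(scene)
allowed = symbolic_filter(scene, proposals)
best = max(allowed, key=allowed.get)
print(best)  # blue_block is the only rule-consistent choice
```

Note that the neural scorer actually ranks the table highest, but the symbolic rules overrule it; in this pattern the rules prevent the kind of misinterpreted-perception errors the article describes, without any additional trial-and-error learning.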

In testing, the system demonstrated substantial improvements in both accuracy and efficiency. Using the Tower of Hanoi problem as a benchmark, the neuro-symbolic model achieved a 95 percent success rate, compared with 34 percent for conventional systems. When presented with a more complex version of the task, the model maintained a 78 percent success rate, while standard models failed to complete the challenge.
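For context on the benchmark, Tower of Hanoi is exactly the kind of task where explicit rules pay off: a short recursive procedure solves it optimally, with no learning required. The sketch below is the standard textbook solver, shown only to convey why a rule-following system can reach near-perfect success rates where pattern-matching models struggle; it is not the researchers' system.

```python
# Standard recursive Tower of Hanoi solver (textbook algorithm, shown for
# illustration). Moving n disks from src to dst via aux always takes
# exactly 2**n - 1 moves when the rules are followed.

def hanoi(n, src="A", aux="B", dst="C", moves=None):
    """Return the list of (disk, from_peg, to_peg) moves for n disks."""
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, src, dst, aux, moves)  # clear the n-1 smaller disks
    moves.append((n, src, dst))         # move the largest disk directly
    hanoi(n - 1, aux, src, dst, moves)  # restack the n-1 disks on top
    return moves

solution = hanoi(3)
print(len(solution))  # 7, i.e. the optimal 2**3 - 1
```

Because the rule structure guarantees optimality, a symbolic solver's success rate on this puzzle is limited only by perception and execution, not by search, which is consistent with the large accuracy gap the article reports.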

Training time was also significantly reduced. The hybrid system completed training in approximately 34 minutes, whereas traditional models required more than a day. Energy consumption during training dropped to about 1 percent of that used by conventional systems, a roughly 100-fold reduction, while operational energy use was reduced to roughly 5 percent.

The findings highlight broader challenges associated with current AI development. Large language models and similar systems often require extensive computational resources and can still produce inaccurate outputs. This combination of high energy demand and imperfect reliability has raised questions about long-term scalability.

As AI adoption expands across industries, energy infrastructure is under increasing pressure. Some data centers already consume power at levels comparable to small cities, prompting concerns about sustainability and resource allocation.

The researchers suggest that neuro-symbolic AI could offer a more sustainable path forward. By combining data-driven learning with structured reasoning, the approach may enable future AI systems to operate more efficiently while maintaining or improving performance.
