ChatGPT consumes less power than expected, study claims

ChatGPT is, without a doubt, the most popular AI platform out there. Many people have made the chatbot their default for AI-powered tasks, much as they reflexively open Google Search for web searches. Some are concerned about the potentially high energy consumption of such services, though. However, a new report estimates that ChatGPT is not as power-hungry as previously thought.

Study says ChatGPT AI chatbot is less power-hungry than many believed

The NGO Epoch AI shared a study focused on calculating the energy consumption of a typical query on ChatGPT. Joshua You, the data analyst who carried out the tests, used the GPT-4o model as a reference. The results showed that a typical query to the popular AI chatbot consumes around 0.3 watt-hours (Wh). This is below the energy consumption of many appliances you have at home. For reference, a popular belief held that each ChatGPT query consumed around 3 Wh, a figure that is quite far off.

“The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car,” You stated regarding the results. He also said that the old 3 Wh estimate comes from “fairly old research” that was “based on some napkin math [that] seemed to be too high.” Basically, the previous research assumed older, less efficient hardware instead of more modern chips.
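To put the study's figures in perspective, here is a quick back-of-the-envelope calculation. The 0.3 Wh and 3 Wh figures come from the article; the daily query count and LED bulb wattage are illustrative assumptions, not from the study.

```python
# Back-of-the-envelope comparison of ChatGPT query energy vs. a household load.
# 0.3 Wh/query is Epoch AI's estimate; 3 Wh is the older popular figure.
# Queries per day and bulb wattage are assumed for illustration.

WH_PER_QUERY = 0.3       # Epoch AI estimate for a typical GPT-4o query
OLD_ESTIMATE_WH = 3.0    # widely cited older estimate

queries_per_day = 15     # assumed fairly heavy personal use
daily_wh = queries_per_day * WH_PER_QUERY

# A 10 W LED bulb left on for one hour consumes 10 Wh,
# more than twice a full day of queries at this rate.
led_bulb_hour_wh = 10.0

print(f"Daily ChatGPT energy (new estimate): {daily_wh:.1f} Wh")
print(f"Daily ChatGPT energy (old estimate): {queries_per_day * OLD_ESTIMATE_WH:.1f} Wh")
print(f"One hour of a 10 W LED bulb: {led_bulb_hour_wh:.1f} Wh")
```

Even under the older 3 Wh estimate, a day of personal use would stay in the tens of watt-hours, which helps explain You's comparison to everyday appliances.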

AI-focused companies and AI hardware vendors have managed to significantly optimize energy consumption in recent years. Each new generation of AI chips promises not only more performance but also lower energy consumption.

AI will gradually require more energy

That said, the path the AI industry is taking will inevitably lead to greater energy requirements. AI-focused companies are currently working on reasoning-focused models. These are more capable at solving complex problems but require more computing power. Plus, upcoming AI agents are expected to be “smart” enough to perform as well as a human would in multiple jobs.

“[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely — handling much more tasks, and more complex tasks, than how people use ChatGPT today,” You said. “Reasoning models will increasingly take on tasks that older models can’t, and generate more [data] to do so, and both require more data centers,” he added.