
ChatGPT may not be as power-hungry as once assumed

ChatGPT, OpenAI's chatbot platform, may not be as power-hungry as once assumed. However, its appetite depends heavily on how ChatGPT is used, and on which AI models answer the queries, according to a new study.

A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited stat is that ChatGPT requires around 3 watt-hours of power to answer a single question, or roughly ten times as much as a Google search.

Epoch believes that is an overestimate.

Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours, less than many household appliances.

"The energy use is really not a big deal compared to using normal appliances, heating or cooling your home, or driving a car," Joshua You, the Epoch data analyst who conducted the analysis, told TechCrunch.
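The gap between the two estimates is easy to see with some napkin math. The sketch below uses the article's two per-query figures (0.3 Wh and 3 Wh); the query count and the LED-bulb comparison are illustrative assumptions, not from the study.

```python
# Back-of-envelope comparison of the two per-query energy estimates
# discussed in the article. Only the 0.3 Wh and 3 Wh figures come from
# the article; the daily query count and appliance wattage are
# hypothetical, chosen for illustration.

WH_PER_QUERY_EPOCH = 0.3  # Epoch AI's estimate (watt-hours per query)
WH_PER_QUERY_OLD = 3.0    # widely cited older estimate

queries_per_day = 100     # a hypothetical heavy user

daily_wh_epoch = WH_PER_QUERY_EPOCH * queries_per_day  # 30 Wh
daily_wh_old = WH_PER_QUERY_OLD * queries_per_day      # 300 Wh

# For scale: a 10 W LED bulb left on for 3 hours also uses 30 Wh,
# about the same as 100 queries at Epoch's estimate.
led_bulb_wh = 10 * 3

print(f"Epoch estimate: {daily_wh_epoch} Wh/day")
print(f"Older estimate: {daily_wh_old} Wh/day")
print(f"10 W LED bulb, 3 h: {led_bulb_wh} Wh")
```

Under these assumptions, even heavy daily use at Epoch's figure lands in the range of a single light bulb's evening, which is the comparison You is drawing against appliances, heating, and driving.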

AI's energy consumption, and its environmental impact more broadly, is the subject of a contentious debate as AI companies race to rapidly expand their infrastructure footprints. Just last week, a group of more than 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers do not deplete natural resources or force utilities to rely on non-renewable energy sources.

You told TechCrunch that his analysis was spurred by what he characterized as outdated prior research. For example, he pointed out that the author of the 3-watt-hour estimate had assumed OpenAI used older, less efficient chips to run its models.

[Chart: Epoch AI's estimate of ChatGPT power consumption. Image credits: Epoch AI]

"I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy going to AI today," he said. "Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high."

Granted, Epoch's 0.3 watt-hour figure is also an approximation; OpenAI has not published the details needed for a precise calculation.

The analysis also does not account for the additional energy costs incurred by ChatGPT features such as image generation or input processing. You acknowledged that "long input" ChatGPT queries, such as queries with long files attached, likely consume more electricity than a typical question.

You said he does expect baseline ChatGPT power consumption to rise, however.

"[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely, handling many more tasks, and more complex tasks, than how people use ChatGPT today," he said.

Although there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive an enormous, power-hungry infrastructure expansion. Over the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a Rand report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
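To put those capacity figures next to the per-query estimate, here is a rough sketch. Only the 68 GW, 8 GW, and 0.3 Wh numbers come from the article; the sustained-draw framing is an illustrative assumption.

```python
# Rough scale comparison of the Rand-report figures cited above against
# Epoch's per-query estimate. Figures from the article: 68 GW (California's
# 2022 capacity), 8 GW (frontier-model training by 2030), 0.3 Wh per query.

CALIFORNIA_2022_GW = 68
FRONTIER_TRAINING_GW = 8
WH_PER_QUERY = 0.3

# 1 GW sustained for one hour is 1 GWh = 1e9 Wh. At 0.3 Wh per query,
# that single gigawatt-hour would cover billions of queries.
queries_per_gwh = 1e9 / WH_PER_QUERY

# Fraction of the projected data-center demand that frontier training
# alone would represent, per the report's numbers.
training_share = FRONTIER_TRAINING_GW / CALIFORNIA_2022_GW

print(f"Queries per GWh at 0.3 Wh each: {queries_per_gwh:.2e}")
print(f"Training as share of 68 GW: {training_share:.1%}")
```

The point of the comparison is the one the article makes: individual queries are cheap, but the aggregate infrastructure draw is measured in gigawatts.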

ChatGPT itself reaches an enormous, and expanding, number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.

OpenAI's attention, along with the rest of the AI industry's, has also been shifting to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. Unlike models such as GPT-4o, which respond to queries nearly instantaneously, reasoning models "think" for seconds to minutes before answering, a process that consumes more computing, and thus more power.

"Reasoning models will increasingly take on tasks that older models can't, and generate more data to do so, and both require more data centers," he said.

OpenAI has begun releasing more power-efficient reasoning models, such as o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased energy demands of reasoning models' "thinking" process and growing AI use around the world.

You suggested that people worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing required, to the extent that is realistic.

"You could try using smaller AI models like [OpenAI's] GPT-4o-mini," You said, "and sparingly use them in a way that requires processing or generating a ton of data."
