Vietnam.vn - Vietnam Promotion Platform

Are AI chatbots as 'power-hungry' as rumored?

AI chatbots are exploding with hundreds of millions of users every day, but behind that convenience is huge power consumption, raising concerns about sustainability.

VTC News, 19/09/2025

In just the last few years, ChatGPT has exploded in popularity, with nearly 200 million users submitting more than a billion requests per day. Those responses, which seem to appear out of thin air, actually consume a huge amount of energy behind the scenes.

In 2023, data centers, where AI models are trained and run, accounted for 4.4% of electricity consumption in the US. Globally, the figure is about 1.5% of total electricity demand. By 2030, consumption is expected to double as AI demand continues to climb.
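These percentages can be turned into absolute numbers with a quick back-of-the-envelope calculation. The 4.4% share and the doubling by 2030 come from the article; the US annual consumption figure (roughly 4,000 TWh) is an outside assumption used only for illustration.

```python
# Rough projection of US data-center electricity demand from the
# article's figures. The US total is an assumed round number.
us_total_twh_2023 = 4000        # assumed US annual consumption, TWh
dc_share_2023 = 0.044           # data-center share reported for 2023
dc_twh_2023 = us_total_twh_2023 * dc_share_2023
dc_twh_2030 = dc_twh_2023 * 2   # "expected to double" by 2030

print(f"2023 data-center demand: ~{dc_twh_2023:.0f} TWh")
print(f"2030 projected demand:   ~{dc_twh_2030:.0f} TWh")
```

Under that assumption, data centers drew roughly 176 TWh in 2023, heading toward about 350 TWh by 2030.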

“Just three years ago, we didn’t even have ChatGPT,” said Alex de Vries-Gao, a researcher on the sustainability of new technologies at Vrije Universiteit Amsterdam and founder of Digiconomist, a platform that analyzes the unintended consequences of digital trends. “And now we’re talking about a technology that could account for almost half of the electricity consumed by data centers worldwide.”

Asking a question to a large language model (LLM) consumes about 10 times more electricity than a typical Google search. (Photo: Qi Yang/Getty Images)

What makes AI chatbots so power-hungry? The answer lies in their sheer scale. According to University of Michigan computer science professor Mosharraf Chowdhury, there are two particularly “power-hungry” stages: training and inference.

“However, the problem is that today’s models are so large that they cannot run on a single GPU, or even fit on a single server,” Chowdhury explained to Live Science.

To give a sense of scale, de Vries-Gao’s 2023 study found that a single Nvidia DGX A100 server can consume up to 6.5 kilowatts of power. Training an LLM typically requires multiple servers, each with an average of eight GPUs, running continuously for weeks or even months. The total power consumption is enormous: training OpenAI’s GPT-4 alone consumed 50 gigawatt-hours, enough to power all of San Francisco for three days.
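The two figures above can be related directly: dividing the reported 50 GWh training energy by the 6.5 kW peak draw of one DGX A100 server gives the equivalent amount of continuous server time. This is an illustration of the arithmetic, not a claim about OpenAI's actual cluster size or schedule.

```python
# Equivalent server time for GPT-4's reported 50 GWh training energy,
# at one DGX A100 server's peak draw of 6.5 kW.
total_energy_kwh = 50e6          # 50 GWh expressed in kWh
server_power_kw = 6.5            # peak draw of one DGX A100 server

server_hours = total_energy_kwh / server_power_kw
server_weeks = server_hours / (24 * 7)

print(f"~{server_hours:,.0f} server-hours, "
      f"or e.g. 1,000 servers running for ~{server_weeks / 1000:.0f} weeks")
```

The result, roughly 7.7 million server-hours, matches the article's description of many servers running continuously for weeks or months.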

Training OpenAI's GPT-4 consumed enough electricity to power all of San Francisco for three days. (Image: Jaap Arriens/NurPhoto/Rex/Shutterstock)

The inference phase is also energy-intensive. This is when the AI chatbot uses what it has learned to generate a response for the user. Although inference requires less computing than training, it is still extremely power-hungry because of the sheer number of requests sent to the chatbot.

As of July 2025, OpenAI says ChatGPT users send more than 2.5 billion requests per day. To respond instantly, the system must run many servers in parallel. And that’s just ChatGPT, not counting other widely used platforms, such as Google’s Gemini, which is expected to soon become the default experience in Google Search.
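Even a rough per-query figure makes the aggregate inference load concrete. The 2.5 billion requests per day comes from the article; the energy per query is an assumption, derived by combining the article's "about 10 times a typical Google search" with the commonly cited (but unconfirmed) estimate of roughly 0.3 Wh per Google search.

```python
# Rough aggregate inference load for ChatGPT. The per-query energy
# is an assumption (10x an assumed ~0.3 Wh Google search), not a
# figure reported by OpenAI.
requests_per_day = 2.5e9
wh_per_query = 3.0                          # assumed, see note above

gwh_per_day = requests_per_day * wh_per_query / 1e9
twh_per_year = gwh_per_day * 365 / 1000

print(f"~{gwh_per_day:.1f} GWh/day, ~{twh_per_year:.1f} TWh/year")
```

Under these assumptions, inference alone would draw on the order of a few terawatt-hours per year, which is why the number of users, not just the model size, dominates the total.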

“Even in the inference phase, you can’t really save energy,” Chowdhury said. “It’s not about the data size anymore. The model is huge, but what’s bigger is the number of users.”

Researchers like Chowdhury and de Vries-Gao are now looking for ways to more accurately measure power consumption and find ways to cut it. For example, Chowdhury maintains a ranking called the ML Energy Leaderboard, which tracks the energy consumption of inference from open-source models.

However, most data about commercial generative AI platforms remains secret. Large corporations such as Google, Microsoft, and Meta either withhold the figures or publish statistics so vague that they do not reflect the true environmental impact. This makes it very difficult to determine how much electricity AI actually consumes, what demand will look like in the coming years, and whether the world can meet it.

Still, users can press for transparency, which not only helps individuals make more responsible choices when using AI but also helps promote policies that hold businesses accountable.

“One of the core problems with digital applications is that their environmental impact is often hidden,” said researcher de Vries-Gao. “Now the ball is in policymakers’ court: they have to encourage data disclosure so that users can take action.”

Ngoc Nguyen (Live Science)

Source: https://vtcnews.vn/chatbot-ai-co-ngon-dien-nhu-loi-don-ar965919.html

