Climate footprint of AI: Hey ChatGPT, how much energy do you need?

The most exciting thing about Sam Altman's big dreams is two small numbers: 0.00034 kilowatt-hours and 0.00032 liters. On his personal blog, the head of OpenAI once again fantasizes about superintelligence and, in passing, quantifies ChatGPT's resource consumption for the first time. Supposedly, one chatbot response requires on average about as much energy as a modern LED lamp consumes in two minutes of light. And a teaspoon of water is enough for roughly a dozen questions to ChatGPT.
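Altman's two figures can at least be checked for internal plausibility. A minimal back-of-the-envelope sketch, assuming a typical 10-watt LED bulb and a roughly 5-milliliter teaspoon (both assumptions for illustration, not values Altman gives):

```python
# Sanity-check Altman's per-response figures: 0.00034 kWh and 0.00032 l.
KWH_PER_RESPONSE = 0.00034     # Altman's energy figure
LITERS_PER_RESPONSE = 0.00032  # Altman's water figure

LED_WATTS = 10                 # assumption: typical modern LED bulb
TEASPOON_LITERS = 0.005        # assumption: ~5 ml per teaspoon

# How long could the LED run on one response's worth of energy?
seconds = KWH_PER_RESPONSE * 1000 * 3600 / LED_WATTS
minutes = seconds / 60  # ~2 minutes, matching Altman's comparison

# How many responses does one teaspoon of water cover?
responses_per_teaspoon = TEASPOON_LITERS / LITERS_PER_RESPONSE  # ~15

print(f"{minutes:.1f} min of LED light per response")
print(f"{responses_per_teaspoon:.0f} responses per teaspoon of water")
```

Under these assumptions, the arithmetic lands close to his claims: about two minutes of LED light per response, and a teaspoon covering on the order of a dozen questions.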
Altman cites no sources for these comparisons. And why should anyone trust a man who not only glosses over his AI's resource requirements, but was also once fired for allegedly knowingly lying to OpenAI's board of directors?
The problem: there are hardly any reliable figures. All the big tech companies, Google, Meta, Microsoft, and OpenAI, like to tout their sustainability but keep secret how much energy and water their language models consume. Generative artificial intelligence thus becomes a double black box: on the one hand, the companies conceal the data their models were trained on; on the other, no one knows what consequences the technology has for the climate and the environment.
In recent months, scientists and journalists have tried to determine AI's resource consumption. At the end of May, for example, the journal MIT Technology Review published a comprehensive study. Physicist Andy Masley, climate researcher Hannah Ritchie, and mathematician Arthur Clune have also run the numbers and reached similar results. There are dozens of academic studies as well, although these are often based on older models and do not account for the fact that generative AI has become significantly more efficient.
Although Silicon Valley's lack of transparency makes precise figures difficult, the estimates so far allow some conclusions. Altman's numbers are indeed consistent with the findings of independent researchers; however, the OpenAI CEO ignores an important factor that massively influences AI's resource consumption. Nevertheless, the scientific community agrees on six key points.
1. A few questions to ChatGPT are no problem
Anyone happy to accept help from ChatGPT, Claude, or Gemini needn't feel guilty. While a language model requires more energy than a Google search, individual chatbot use still accounts for only a low single-digit percentage of AI's total resource consumption. There are dozens of behavioral changes with a greater impact than writing fewer prompts. For most people, ChatGPT is merely a rounding error in their personal energy budget.
2. Videos consume enormous amounts of energy
The statement just made applies to text and images, but expressly not to videos. New models such as Google's Veo 3 in particular consume enormous amounts of energy: even a five-second clip requires around 700 times more energy than a single image. Anyone who regularly generates high-resolution videos, or even produces short films with AI, significantly increases their ecological footprint.
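The factor of 700 gives a sense of scale once a per-image figure is plugged in. In the sketch below, only that ratio comes from the reporting; the per-image energy is a purely hypothetical placeholder:

```python
# Rough scale of AI video generation, given the article's 700x ratio.
VIDEO_VS_IMAGE = 700      # five-second clip vs. one image (from the article)
KWH_PER_IMAGE = 0.0005    # assumption: hypothetical per-image energy

kwh_per_clip = VIDEO_VS_IMAGE * KWH_PER_IMAGE  # energy per 5-second clip

# A one-minute short film stitched together from 5-second clips:
clips_per_minute = 60 // 5
kwh_per_minute = kwh_per_clip * clips_per_minute

print(f"{kwh_per_clip:.2f} kWh per clip, {kwh_per_minute:.1f} kWh per minute of video")
```

Under this placeholder assumption, a single minute of generated video would already consume several kilowatt-hours, thousands of times the energy of one chatbot reply.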
3. Resource consumption fluctuates greatly
Current language models differ greatly in their efficiency. Depending on which model you ask, resource consumption can vary by a factor of ten. The complexity of the prompt matters, too: uploading entire books or using deep-research functions requires far more computation than a simple question. Furthermore, it is unclear which models access which data centers. That determines water consumption and the electricity mix, which in turn are crucial for CO₂ emissions. Many data centers are located in regions that rely predominantly on fossil fuels such as coal, gas, and oil.
4. Data centers are becoming a problem
When Altman cites the energy consumption of a ChatGPT response, he fails to mention that OpenAI first had to train the model on powerful graphics chips. This happens in massive data centers, which in the US already consume as much electricity as the whole of Thailand. Between a quarter and a third of that goes to developing and operating AI models, and this share will rise sharply: forecasts predict a tripling by 2028. New data centers are being built around the world, sometimes with no regard for the needs of local residents. Cooling the systems requires enormous amounts of water, which is often already scarce at these locations.
5. Nuclear power and fossil fuels are coming back
Renewable energies cannot fully cover data centers' growing demand. The computers need constant power, which wind and solar do not always provide. Amazon, Google, and Microsoft have therefore been betting on nuclear power for some time, either building new nuclear plants or reactivating decommissioned reactors. Altman has personally invested in a fusion-energy start-up. Just last week, Meta announced it would buy electricity from a nuclear power plant operated by the energy company Constellation Energy for 20 years. At the same time, the company is relying on gas-fired power plants, which are meeting local resistance. Elon Musk is being particularly ruthless: for his company xAI, he has, among other things, built a gigantic data center in Memphis powered by 35 gas turbines. The exhaust fumes pollute the surrounding neighborhood, which is predominantly Black and poor. Many residents complain of breathing difficulties and asthma and have to be treated in hospital.
6. There is hope
Technological progress alone will not solve the climate crisis, but there are positive developments. Since the release of ChatGPT at the end of 2022, language models have become faster, more reliable, and more efficient; they now require significantly less energy for the same tasks. The graphics chips in data centers and the systems that cool them are also evolving rapidly. The tech companies' self-interest contributes to this: resource consumption burdens not only their carbon footprint but also their quarterly balance sheet. The less energy AI consumes, the more money Google, Microsoft, and OpenAI earn.
Süddeutsche Zeitung