Published on February 2, 2026


How ChatGPT is Heating Up the Planet: An AI Conscience Crash Test

Every query to a neural network adds to our carbon footprint. We dive into how much energy AI models truly consume and whether it's possible to save the planet without abandoning our favorite bots.

Artificial Intelligence · Ecology
Author: Nick Code · Reading time: 9–13 minutes
«When I started digging into this topic, I thought I'd find some abstract problem along the lines of “yeah, it's bad, but not critical”. But it turned out that every dialogue I have with GPT is like leaving the light on in a room for a couple of minutes. It seems like a small thing, but when there are billions of these “small things”... Now, every time I generate another piece of code, I feel like an ecological villain. The irony is that I didn't write this article without the help of AI either.» – Nick Code

When I once again asked ChatGPT to write me a function for sorting an array (yes, I am lazy), it hit me: how much electricity did this innocent request guzzle? It turns out – quite a bit. And if you multiply this by billions of daily requests, it turns out that we are collectively turning the planet into a sauna. Just without the pleasant sensations.

Welcome to the era where every chat with AI is a microscopic, but very real contribution to global warming. Sounds dramatic? Well, that's because it is. 🌍


The Energy Appetite of Neural Networks: The Raw Numbers

Let's start with the main point: modern language models aren't some lightweight scripts running on your old laptop. These are giant computational monsters requiring thousands of servers running around the clock.

Training GPT-3, for instance, according to some estimates, required about 1,287 megawatt-hours of electricity. That's about as much as the average American home consumes in 120 years. Once. For one model. And then there's GPT-4, Claude, Gemini, and a bunch of other electricity-hungry comrades.
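The "120 years" comparison is easy to sanity-check with back-of-envelope arithmetic. The household figure below is an assumption, not from the article: a commonly cited EIA estimate puts average US household electricity use at roughly 10,700 kWh per year.

```python
# Back-of-envelope check of the "120 household-years" comparison.
TRAINING_ENERGY_MWH = 1287        # estimated energy to train GPT-3
HOUSEHOLD_KWH_PER_YEAR = 10_700   # assumed average US household usage (EIA-style figure)

training_kwh = TRAINING_ENERGY_MWH * 1000       # convert MWh to kWh
years = training_kwh / HOUSEHOLD_KWH_PER_YEAR   # how long one home would take
print(f"Equivalent household-years: {years:.0f}")  # ≈ 120
```

With these inputs the numbers line up: about 120 household-years per training run.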

But that's just training. Inference – the actual process of answering your requests – isn't free in terms of energy either. Every time you ask a neural network for a pasta recipe or ask it to write code, somewhere in a data center, cooling fans start spinning furiously.

Comparison with Everyday Actions

To understand the scale, let's compare. One request to ChatGPT consumes roughly 5-10 times more energy than a standard search query on Google. If a Google query is like turning on a light bulb for a few seconds, then ChatGPT is turning on a microwave.

Of course, exact figures vary depending on the answer length, request complexity, and the model. But the general trend is clear: AI is gluttonous. And the smarter the model, the more it eats.


Data Centers: The Invisible Power Plants of Our Time

Do you know what Google, Microsoft, Amazon, and all major AI companies have in common? They own data centers the size of a small city. And these centers consume enough electricity to make some countries jealous.

According to the International Energy Agency, data centers worldwide consume about 1-2% of all produced electricity. And that was before the generative AI boom. Given the explosive growth in popularity of ChatGPT and company, this figure could double by 2026.

Even worse – cooling. Servers heat up like frying pans on maximum heat, and they need to be constantly cooled down. Powerful air conditioning systems are used for this, which themselves guzzle energy like there's no tomorrow. It becomes a vicious cycle: to cool the machines, we spend even more energy, which eventually turns into heat too.

The Geography of the Problem

An interesting point: not all data centers are equally harmful. If a center is located in Norway or Iceland, where electricity comes mainly from renewable sources (hydroelectric and geothermal plants), the carbon footprint is noticeably smaller. But if the center sits in a region where energy is produced from coal – hello, extra tons of CO2 in the atmosphere.

Companies understand this and try to build data centers in “green” locations. Microsoft even experimented with underwater data centers to use the natural cooling of ocean water. Sounds like a sci-fi plot, but it's the reality of 2024.


Carbon Footprint: From Training to Operation

The carbon footprint of an AI model can be broken down into several stages. The first is the model training itself. As I already mentioned, GPT-3 emitted roughly 552 tons of CO2 into the atmosphere during training. That's equivalent to one person flying 550 round trips between Barcelona and New York. Impressive, right?

The second stage is operation. When the model is already trained and starts answering millions of user requests daily, the carbon footprint continues to grow. True, here it's spread out over all users, so your personal contribution is fractions of a gram of CO2 per request. But when there are billions of requests, these fractions turn into tons.
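How do "fractions of a gram" become tons? Both numbers in this sketch are illustrative assumptions (neither the per-request emission nor the request volume is a measured value from the article), but the scaling logic is what matters.

```python
# Illustration of scale: tiny per-request emissions times billions of requests.
GRAMS_CO2_PER_REQUEST = 0.5       # assumed: half a gram of CO2 per request
REQUESTS_PER_DAY = 1_000_000_000  # assumed: one billion requests daily

grams_per_day = GRAMS_CO2_PER_REQUEST * REQUESTS_PER_DAY
tons_per_day = grams_per_day / 1_000_000  # 1 metric ton = 1,000,000 grams
print(f"{tons_per_day:.0f} tons of CO2 per day")  # 500
```

Even at half a gram per request, a billion daily requests add up to hundreds of tons of CO2 every day.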

Hidden Components

But that's not all. There is a third, often ignored component – hardware production. Servers, graphics processing units (GPUs), data storage systems – all this needs to be produced. And the production of high-tech electronics is energy-intensive in itself and involves mining rare earth metals, which doesn't add to eco-friendliness either.

Plus disposal. Servers become obsolete, and they need to be disposed of or recycled. Electronic waste is a separate headache for environmentalists, and the AI industry makes its own weighty contribution to this problem.

Attempts at Green AI: What Companies Are Doing

Fortunately, not everything is so hopeless. Major companies realize that if they continue in the same spirit, sooner or later they'll have to answer uncomfortable questions. So they've started taking measures.

Google, for example, claims that their data centers run on 100% renewable energy. However, there is a nuance here: this doesn't mean that every joule of energy consumed by the center comes directly from solar panels. Rather, Google buys as much green energy as it consumes in total, thus compensating for its carbon footprint.

Microsoft went even further and promised to become carbon-neutral by 2030. Moreover, they plan to remove from the atmosphere all the CO2 that the company has emitted since its founding by 2050. Ambitious? Undoubtedly. Realistic? We'll see.

Algorithm Optimization

Besides switching to green energy, there is another way – to make the models themselves more efficient. And this is where the real programming magic begins.

Researchers are working on methods that allow models to be trained with lower energy costs. This includes distillation techniques (when a large model “teaches” a small, more efficient one), pruning (cutting out unnecessary connections in the neural network), and quantization (reducing calculation precision where acceptable).
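To make the quantization idea concrete, here is a toy sketch of the core operation: mapping floating-point weights to 8-bit integers with a shared scale factor. Real frameworks (PyTorch, TensorFlow Lite) do this per-tensor or per-channel with calibration data; this minimal version just shows why the trick saves memory at a small precision cost.

```python
# Toy post-training quantization: float weights -> signed 8-bit integers.
def quantize(weights, bits=8):
    """Map floats to signed integers using one shared scale factor."""
    qmax = 2 ** (bits - 1) - 1                 # 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) for w in weights], scale

def dequantize(q_weights, scale):
    """Approximately reconstruct the original floats."""
    return [q * scale for q in q_weights]

weights = [0.12, -0.53, 0.91, -0.07]
q, scale = quantize(weights)
restored = dequantize(q, scale)
# int8 storage is 4x smaller than float32; the reconstruction error
# is bounded by half the scale step per weight.
```

The model gets roughly four times smaller (and integer arithmetic is cheaper than floating-point), which is exactly the energy-per-inference saving the researchers are after.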

Also, so-called edge models are actively developing – compact versions of AI that can work right on your device without requiring constant contact with servers. For example, voice assistants on smartphones increasingly process requests locally, saving both energy and your data traffic.


Water: The Forgotten Cooling Resource

While everyone talks about electricity and CO2, there is another resource that is quietly disappearing – water. Many data centers use water to cool servers. And we're talking about millions of liters.

For example, a large data center can consume as much water in a day as a small city. In regions where water is already tight (hello, California), this creates additional pressure on resources.

Microsoft admitted that training GPT-3 required about 700,000 liters of water. That's about as much as is needed to produce several hundred cars. And that's just one model, one training session.

Alternative Cooling Methods

The industry is looking for alternatives. Some companies are experimenting with immersion cooling – servers are literally submerged in a special liquid that dissipates heat more effectively. Others place data centers in cold climate zones to use natural air cooling.

There are even ideas to use waste heat from data centers to heat residential buildings. In Finland, for example, such projects are already implemented. It turns out to be a win-win situation: servers are cooled, and people get cheap heat.


The Ethical Paradox: Use AI or Save the Planet?

This is where the most interesting part begins. On the one hand, AI helps solve environmental problems. Neural networks optimize delivery routes, reducing fuel consumption. They forecast weather and climate changes. They help develop new materials and energy sources.

On the other hand, AI itself is part of the problem. We get a classic dilemma: to fight global warming with the help of AI, we exacerbate global warming by using AI.

As a programmer, I see a typical optimization task with conflicting criteria here. We can't simply give up AI – too many processes are already tied to it. But continuing to ramp up capacity without thinking about the consequences isn't an option either.

Personal Responsibility of the User

What can we, ordinary users, do? Honestly, not that much. We can approach AI usage more consciously – don't spam requests for fun, don't generate kilometers of text where a paragraph would suffice.

One can choose services from companies that take ecology seriously. True, such information is often hard to come by, and it isn't easy to tell who is genuinely “green” and who is merely greenwashing.

But the main thing is not to close our eyes to the problem. The more people talk about the AI carbon footprint, the more pressure there will be on companies to actually do something, rather than just reporting on “striving for carbon neutrality by 2050”.


The Future: Where Are We Heading?

The AI industry is growing exponentially. Every year models become bigger, smarter, and hungrier. GPT-4 requires even more resources than GPT-3. Future generations will require even more.

If nothing changes, by 2030 the AI industry could become one of the largest energy consumers on the planet. Some forecasts speak of 3-4% of global electricity consumption. That's the level of a whole country the size of Germany.

But there are optimistic scenarios too. If the rates of green energy development continue to grow, if companies really keep their promises, if researchers find ways to make AI more efficient – perhaps we'll manage to turn the ship around before we crash into the iceberg.

Regulation and Standards

Already now, the first attempts at regulation are appearing. The European Union is considering requirements to disclose information about the energy consumption of AI models. Some countries are discussing introducing a carbon tax for data centers.

This is a reasonable approach. The market won't solve the problem by itself – companies will chase profit until regulators force them to account for environmental costs. And the sooner such standards appear, the better.


Conclusion: The Distorted Mirror of Our Era

AI is indeed a mirror. It reflects our priorities, our values, our readiness to sacrifice the future for the sake of convenience in the present. And yes, this mirror is distorted – because we still don't really understand what all this will cost us.

I'm not calling for giving up neural networks. I use them every day myself, and I will continue to. But I want us to do it consciously. To understand: every request has a price. Not in euros or dollars, but in kilowatt-hours, liters of water, and grams of CO2.

Technology is cool. AI is cool. But we only have one planet, and if we overheat it to the state of a sauna, no ChatGPT will help us find a new one. Although, perhaps it's worth asking it about that – maybe it'll output something useful. 🤖

For now, I'll continue writing code, trolling neural networks, and asking them uncomfortable questions. And you think twice next time before asking AI to write you another essay on the topic “How I spent my summer”. Maybe it's better to do it yourself? The planet will say thank you.

#ethics and philosophy #systemic analysis #ai development #ai ethics #engineering #infrastructure #society #data center infrastructure #energy efficiency

From Concept to Form

How This Text Was Created

This material was not generated with a “single prompt.” Before starting, we set parameters for the author: mood, perspective, thinking style, and distance from the topic. These parameters determined not only the form of the text but also how the author approaches the subject — what is considered important, which points are emphasized, and the style of reasoning.

Practicality: 85% · Friendly trolling: 89% · Sarcasm in the code: 87%

Neural Networks Involved

We openly show which models were used at different stages. This is not just “text generation,” but a sequence of roles — from author to editor to visual interpreter. This approach helps maintain transparency and demonstrates how technology contributed to the creation of the material.

1. Generating Text on a Given Topic – creating an authorial text from the initial idea (Claude Sonnet 4.5, Anthropic)
2. Translation into English (Gemini 3 Pro Preview, Google DeepMind)
3. Editing and Refinement – checking facts, logic, and phrasing (Gemini 2.5 Flash, Google DeepMind)
4. Preparing the Illustration Prompt – generating a text prompt for the visual model (DeepSeek-V3.2, DeepSeek)
5. Creating the Illustration – generating an image from the prepared prompt (FLUX.2 Pro, Black Forest Labs)
