How bad is AI for the environment?
I get asked this every time I speak. I finally did the research.
Welcome to the 19 new FullStack HR readers who joined last week!
The ambition for this newsletter is high: to be the guide for organisations through the AI transformation that is happening right now!
If you aren't yet subscribed, join the other like-minded people in this free newsletter by subscribing below:
This week’s FullStack HR is brought to you by….HR Tech Europe!
More than ever, we need to talk to each other.
Not “align on strategy” talk. Talk for real.
The topic right now in HR? It’s one we all need to learn. Fast. And the need has never been this big, the pace has never been faster.
I’m 100% certain that we can’t figure this out alone. We need rooms, conversations, and smart people who challenge how we think.
That’s happening in Amsterdam, April 22-23 at HR Tech Europe.
People from Google, LEGO, Coca-Cola, Microsoft, ING, Maersk, Oatly, and a hundred other companies will be there.
Josh Bersin is doing three sessions this year, including his first ever "Ask the Experts" and a unique session with Gianpaolo from Cisco. And a brand new format called HR Tech Intensives, where you actually work through problems together instead of just listening.
And of course, I’ll be there, and I’ll be hosting a practical session, so bring your laptops, and we’ll build stuff together.
If you’re in HR, it’s free.
Come talk.
Register here
Happy Thursday,
I messed up on Monday and put the wrong link to the Claude Cowork Bootcamp…yay me. This is the right link….sorry!
But enough of all of this, let’s get to it!
Every time I give a talk about AI, someone asks it.
Sometimes early, sometimes at the end, sometimes with a hint of accusation, and sometimes out of genuine curiosity. But sooner or later, the question pops up: “What about the environmental impact?”
I’ll be honest, sometimes I brushed it off with a half-answer.
Not because the question is bad, but because I never quite understood why AI specifically became the big symbolic issue for digital environmental impact. We’ve been streaming Netflix for twenty years, Zoomed our way through a pandemic, and googled everything from symptoms to holiday destinations without anyone raising their hand to ask what it’s doing to the planet.
But the moment someone opens ChatGPT, the entire climate discussion seems to land in their lap.
This question has been nagging at me for a while, though, and I’ve spent quite a few hours with ElevenReader in my ears going through the IEA’s energy reports, Stanford HAI’s AI Index, OECD publications, and peer-reviewed papers trying to form a proper opinion. I also used Claude to help me synthesise and structure all of it into this article, because the irony of using AI to write about AI’s environmental footprint felt too appropriate to pass up. The sources and data are real, the position is mine, and the help was computational.
What I found was considerably more nuanced than the public debate usually suggests.
The individual footprint is vanishingly small
A typical text query to ChatGPT uses roughly 0.3 watt-hours according to Epoch AI’s analysis from February 2025. OpenAI’s Sam Altman has confirmed a similar figure of 0.34 Wh. Google published in its own methodology report in 2025 that a median text prompt in Gemini uses roughly 0.24 Wh, produces 0.03 grams of CO₂, and consumes 0.26 milliliters of water.
That puts it in the same ballpark as Google’s own estimate for a standard search, roughly 0.3 Wh, though that figure comes from a 2009 blog post and is likely lower today given sixteen years of hardware improvements.
The comparison isn’t perfect, but the order of magnitude is clear. A typical AI text query is not the energy monster that early estimates suggested.
The figure that often gets thrown around, that an AI query uses “ten times more electricity than a search,” comes from Alex de Vries’ article in Joule in October 2023 and was referenced by the IMF, among others. It was likely accurate at the time, but is now outdated for typical text queries, since both hardware and model architectures have become dramatically more efficient.
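If you want to sanity-check that order of magnitude yourself, here’s a back-of-envelope sketch in Python. The per-query figures are the ones cited above; the 20 prompts per day is my own assumption for a fairly heavy personal user, so treat the result as illustrative rather than a measurement.

```python
# Back-of-envelope: what does a year of heavy chatbot use add up to?
# Per-query figures are from the sources cited above (0.24-0.34 Wh);
# the usage level (20 prompts/day) is an assumption, not a measurement.

WH_PER_QUERY_LOW = 0.24   # Google's median Gemini text prompt
WH_PER_QUERY_HIGH = 0.34  # OpenAI's figure for ChatGPT
PROMPTS_PER_DAY = 20      # assumed heavy personal use

for wh in (WH_PER_QUERY_LOW, WH_PER_QUERY_HIGH):
    kwh_per_year = wh * PROMPTS_PER_DAY * 365 / 1000
    print(f"{wh} Wh/query -> {kwh_per_year:.2f} kWh per year")

# Output: roughly 1.8-2.5 kWh per year, a tiny slice of what a typical
# household uses for electricity in the same period.
```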
Digital energy comparisons are harder than they look
It’s tempting to line up AI alongside streaming, video calls, and email to show that it’s a relatively small activity. I’ve done this myself several times, and my early drafts of this article had a neat little table with watt-hours per activity. But the more I dug into the sources, the more I realized that these comparisons are difficult to make.
The IEA’s own analysis from 2020 found that one hour of Netflix streaming consumed roughly 77 Wh, which was already about 90 times lower than the widely cited Shift Project estimate that had made headlines.
The result is highly sensitive to what you count: device type matters enormously (a TV uses roughly 100 times more electricity than a phone), and whether you include only the data centre or the full chain from server to screen changes the answer dramatically. Carbon Brief’s deep dive confirmed the IEA’s lower range and showed just how wildly the numbers vary depending on assumptions.
For video calls and email, the source picture is even weaker. A Purdue/MIT/Yale study from 2021 gave a range of 150 to 1,000 grams of CO₂ per hour of video conferencing, which is so broad it essentially tells you that nobody knows. For email, I found estimates ranging from 0.01 Wh to 13 Wh per message, a spread of more than a thousand times.
Most digital energy comparisons you see in the press blend methodologies, mix old and new estimates, and conflate different system boundaries, making them look far more precise than they are. What we can say is that a typical AI text query, at roughly 0.24 to 0.34 Wh, is a very small unit of energy by any reasonable standard, and it’s comparable to a lot of other digital activities that we do.
But the scale is a different story
What makes the question impossible to dismiss is not the individual prompt but the total infrastructure. According to the IEA’s “Energy and AI” report from April 2025, the world’s data centres consumed roughly 415 terawatt-hours of electricity in 2024, about 1.5 percent of global electricity use. In the IEA’s base scenario, that more than doubles to around 945 TWh by 2030, equivalent to Japan’s entire electricity consumption today.
Five companies plan to spend close to 700 billion dollars on AI infrastructure in 2026 alone. ChatGPT reached 900 million weekly active users by February 2026. AI is being built into search engines, email clients, office software, and autonomous agents in a way that makes it an entirely new computational layer underneath the digital economy, and that layer requires physical infrastructure on a massive scale.
The efficiency gains are real and impressive. Stanford HAI documents a 280-fold cost reduction for GPT-3.5-level inference in just two years. But it is precisely this efficiency that drives even faster growth in usage, exactly as Jevons paradox predicts.
When GPT-4o’s API pricing dropped from 30 dollars to 2.50 dollars per million input tokens, usage exploded. Google’s Gemini token usage grew 130 times in 18 months. The IEA’s scenarios show that even with significant efficiency gains, absolute electricity consumption continues to rise in nearly all plausible futures.
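To see what that scenario implies as a growth rate, here’s another quick sketch using only the IEA figures already cited (roughly 415 TWh in 2024 and 945 TWh in 2030); the calculation itself is mine.

```python
# Implied annual growth rate in the IEA's base scenario:
# data centre electricity rising from ~415 TWh (2024) to ~945 TWh (2030).

TWH_2024 = 415
TWH_2030 = 945
YEARS = 2030 - 2024

cagr = (TWH_2030 / TWH_2024) ** (1 / YEARS) - 1
print(f"Implied growth: {cagr:.1%} per year")  # ~14.7% per year

# Compounding at roughly 15% a year means the per-query efficiency gains
# are being outrun by growth in total usage - Jevons paradox in numbers.
```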
Water?
Energy gets almost all the attention, but the water question is in many ways more acute.
Before getting into the alarming cases, it’s worth putting AI’s water use in proportion. While I was writing this article, my friend Mathias Sundin published an excellent piece on Warp News covering much of the same ground, building on work by Andy Masley who has done some of the most detailed analysis on this topic.
According to Masley’s calculations, all of America’s golf courses use roughly 5,474 times more water than ChatGPT. (Yes, you read that right: 5,474 times more than the ENTIRE ChatGPT.)
In Maricopa County, Arizona, golf courses account for 3.8 percent of the county’s water while data centres account for 0.12 percent. There are individual golf courses that consume more water than all of ChatGPT globally. And the widely shared claim that a single AI prompt uses “a bottle of water” is a distortion of a study that found 500 millilitres per 20 to 50 prompts, which works out to roughly 10 to 25 millilitres per query, while Google’s own measurement for a median Gemini prompt is closer to 0.26 millilitres.
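To make that arithmetic explicit, here’s the same kind of back-of-envelope sketch; the inputs are the figures already cited (500 ml per 20 to 50 prompts from the study, 0.26 ml per median Gemini prompt from Google) and a 500 ml bottle as the reference point.

```python
# How the "a bottle of water per prompt" claim compares with the sources.

BOTTLE_ML = 500                              # the "bottle" in the claim
STUDY_ML_PER_PROMPT = (500 / 50, 500 / 20)   # study: 500 ml per 20-50 prompts
GOOGLE_ML_PER_PROMPT = 0.26                  # Google's median Gemini figure

low, high = STUDY_ML_PER_PROMPT
print(f"Study-derived range: {low:.0f}-{high:.0f} ml per prompt")
print(f"Google's measurement: {GOOGLE_ML_PER_PROMPT} ml per prompt")
print(f"'Bottle per prompt' vs the study: {BOTTLE_ML / high:.0f}-{BOTTLE_ML / low:.0f}x too high")
print(f"'Bottle per prompt' vs Google: {BOTTLE_ML / GOOGLE_ML_PER_PROMPT:.0f}x too high")

# Result: 10-25 ml per prompt from the study and ~0.26 ml from Google,
# so the per-prompt "bottle" framing overstates by 20x to roughly 2,000x.
```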
That said, proportion isn’t the whole story. As Undark reported, the issue is not that AI currently uses more water than golf or agriculture. The issue is the pace and concentration of growth, and who bears the local cost. The World Resources Institute reports that two-thirds of data centres built since 2022 are in water-stressed areas.
Both things are true simultaneously. AI’s total water consumption is small compared to many other industries. And in specific communities where data centers are being built at speed, the local impact is already causing real conflict.
So where does this leave us? Or me?
I still don’t really understand why AI specifically became the big symbolic question for digital environmental impact. We’ve accepted that digitalization costs energy for decades, and the individual AI query is energetically comparable to things we do hundreds of times a day without a second thought.
I’m also not exactly a model citizen when it comes to the environment. I fly too much, and I know it. So I’m not going to stand here moralizing about other people’s Claude habits.
But I think the question deserves a serious answer rather than a shrug.
Data centres currently account for roughly 0.5 percent of global CO₂ emissions, compared to aviation at roughly 2.5 percent and the textile industry at 2 to 8 percent. AI is not the world’s biggest environmental problem. But data centre electricity is growing at roughly 15 percent per year, and it is one of very few sectors where emissions are projected to keep rising.
The serious answer, the one that best reflects what the IEA, the OECD, Stanford HAI, and UNEP say, is that AI’s environmental footprint is real and growing fast, but it is not a climate catastrophe in itself.
It is a serious governance problem that requires us to demand better infrastructure, not to stop using the technology.
The IEA also estimates that AI applications could enable emission reductions equivalent to 5 percent of global energy-related emissions by 2035. But that doesn’t happen by itself, and it doesn’t happen if we build AI infrastructure on dirty power in water-stressed regions without asking questions.
I think that the question shouldn’t be “should I use AI if I care about the environment?” The question should be why we don’t ask the same thing about everything else we do on a computer, and why we’re not demanding more from the companies building the infrastructure.
I’m going to keep using AI. I’m going to keep talking about it as the most transformative technology most organizations have to deal with right now. But I’m going to stop brushing off the environment question with a shrug. It deserves better than that.



