AI’s Invisible Cost
Jul 25, 2025
2 min read
By Maxime Pasquier, Investor & Rasmus Holt @ Blackwood

The rapid adoption of generative AI has created a paradox: a technology celebrated for efficiency and intelligence may be operating on an increasingly wasteful physical substrate.
Behind each neatly composed sentence from a chatbot lies a staggering demand for electricity, water, and raw materials. Yet this environmental footprint remains largely hidden from public discourse.
The figures are no longer speculative. According to MIT research, the electricity consumption of data centers globally stood at 460 terawatt-hours in 2022. That would make data centers the 11th largest electricity consumer in the world, ahead of Saudi Arabia. And with generative AI driving demand, that number could double by 2026. The result? An infrastructure boom powered disproportionately by fossil fuels, as renewable energy capacity fails to match the pace of new GPU clusters. Fusion would be wonderful…
Mistral AI, to its credit, has now released the first lifecycle analysis (LCA) of a large language model (LLM). The results are sobering. Training its “Large 2” model generated over 20,000 tons of CO2e and consumed 281,000 cubic meters of water. Every 400-token response served by their assistant “Le Chat” emits 1.14 grams of CO2e and consumes 45 mL of water. Multiply that by billions of daily prompts, and the environmental tally begins to rival entire industrial sectors.
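To make that multiplication concrete, here is a back-of-envelope sketch. The per-response figures are Mistral’s; the one-billion-prompts-per-day volume is purely an illustrative assumption, not a reported number.

```python
# Back-of-envelope scaling of Mistral's per-response footprint.
# Per-response figures (1.14 g CO2e, 45 mL water) are from Mistral's LCA;
# the daily prompt volume below is a hypothetical assumption for illustration.
CO2E_PER_RESPONSE_G = 1.14       # grams CO2e per 400-token response
WATER_PER_RESPONSE_ML = 45.0     # millilitres of water per response
PROMPTS_PER_DAY = 1_000_000_000  # assumed daily prompt volume (illustrative)

# Convert grams -> tonnes (1 t = 1e6 g) and millilitres -> cubic metres (1 m3 = 1e6 mL).
daily_co2e_tonnes = CO2E_PER_RESPONSE_G * PROMPTS_PER_DAY / 1e6
daily_water_m3 = WATER_PER_RESPONSE_ML * PROMPTS_PER_DAY / 1e6

print(f"{daily_co2e_tonnes:,.0f} t CO2e per day")   # → 1,140 t CO2e per day
print(f"{daily_water_m3:,.0f} m3 of water per day")  # → 45,000 m3 of water per day
```

At that assumed volume, a single assistant would emit on the order of a thousand tonnes of CO2e and consume tens of thousands of cubic meters of water every day.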
The work from Mistral, MIT, and others is a step in the right direction. But it’s not enough to publish whitepapers and hope for self-regulation. Environmental sustainability must become a first-class design principle in AI. Otherwise, we risk building a smarter digital world on top of a degraded physical one.