The Spirit of Energy, Niagara-Mohawk Building in Syracuse.
The moment I stepped into a data center on the outskirts of London, a windowless vault chilled to near-arctic precision and lit by the blink of servers that devoured power like industrial furnaces, I understood the scale of the energy needed to power AI. These were not ethereal minds floating in silicon; they were machines sweating watts, humming with the appetite of small cities. Artificial intelligence, it turns out, is not transcendence. It is circuitry. And circuitry is greedy. As OpenAI CEO Sam Altman told the US Congress:
“Eventually the cost of intelligence, the cost of AI, will converge to the cost of energy... the abundance of it will be limited by the abundance of energy.”
The International Energy Agency’s 2025 report, Energy and AI, lays bare what most of Silicon Valley's podcast circuit prefers to euphemize. There is no such thing as disembodied intelligence. Even the cloud has cables. Training a frontier AI model consumes gigawatt-hours of electricity, and running one (your friendly chatbot or video generator) can draw as much power as dozens of homes. The largest AI data centers under construction are projected to devour as much electricity as two million households.
This isn’t just a curiosity. It’s a confrontation with scale, one with real implications. For governments, it’s a call to overhaul energy strategies. The IEA is unambiguous: countries that can provide affordable, reliable, and clean electricity at scale will lead the AI frontier. That means investment, not just in generation, but in the creaking and often overlooked grid infrastructure. And quickly. The IEA warns that building new transmission lines takes four to eight years in advanced economies. Wait times for key components like transformers and turbines have doubled. Without faster action, up to 20% of planned data center projects may stall.
For tech companies, the onus is on transparency and innovation. The IEA notes rapid improvements in model and hardware efficiency, but those gains are offset by the explosive scale of deployment. If these firms are serious about their ESG posturing, they must account for, and actively reduce, their energy footprints, as Google did when it used DeepMind's AI to cut the energy spent cooling its data centers by roughly 40%. There is also an ethical obligation to avoid locking the digital economy into fossil-powered infrastructure. Because make no mistake: if data centers expand without a clean energy base, they risk blowing past the carbon budgets of entire nations.
For the climate, the stakes are even sharper. AI data centers could contribute up to 500 Mt of emissions by 2035 in a high-demand scenario. That’s more than the annual emissions of Australia. And while AI could help cut emissions in other sectors, via smarter grids, optimized logistics, and predictive maintenance, the net gains remain modest, equivalent to perhaps 5% of energy-related emissions by 2035. This is not a path to climate redemption. It is, at best, a modest offset.
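A back-of-envelope check puts these figures side by side. The 500 Mt and 5% numbers come from the text; the Australia and global totals are rough outside estimates I've assumed for illustration, not figures from the report.

```python
# Back-of-envelope comparison of the emissions figures above.
# Australia and global totals are rough assumptions for scale only.

AI_DC_EMISSIONS_2035_MT = 500     # IEA high-demand scenario (from text)
AUSTRALIA_ANNUAL_MT = 440         # rough public estimate (assumption)
GLOBAL_ENERGY_CO2_MT = 37_000     # ~37 Gt energy-related CO2 (assumption)

# Savings "equivalent to perhaps 5% of energy-related emissions by 2035"
potential_savings_mt = 0.05 * GLOBAL_ENERGY_CO2_MT

print(AI_DC_EMISSIONS_2035_MT > AUSTRALIA_ANNUAL_MT)   # True: exceeds Australia
print(potential_savings_mt)                            # 1850.0 Mt avoided
print(potential_savings_mt / GLOBAL_ENERGY_CO2_MT)     # 0.05 of the global total
```

Even on generous assumptions, the avoided emissions are a thin slice of the global total, which is why "modest offset" is the honest reading.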
The IEA doesn’t just diagnose; it prescribes. Meeting AI's energy appetite sustainably requires a strategic energy mix. Half of future demand could be met by renewables, but only with substantial buildout in storage and grid flexibility. Natural gas will likely remain a key dispatchable source, though its role must be transitional. Nuclear, particularly small modular reactors (SMRs), and advanced geothermal are also on the table. What's non-negotiable is investment in transmission and smarter data center siting, placing them near existing grid strength and incentivizing operational flexibility. It’s a blunt fact: an AI data center is ten times more capital-intensive than an aluminium smelter. You can’t afford to plug it in and pray.
“Data centres accounted for around 1.5% of the world’s electricity consumption in 2024, or 415 terawatt-hours (TWh). The United States accounted for the largest share of global data centre electricity consumption in 2024 (45%), followed by China (25%) and Europe (15%). Globally, data centre electricity consumption has grown by around 12% per year since 2017, more than four times faster than the rate of total electricity consumption. AI-focused data centres can draw as much electricity as power-intensive factories such as aluminium smelters, but they are much more geographically concentrated. Nearly half of data centre capacity in the United States is in five regional clusters.”
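The quoted statistics are internally consistent, which is worth verifying: a minimal sketch, using only numbers from the IEA passage above, recovers the implied world total and back-casts the 2017 baseline from the 12% growth rate.

```python
# Cross-check of the IEA statistics quoted above; all inputs from the quote.

dc_twh_2024 = 415      # global data-centre consumption, 2024 (TWh)
dc_share = 0.015       # 1.5% of world electricity consumption
growth = 0.12          # ~12% growth per year since 2017

implied_world_twh = dc_twh_2024 / dc_share            # implied world total
implied_2017_twh = dc_twh_2024 / (1 + growth) ** 7    # back-cast to 2017
us_twh = 0.45 * dc_twh_2024                           # United States' 45% share

print(round(implied_world_twh))  # ~27667 TWh of world electricity
print(round(implied_2017_twh))   # ~188 TWh in 2017, i.e. more than doubled
print(round(us_twh))             # ~187 TWh for the United States alone
```

The striking detail: the US alone now draws roughly as much data-center electricity as the entire world did in 2017.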
The paradox the report keeps circling back to is this: AI is both a voracious energy consumer and a promising energy optimizer. The same tools that drive up electricity demand can also be deployed to manage and reduce it. Already, AI assists in grid fault detection, shaving outage durations by up to 50%. It enhances renewable integration with better forecasting. In the oil and gas sector, it refines exploration and trims methane leaks. And through virtual sensing and control, AI can unlock up to 175 GW of transmission capacity, without laying a single new line.
Yet here too the lag is telling. The tech sector is sprinting toward AI ubiquity, with $300 billion earmarked for infrastructure investment next year. Meanwhile, only 2% of energy start-up funding touches AI. That’s not just a missed opportunity. It's strategic negligence. Energy firms face data gaps, cybersecurity anxieties, and a chronic deficit of digital talent. The IEA’s warning is direct: without upskilling the energy workforce in AI literacy, the sector will fall behind, and with it, our ability to manage the transition.
There is, inevitably, a geopolitical wrinkle. The AI supply chain is fragile and asymmetrical. China controls nearly all global gallium refining, a key input in power-efficient chips. Taiwan’s TSMC dominates chip fabrication. ASML in the Netherlands makes the lithography machines that enable it all. These are not just commercial relationships; they are geopolitical dependencies, shot through with risk. As AI becomes more embedded in national infrastructure, energy security will increasingly mean chip security.
The IEA’s report lands with a cool sobriety. There’s no alarmism, no utopia. Just the facts: AI is infrastructure. It is vulnerable, power-hungry, and full of promise, but only if we plan accordingly. The next decade will not be defined by flashy demos or viral breakthroughs. It will be shaped by grid interconnectors, regulatory coherence, and who has enough skilled engineers to keep the lights on.
The question is no longer what AI can do. It is whether the rest of our systems can catch up. I think back to that data center on the edge of London, its machines sweating watts in eerie synchronicity. That’s where the real story is: in the voltage, in the cables, in the policy memos nobody reads. We built intelligence. Now we must power it, with clean, reliable energy!
And the promised future of scientific and health breakthroughs depends on whether we can.
Stay curious
Colin
It is worth reading Google DeepMind's new paper - "AlphaEvolve discovered a simple yet remarkably effective heuristic to help Borg orchestrate Google's vast data centers more efficiently. This solution, now in production for over a year, continuously recovers, on average, 0.7% of Google’s worldwide compute resources. This sustained efficiency gain means that at any given moment, more tasks can be completed on the same computational footprint."
https://deepmind.google/discover/blog/alphaevolve-a-gemini-powered-coding-agent-for-designing-advanced-algorithms/
One thing I find perplexing: data centers in general, and AI in particular, use massive amounts of energy. That's a given. The energy, delivered as electricity, is converted to heat. The heat is harmful to the machines that generate it, so we use even more energy to power massive air conditioners to refrigerate the data centers.
To this day, surprisingly little effort has gone into recycling that heat and putting it to good use. If we could capture it and use it to generate more electricity, we'd need fewer or smaller power sources, and fewer or smaller air conditioners: additional energy generated and energy demand reduced at the same time.
CPUs and GPUs typically have massive heat sinks attached to disperse the heat. What a waste of energy!
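A quick thermodynamic sketch suggests why the heat-to-electricity half of this idea is so hard. The temperatures below are assumptions for a typical air-cooled facility, not measured values: server exhaust is low-grade heat, and the Carnot limit caps the fraction that can ever be turned back into work.

```python
# Carnot limit on converting low-grade data-center heat back to electricity.
# Both temperatures are illustrative assumptions.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum fraction of heat convertible to work between two temperatures."""
    t_hot_k = t_hot_c + 273.15    # convert Celsius to Kelvin
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Assumed: 45 C server exhaust, 20 C outside ambient.
eta = carnot_efficiency(45.0, 20.0)
print(f"Theoretical ceiling: {eta:.1%}")  # roughly 8%, before real-world losses
```

That single-digit ceiling helps explain why the reuse projects that do exist tend to pipe the warmth directly into district heating rather than regenerate electricity from it: direct use sidesteps the conversion penalty entirely.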