The AI-energy paradox: Will AI spark a green energy revolution or deepen the global energy crisis? — Part 2

As AI’s energy consumption surges, concerns over its environmental impact grow. However, AI also offers solutions — optimising data centre cooling, managing smart grids, and reducing industrial energy waste. This article explores how AI-driven efficiency can help counterbalance its own power demands, creating a path toward more sustainable energy use.

AI-driven efficiency: Mitigating the carbon toll

While AI’s energy consumption is undeniably large, AI technologies also offer powerful tools to cut energy waste and emissions across many industries. From cooling data centres to optimising factory lines and smart grids, AI-driven efficiency gains can act as a counterweight to AI’s own power use. In essence, there is an opportunity for a positive feedback loop: using AI to save energy even as we use energy to run AI.

Some notable examples of AI-enabled efficiency breakthroughs:

  • Data centre cooling optimisation: Google’s DeepMind cut data centre cooling energy by 40 per cent by predicting server loads and adjusting cooling in real time.
  • Next-gen cooling technologies: Advanced cooling solutions, such as direct-to-chip liquid cooling, have been shown to reduce server energy use by about 30 per cent, with liquid cooling now used in up to 45 per cent of new European facilities.
  • AI-managed micro-grids: In regions like Ohio and Texas, experimental micro-grids leverage AI to balance renewable energy with data centre power draw, cutting renewable curtailment by about 22 per cent.
  • Industrial energy management: AI applications have helped Toyota reduce energy consumption by 29 per cent on certain manufacturing processes.
  • Building energy management: In commercial buildings, AI has shown impressive results in cutting power usage without sacrificing comfort. A notable case is 45 Broadway in Manhattan, where an AI HVAC optimisation system led to a 15.8 per cent reduction in HVAC energy use; the algorithms learned the building’s patterns and adjusted heating and cooling more intelligently. Similarly, AI-based controls for lighting and appliances can yield up to 30 per cent energy savings in buildings. Multiply these gains across millions of buildings and homes, and the potential energy savings are enormous.

These examples illustrate a hopeful counterpoint to AI’s energy appetite: the energy savings AI enables in other areas could, in theory, offset a significant portion of the energy AI consumes. Smarter grids, smarter buildings, smarter transportation (AI-optimised logistics, etc.) all contribute to lower overall demand.

A Shell analysis suggests AI applications could halve the carbon intensity of global energy by 2050 through such measures — coordinating renewables, improving efficiency, and innovating in materials (for example, using AI-driven design to create wind turbine blades that generate 40 per cent more power).

However, a critical question remains: Can AI’s energy-saving contributions catch up with its own growing consumption? This is the crux of the AI-energy paradox.

The AI-energy paradox: Do savings and consumption converge?

Right now, the net impact of AI on global energy is still an increase in demand. AI’s usage is growing so rapidly that efficiency gains, as valuable as they are, haven’t yet kept pace.

For instance, even as Google’s AI cut cooling energy by 40 per cent, the expansion of Google’s AI computing meant total energy use still rose. The near-term trend is divergence — AI driving more power use overall, despite localised savings.

Current figures bear this out. The US Department of Energy found that data centres (thanks largely to AI growth) consumed about 4.4 per cent of US electricity in 2023, and are on track to reach between 6.7 per cent and 12 per cent by 2028.

In other words, efficiency improvements are not projected to stop a doubling (or more) of data centres’ energy draw in the next five years.

A recent Electric Power Research Institute analysis likewise forecasts US data centres could hit nine per cent of national electricity use by 2030, up from about four per cent today. Clearly, in the short run, AI’s footprint is outpacing the savings it enables elsewhere.
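A back-of-envelope sketch shows what those percentage shares mean in absolute terms. The figure of roughly 4,100 TWh for total annual US electricity generation is an assumption for illustration, not taken from the article:

```python
# Rough sketch of the data-centre share projections cited above.
# The ~4,100 TWh total for annual US electricity generation is an
# illustrative assumption, not a figure from the article.

US_TOTAL_TWH = 4_100  # assumed total US generation per year

def share_to_twh(share_pct: float, total_twh: float = US_TOTAL_TWH) -> float:
    """Convert a percentage share of total generation into absolute TWh."""
    return total_twh * share_pct / 100

today = share_to_twh(4.4)        # 2023 share per the US DOE
low_2028 = share_to_twh(6.7)     # low end of the 2028 range
high_2028 = share_to_twh(12.0)   # high end of the 2028 range

print(f"2023: {today:.0f} TWh")
print(f"2028: {low_2028:.0f}-{high_2028:.0f} TWh "
      f"({low_2028 / today:.1f}x-{high_2028 / today:.1f}x today's draw)")
```

Under that assumed total, the projected shares work out to roughly a 1.5x to 2.7x increase in absolute data-centre demand within five years.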

Over the longer term, there is a possibility (not a guarantee) that the curves could converge. As AI matures, there’s intense research focus on efficiency: more efficient algorithms, specialised AI chips that deliver more performance per watt, better cooling, and so on. If each new generation of AI hardware is significantly more efficient, the growth in AI’s energy use could level off.

For example, tech firms are now prioritising energy efficiency over pure performance gains — a shift from the early “move fast” approach. Future AI models might be designed to be smaller or use smart techniques (like model sparsity or on-demand activation) that save energy.

Policymakers are also starting to push for convergence. The EU’s proposed AI Act will require large AI models to demonstrate 15 per cent energy efficiency improvements over previous generations — effectively slowing deployment of ultra-large models until they are more efficient (one reason rumours suggest GPT-5 might be delayed until such standards can be met). Governments may introduce carbon taxes or energy caps that make it economically unattractive to run wasteful AI systems, forcing innovation towards frugality.

So, will consumption and savings converge? Optimistically, yes — but likely not until late this decade or beyond.

In a scenario where AI’s growth moderates and efficiency tech accelerates, we could see AI’s net impact plateau or even turn net-negative on emissions (especially if AI helps integrate huge amounts of renewables, as Shell’s scenario imagines).

But for the next 5-10 years, business leaders should plan for a world where AI means higher energy consumption and carbon output, and manage that reality accordingly.

The implication for corporates is twofold:

  • Invest aggressively in AI-driven efficiency projects within your own operations (to capture savings that can offset your AI usage).
  • Anticipate energy costs and capacity needs rising with AI, and incorporate that into everything from site selection (do your data centre/cloud regions have spare power capacity?) to vendor selection (choose partners with greener energy and efficient infrastructure).

In short, don’t assume the problem will solve itself. Proactive action is needed to bend the curve.

Accelerating the renewable transition to power AI

If AI is to spark a green energy revolution instead of exacerbating the crisis, a massive scale-up of clean energy is required. Renewables (solar, wind, hydro) need to grow in tandem with AI compute demand, and AI can be a catalyst to accelerate that growth. But it won’t happen automatically; it requires strategic investments and innovation.

On the plus side, AI is already helping get more out of renewables. We saw how AI can optimise wind and solar output (e.g. smarter inverters yielding 18 per cent more solar farm efficiency). AI can forecast weather and adjust operations to maximise renewable energy capture and reduce downtime.

For instance, autonomous AI-driven networks of electric vehicle (EV) chargers can collectively act as a 450 GWh battery for the grid, smoothing out renewable fluctuations by intelligently timing charging. AI is also being applied to breakthrough research — like using quantum computing and AI to design advanced materials for solar panels or wind turbines, potentially boosting their efficiency dramatically.

However, even optimistic efficiency gains won’t fully bridge the gap. The scale of new clean power needed is enormous.

A McKinsey study estimates that in Europe alone, an additional US$250-300 billion in grid infrastructure upgrades will be required by 2030 to handle 150 TWh of new AI-related electricity demand and connect enough renewables to supply it.

This includes new transmission lines, grid storage, and smarter distribution — essentially building a bigger, smarter grid to feed AI. Without such investment, renewable deployment could lag and AI would end up being powered by whatever is available (often coal or gas).

To put numbers on it: The world added about 300 GW of renewable capacity in 2022. If AI demand is rising by hundreds of TWh, we likely need to add hundreds more GW of renewables per year on top of current plans just to keep AI from increasing fossil fuel use.
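The TWh-to-GW arithmetic behind that claim can be sketched simply. Nameplate capacity must be divided by a capacity factor to get actual annual output; the 25 per cent blended factor for a solar/wind mix used below is an assumption for illustration:

```python
# Back-of-envelope conversion from annual energy demand (TWh) to the
# renewable nameplate capacity (GW) needed to supply it. The 25 per cent
# blended capacity factor for a solar/wind mix is an assumption.

HOURS_PER_YEAR = 8_760

def gw_needed(annual_twh: float, capacity_factor: float = 0.25) -> float:
    """GW of nameplate renewable capacity to generate `annual_twh` per year."""
    return annual_twh * 1_000 / (HOURS_PER_YEAR * capacity_factor)  # TWh -> GWh

# Europe's projected 150 TWh of new AI-related demand by 2030:
print(f"{gw_needed(150):.0f} GW")  # roughly 68 GW of new renewables

# If global AI demand were to grow by, say, 500 TWh per year:
print(f"{gw_needed(500):.0f} GW")
```

Even Europe’s 150 TWh figure alone implies tens of gigawatts of new renewable capacity, before counting growth elsewhere — consistent with the “hundreds more GW” scale described above.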

Policymakers are starting to respond — the US Inflation Reduction Act, Europe’s Green Deal, China’s massive renewables build-out — all boost clean energy, which indirectly supports AI’s growth sustainably. But targeted actions may be needed, such as incentives for energy-intensive tech firms to directly finance renewable projects (as Microsoft is doing).

One promising idea is direct clean power procurement for AI infrastructure. Instead of buying offsets or generic renewable credits, companies can invest in additional renewable generation that is tied to their data centres. Google has been a leader here, aiming for “24/7 carbon-free” energy by sourcing clean power in every hour and region that its servers operate. Other firms are now looking at similar models, which could drive significant new solar/wind development.
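The difference between “24/7 carbon-free” matching and conventional annual renewable credits comes down to hour-by-hour accounting: clean supply can only cover load in the hour it is generated. A minimal sketch of that score, using made-up illustrative numbers rather than Google’s data:

```python
# Minimal sketch of an hourly "24/7 carbon-free energy" (CFE) match score.
# Unlike annual netting, clean supply only counts against load generated
# in the same hour. The load/generation numbers are illustrative.

def cfe_score(load_mwh, clean_mwh):
    """Fraction of load met by clean power on an hour-by-hour basis."""
    assert len(load_mwh) == len(clean_mwh)
    matched = sum(min(l, c) for l, c in zip(load_mwh, clean_mwh))
    return matched / sum(load_mwh)

# Four illustrative hours: solar-heavy midday, nothing overnight.
load  = [100, 100, 100, 100]
clean = [  0, 180, 160,   0]

print(f"{cfe_score(load, clean):.0%}")  # prints 50%
```

Note the gap this exposes: annual netting would credit 340 MWh of clean generation against 400 MWh of load (85 per cent), while hourly matching scores only 50 per cent, because midday surplus cannot cover the overnight hours.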

In summary, AI can accelerate the renewable transition — by necessity and by capability. It provides a strong business motive (big tech needs clean power, so they’ll fund it) and new tools (AI to optimise renewable performance). But it also raises the stakes: if renewables don’t scale fast enough, AI will end up entrenching fossil fuel use at exactly the wrong time for the climate.

For corporate leaders, this means aligning AI strategy with energy strategy. Embrace AI projects that further sustainability (smart grid, energy optimisation) and be cautious of AI expansions that outpace your access to green power. Seek partnerships in the energy sector — for example, co-develop a solar farm or wind park that can power your AI workloads. Those who proactively secure clean energy for AI will not only mitigate environmental impact but also hedge against future carbon regulations or fossil price volatility.

Geopolitical and economic crossroads

AI’s energy demands are now a factor on the geopolitical chessboard. Nations are racing to support their tech industries with reliable power (often in competition with climate goals), and energy dependencies are influencing tech policies. Three major theatres highlight this dynamic: the US-China tech competition, Europe’s regulatory balancing act, and emerging markets vying for data centre investments.

The US-China tech war’s energy dimension

China and the United States are both pouring billions into AI, and with that comes a hunger for energy. China has launched an “East Data, West Computing” initiative, investing an estimated US$75 billion to build huge data centre hubs in its inland provinces. Why inland? Because electricity is cheaper there — for example, coal-rich Inner Mongolia offers industrial power rates around US$0.03 per kWh, among the lowest in the world.

By situating AI data centres next to coal plants in the interior, China can fuel its AI growth at low cost (albeit with high emissions). This strategy effectively leverages China’s vast coal infrastructure to gain an edge in computing capacity.

Meanwhile, the US is responding with investments to support AI hotbeds at home. The Department of Energy recently announced US$2 billion for grid upgrades focused on “AI corridors” like Northern Virginia and Ohio. This includes improving transmission and reliability to ensure these regions (where many US cloud data centres cluster) can handle the increased load without blackouts or slowdowns. It’s essentially an infrastructure subsidy to keep US AI development on track and independent of energy bottlenecks.

There’s also a security aspect: both nations view leadership in AI as strategic, so ensuring the energy security of AI facilities is crucial. This could lead to more efforts like backup gas peaker plants for key data centres, or even dedicated small nuclear reactors, to immunise critical AI infrastructure from grid disruptions or fuel supply risks. In a hypothetical future standoff, a country that cannot power its AI systems reliably would be at a serious disadvantage.

Europe’s cautious approach

Europe, in contrast, is trying to chart a path that prioritises sustainability — but at the risk of dampening its AI momentum. The EU’s proposed regulations (like the AI Act) not only address ethics but also efficiency. As noted, the AI Act could effectively delay deployment of power-hungry models (e.g., next-gen GPT) until efficiency targets are met.

Additionally, some European countries have taken hard stances on data centre growth due to energy concerns. Ireland’s moratorium on new Dublin-area data centres, for instance, was driven by fears that the national grid couldn’t meet both climate targets and a surge in data centre demand. That moratorium led companies to shift investments to places like Poland and Norway where power is more available.

The consequence is that Europe risks falling behind in AI infrastructure. While the US and China race ahead with massive builds (regardless of carbon cost), Europe’s combination of slower cloud growth and higher energy prices could make it less attractive for AI development.

Some experts warn of a potential “digital drift” where European AI innovation migrates to more energy-abundant shores. On the other hand, Europe’s emphasis on efficiency and green power could pay off in the long run, yielding more sustainable operations that align with global climate imperatives (and avoid future regulatory penalties).

Global energy markets and AI investment

It’s not just the big three (US, China, EU). Around the world, countries are jockeying to attract data centre and AI investments — and energy is the key bargaining chip. For example, countries like Norway, Sweden, and Canada promote their abundant renewable energy (hydropower, wind) and cold climates (natural cooling) as ideal for sustainable AI data centres. Norway has lured several major projects by offering 100 per cent renewable power and low cooling costs, appealing to companies with net-zero commitments.

In Asia, Singapore imposed a temporary freeze on new data centres due to energy and land constraints, then lifted it in favour of a selective policy that favours the most efficient, green designs. India and Indonesia are pitching themselves as emerging data centre hubs, but they’ll need to rapidly expand grid capacity (and ideally renewables) to deliver on those ambitions.

The energy crisis of 2022 (with spiking fuel prices) was a wake-up call for many: any country that wants to be an AI/cloud hub must ensure cheap, reliable power. This has geopolitical implications: nations rich in clean energy (like Iceland or Quebec with hydro, or Middle Eastern countries with solar + land for data centres) could play a bigger role in the digital economy by hosting energy-intensive AI computation. It’s a new twist on the resource competition of the past — instead of oil or minerals, it’s about attracting “computational industry” with the promise of low-cost electrons.

In summary, leaders need to be aware that AI isn’t happening in a vacuum — it’s intertwined with global energy and policy currents. Decisions about where to site AI operations, which markets to enter, or even which governments to partner with may hinge on energy availability and regulations.

Businesses at the cutting edge of AI should engage in policy discussions: for example, advocating for incentives for clean power or workable regulations that encourage efficiency without stifling innovation.

This is part two of a three-part series exploring AI’s energy impact. Read part one here

Part three of this series looks at the emerging solutions — tech and policy — that could put AI on a more sustainable path, and how companies can harness them.

This article was originally published here and co-authored by Xavier Greco, Founder and CEO of ENSSO.

Editor’s note: e27 aims to foster thought leadership by publishing views from the community. Share your opinion by submitting an article, video, podcast, or infographic.

Image courtesy: DALL-E