Artificial intelligence (AI) is rapidly emerging as a cornerstone of modern productivity and economic growth, fundamentally transforming how businesses operate, how investments are made, and how work is done. With the potential to significantly accelerate the pace of global economic expansion, AI technologies are poised to become even more integrated into daily life and industry. However, this transformation comes with a major challenge: energy consumption.
According to scenarios outlined in the International Monetary Fund’s (IMF) April 2025 World Economic Outlook, AI has the potential to raise the average pace of global economic growth in the coming years. The increased efficiency and innovation it brings can unlock new economic frontiers. But behind the algorithms and neural networks lies an often-overlooked reality: the vast power required to support the infrastructure of AI.
The Energy Demands of AI
The backbone of artificial intelligence lies in powerful data centres—warehouses filled with servers that process, store, and analyse massive volumes of data. These facilities are essential for training AI models, hosting cloud services, and supporting everything from personalised search engines to autonomous vehicles. But as their role grows, so does their appetite for electricity.

In 2023 alone, global data centres consumed as much as 500 terawatt-hours (TWh) of electricity, according to the most recent estimates by the Organization of the Petroleum Exporting Countries (OPEC).
This figure is staggering—it’s more than double the levels seen annually from 2015 to 2019. Even more striking, OPEC projects this number could triple to 1,500 TWh by 2030.
To put this into context, the electricity currently used by data centres is roughly equivalent to the entire power consumption of countries like Germany or France.
If trends continue, by 2030, these facilities could consume as much electricity as India, the third-largest electricity user in the world. Moreover, their energy use would surpass that of electric vehicles (EVs), consuming 1.5 times as much electricity as all EVs globally by the end of the decade.
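For readers who want to check the arithmetic, the short Python sketch below treats the rounded figures quoted above (roughly 500 TWh in 2023, roughly 1,500 TWh in 2030, and the 1.5-times-EVs comparison) as given and works out what they imply about annual growth. It is illustrative only, not part of the OPEC analysis.

```python
# Back-of-the-envelope check of the rounded figures quoted above
# (illustrative only; the TWh values are the OPEC-based estimates cited in the text).

dc_2023_twh = 500        # estimated global data-centre consumption, 2023
dc_2030_twh = 1_500      # projected global data-centre consumption, 2030
years = 2030 - 2023

# Implied compound annual growth rate for a tripling over seven years
cagr = (dc_2030_twh / dc_2023_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")                      # roughly 17%

# If data centres use 1.5 times as much electricity as all EVs by 2030,
# the implied global EV demand is about 1,000 TWh.
ev_2030_twh = dc_2030_twh / 1.5
print(f"Implied global EV demand in 2030: {ev_2030_twh:.0f} TWh")
```

In other words, the projection implies data-centre demand growing at roughly 17% a year through the decade.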
U.S. Leads the Surge
Nowhere is this energy surge more evident than in the United States, home to the world’s largest concentration of data centres. The U.S. is expected to see the fastest growth in AI-related power consumption. In a medium-demand scenario, McKinsey & Company projects that electricity use by U.S. server farms could more than triple by 2030, reaching over 600 TWh.
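Taken at face value (a rough, illustrative reading of the rounded numbers above, not McKinsey's own methodology), that projection also implies a U.S. baseline of around 200 TWh today and roughly 40% of the 1,500 TWh projected globally for 2030:

```python
# Rough implications of the McKinsey scenario quoted above
# ("more than triple by 2030, reaching over 600 TWh"); rounded and illustrative only.

us_2030_twh = 600
implied_us_today_twh = us_2030_twh / 3            # tripling implies a baseline near 200 TWh
global_2030_twh = 1_500                           # OPEC-based global projection cited earlier
us_share_2030 = us_2030_twh / global_2030_twh     # about 40% of the projected global total

print(f"Implied current U.S. data-centre demand: about {implied_us_today_twh:.0f} TWh")
print(f"U.S. share of projected 2030 global demand: {us_share_2030:.0%}")
```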
This rapid growth reflects an ongoing boom in data-infrastructure construction. Companies are racing to build new facilities capable of storing data in the cloud, delivering real-time AI-powered services, and meeting ever-growing demand for digital services. While this is a sign of economic vitality, it also signals growing stress on the nation’s energy grid.
Policy Challenges and the Risk of Energy Shortages
The unprecedented rise in electricity consumption from the AI and tech sectors is placing new pressures on energy markets and public policy. Governments and energy providers now face the challenge of scaling electricity production quickly and sustainably.
If electricity supply expands responsively to meet growing demand, price increases may remain moderate. However, if supply growth is sluggish or constrained by infrastructure, policy, or resource limitations, the result could be sharp spikes in electricity prices.
These would not only impact households and industries but also risk slowing the momentum of the AI sector itself due to higher operating costs.
The stakes are high. Without proactive planning, electricity shortages or skyrocketing prices could act as a brake on technological innovation and economic productivity, and undermine consumer affordability.
Climate Considerations: Emissions from AI
Beyond economic risks, there are serious environmental implications. The growing hunger for electricity from AI systems may substantially increase global greenhouse gas emissions. Under current energy policies, the AI-driven rise in electricity demand could result in an additional 1.7 gigatons of carbon dioxide emissions globally between 2025 and 2030.
That’s roughly equivalent to Italy’s total energy-related emissions over a five-year period.
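Spelling out that comparison with the figures quoted in the text (and treating the window as exactly five years, which is an assumption for illustration): 1.7 gigatons spread over five years comes to roughly 0.34 gigatons a year.

```python
# The Italy comparison above, made explicit (rounded, illustrative only).

extra_co2_gt = 1.7        # additional CO2 attributed to AI-driven demand, 2025-2030
window_years = 5          # the five-year comparison window used in the text

per_year_gt = extra_co2_gt / window_years
print(f"Roughly {per_year_gt:.2f} Gt of extra CO2 per year")
# About 0.34 Gt annually, i.e. on the order of Italy's yearly
# energy-related emissions, which is what makes the comparison work.
```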
This presents a critical challenge for countries that have pledged to meet climate targets. AI innovation must be coupled with clean energy development—whether through renewables, nuclear power, or other low-emissions technologies—to ensure progress is not undermined by an expanding carbon footprint.
Efficiency vs. Expansion: Uncertainty Ahead
A major unknown in projecting AI’s future energy demand is how the technology will evolve. Some AI models are becoming more energy-efficient. Open-source models like DeepSeek, for instance, are designed to lower computing costs and reduce electricity usage. These innovations can offset some of the growth in power consumption.
However, there’s a paradox at play: the cheaper and more efficient AI becomes, the more it is used. As barriers to entry fall, demand for AI-powered tools grows across sectors—from healthcare to finance to entertainment. At the same time, more advanced models, especially those capable of deep reasoning or handling complex queries, are significantly more energy-intensive.
This creates a feedback loop that is difficult to predict. Will the efficiency gains of new models outpace the growth in usage? Or will the expansion of compute-heavy AI applications overwhelm these improvements?
What Policymakers and Businesses Can Do
To manage the risks and maximise the benefits of AI’s energy transformation, collaboration between governments, energy providers, and the private sector is essential. A coordinated approach can help ensure that infrastructure investments align with emerging demand patterns and that energy sources diversify fast enough to prevent volatility.
Key strategies include:
Incentivising investment in renewable energy and nuclear power to expand the grid sustainably.
Developing energy-efficient AI models and hardware, such as advanced chips optimised for lower power usage.
Creating regulatory frameworks that anticipate surging data centre construction and grid requirements.
Encouraging demand-side efficiency measures, such as locating data centres in cooler climates or near renewable energy sources.
A Balancing Act
AI is undoubtedly a catalyst for global economic transformation. But its rise brings new complexities—especially in energy consumption and environmental impact. Balancing the promise of AI with the practicalities of electricity generation and climate responsibility will define the next chapter of technological advancement.
With smart policy, collaborative innovation, and strategic investment, the world can ensure that AI fulfils its immense potential, without short-circuiting the power grids or the planet in the process.

