AI Data Centers to Consume 4× More Power by 2035, BloombergNEF Says
Artificial intelligence is becoming the largest new driver of global electricity demand. By 2035, AI data centers could consume more than 1,600 terawatt-hours of power annually, roughly four times today's level, according to BloombergNEF forecasts.
AI energy consumption to reach 4.4% of global power use
The report projects that AI infrastructure — primarily large-scale data centers running GPU clusters — will account for 4.4% of the world’s total electricity consumption within a decade. That represents a shift comparable to the rise of industrial computing in the early 2000s.
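As a rough cross-check of the report's two headline figures (back-of-envelope arithmetic only, not BloombergNEF's methodology): a quadrupling to just over 1,600 TWh implies a current baseline near 400 TWh, and a 4.4% share implies global demand in the mid-30,000s of TWh by 2035, consistent with moderate growth from today's total on the order of 30,000 TWh.

```python
# Back-of-envelope cross-check of the two headline figures.
# Illustrative arithmetic only, not BloombergNEF's methodology.
ai_2035_twh = 1_600         # projected annual AI data-center demand by 2035
growth_multiple = 4         # "four times today's level"
ai_share_of_global = 0.044  # 4.4% of world electricity use

implied_today_twh = ai_2035_twh / growth_multiple
implied_global_twh = ai_2035_twh / ai_share_of_global

print(f"Implied AI demand today:       ~{implied_today_twh:.0f} TWh/year")
print(f"Implied global demand in 2035: ~{implied_global_twh:,.0f} TWh/year")
# -> Implied AI demand today:       ~400 TWh/year
# -> Implied global demand in 2035: ~36,364 TWh/year
```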
The United States is expected to lead the increase, followed by China and Europe, as tech companies and cloud providers rush to expand AI training capacity.
Efficiency gains can’t keep up with scale
Despite advances in hardware and cooling efficiency, the sheer growth of computational load outpaces energy savings. Each new generation of AI models demands substantially more processing power for training and inference, driving up overall consumption.
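One way to see why efficiency loses this race: a net quadrupling over roughly a decade works out to about 15% compound annual growth, and that is the growth left over after hardware and cooling gains are already netted out. A minimal sketch, assuming a ten-year horizon:

```python
# Implied compound annual growth rate behind a net 4x rise in demand.
growth_multiple = 4
years = 10  # assumed horizon: roughly a decade to 2035

cagr = growth_multiple ** (1 / years) - 1
print(f"Implied net annual growth: ~{cagr:.1%}")
# -> Implied net annual growth: ~14.9%
```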
Analysts note that facilities packed with modern GPUs and AI accelerators can each draw hundreds of megawatts, roughly the power demand of a small city.
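To put "hundreds of megawatts" in concrete terms, here is an illustrative conversion for a hypothetical 300 MW campus running around the clock; the facility size and household figure are assumptions for the example, not numbers from the report.

```python
# Scale comparison for a single large AI campus.
# All inputs are assumptions for the example, not report figures.
facility_mw = 300                # hypothetical continuous facility draw
hours_per_year = 8_760
household_kwh_per_year = 10_500  # rough average annual use of a US home

annual_twh = facility_mw * hours_per_year / 1e6  # MWh -> TWh
homes = facility_mw * 1_000 * hours_per_year / household_kwh_per_year

print(f"Annual consumption: ~{annual_twh:.1f} TWh")
print(f"Equivalent homes:   ~{homes:,.0f}")
# -> Annual consumption: ~2.6 TWh
# -> Equivalent homes:   ~250,286
```

A quarter-million homes is roughly the household count of a small city, which is what makes the comparison apt.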
Gas and renewables in the energy mix
To handle the surge, developers are selecting regions with strong grid infrastructure and favorable energy markets. In the short term, many new AI centers will depend on natural gas for baseload power, while gradually integrating renewable sources such as solar, wind, and hydro.
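The short-term reliance on gas comes down to capacity factors: intermittent sources deliver only a fraction of their nameplate rating over a year, so matching a data center's round-the-clock draw requires heavy overbuild plus storage. A rough illustration, with all numbers assumed for the example:

```python
# Capacity-factor illustration of the baseload problem.
# All inputs are assumptions for the example.
facility_mw = 300  # hypothetical round-the-clock load
solar_cf = 0.25    # typical utility-scale solar, site-dependent
gas_cf = 0.90      # combined-cycle gas run near-baseload

solar_nameplate = facility_mw / solar_cf
gas_nameplate = facility_mw / gas_cf

print(f"Solar nameplate needed: ~{solar_nameplate:.0f} MW (plus storage for nights)")
print(f"Gas nameplate needed:   ~{gas_nameplate:.0f} MW")
# -> Solar nameplate needed: ~1200 MW (plus storage for nights)
# -> Gas nameplate needed:   ~333 MW
```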
Energy as the bottleneck of the AI era
The study underscores that energy availability is emerging as the core constraint of AI expansion. Without massive investments in clean generation and transmission, future AI development could face capacity bottlenecks — both economic and environmental.
While AI promises efficiency and automation across industries, its own footprint is quickly becoming one of the most pressing sustainability challenges of the digital age.
Editorial Team — CoinBotLab