Mannat Chopra, equity analyst, Global Resource Equities, and Alex Monk, portfolio manager, Global Resource Equities at Schroders, discuss how Artificial Intelligence (AI) is driving a major increase in the demand for energy and the importance of keeping up with the growing demand for power from data centres.
AI is driving a major increase in the demand for energy. The data centres that provide the computational and storage capabilities necessary to develop, train, and deploy AI models will consume much more energy as AI applications become more widely used. Meeting this escalating demand for power presents numerous challenges. That may explain why Sam Altman, the chief executive of OpenAI, has described energy as the “hardest part” of satisfying the compute demand of AI.
Data centres were using more and more energy even before the explosion of interest in the capabilities of generative AI. From 2012 to 2023, the demand for power from data centres increased at a compound annual growth rate (CAGR) of 14%, far surpassing the 2.5% growth in total electricity demand over the same period. Much of that growth was fuelled by increased data generation and analysis and by the shift to cloud computing, as companies moved their data storage and computing power off their own premises and into data centres.
AI will now increase the power needed by data centres even more dramatically. When AI models are in the training phase, learning to make predictions and decisions based on the data they have been fed, they use six times more energy than non-AI computational uses. In the “inference phase”, when trained AI models are drawing conclusions from new data and queries, they still consume two to three times more energy than traditional workloads.
A major increase in capacity is needed to power AI
To keep up with the growing demand for power from data centres, the world’s ability to generate and transmit power must increase significantly.
The world’s largest technology companies are spending billions of dollars to add critical power capacity to increase their ability to train and develop AI models. These firms – known as “hyper-scalers” because of their ability to scale computing infrastructure to levels that can accommodate the massive demand for cloud computing services, data storage and now AI capabilities – include Google, Microsoft, Amazon, Meta (Facebook), Apple, and Alibaba.
The research firm SemiAnalysis estimates that critical IT power capacity – that is, the total power available to run servers, storage devices, and network equipment, excluding non-IT uses such as lighting and cooling – at data centres globally will increase from 49,000 megawatts in 2023 to 96,000 megawatts by 2026.
[Schroders table 1]
That increase in critical power capacity constitutes a 25% CAGR over the next three years, again far surpassing the annual growth rate of 13% seen from 2014 to 2023. AI workloads will constitute 85% of that future growth. As Figure 2 illustrates, many of the hyper-scalers will come close to doubling their capacity.
[Schroders table 2]
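For readers who want to check the arithmetic behind the 25% figure, the short Python sketch below applies the standard CAGR formula to the SemiAnalysis capacity estimates quoted above. It is illustrative only; the function and variable names are ours, not Schroders’ or SemiAnalysis’.

```python
# Minimal sketch showing how the quoted 25% CAGR follows from the
# SemiAnalysis capacity estimates (49,000 MW in 2023 to 96,000 MW by 2026).

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over a number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Critical IT power capacity at data centres (megawatts), per SemiAnalysis.
capacity_2023_mw = 49_000
capacity_2026_mw = 96_000

growth = cagr(capacity_2023_mw, capacity_2026_mw, years=3)
print(f"Implied CAGR, 2023-2026: {growth:.1%}")  # roughly 25%
```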
The increased capacity globally will come not only from efficiency gains and expansions at existing centres but also from the construction of new data centres.
A commitment to sustainable electricity consumption
Renewable energy sources such as wind and solar will play a critical role in meeting the increased demand for computing power, as countries work toward the Paris Agreement’s targets for reducing greenhouse gas emissions.
The Western hyper-scalers also have their own ambitious decarbonisation goals.
- Google is aiming to use only carbon-free energy on a 24/7 basis by 2030.
- Amazon plans to power its operations with 100% renewable energy by 2025. It is also aiming to reach net-zero carbon emissions by 2040.
- Meta (Facebook) has reduced the greenhouse gases emitted from its operations by 94% since 2017. It has done so primarily by powering its data centres and offices with 100% renewable energy.
- Microsoft is aiming to match all its electricity consumption with zero-carbon energy purchases by 2030. It is also planning to remove, by 2050, all the carbon it has ever emitted since the company was founded in 1975.
- Apple now operates all its stores, data centres, and offices worldwide with 100% renewable electricity. About 90% of this comes from renewable sources that Apple has created. It has achieved this through long-term power purchase agreements (PPAs) with some renewable power plants and equity investments in, or direct ownership of, other renewable energy facilities.
The intermittency of solar and wind energy poses a challenge
Data centres are power hungry and operate 24/7. Given that the wind and sun are intermittent sources of energy, it has become apparent that data centres today cannot be directly powered by renewables alone, even when battery technology is used to store generated power. (Batteries also present their own challenges given their costs, limited lifespans, and inefficiencies.)
The hyper-scalers have been addressing this problem by signing virtual PPAs with renewable developers, whose energy is fed into the electrical grid, which still draws much of its power from plants fired by coal or natural gas.
The hyper-scalers’ data centres are therefore powered by a mix of green electrons and grey electrons (those sourced from fossil fuels). When the cost of electricity under the PPA is higher than the cost from the grid, the hyper-scalers pay the difference; when the PPA cost is lower, they realise the savings.
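This is, in effect, a contract for difference. The sketch below shows how such a virtual PPA might settle financially in a given period; the strike price, market price, and volume are hypothetical numbers chosen purely for illustration, not figures from any actual contract.

```python
# Illustrative virtual PPA settlement. All numbers are hypothetical.

def vppa_settlement(strike_price: float, market_price: float, volume_mwh: float) -> float:
    """Cash flow to the buyer (the hyper-scaler) for one settlement period.

    The buyer effectively pays the fixed PPA strike price for the contracted
    renewable volume while sourcing its physical power from the grid at the
    market price. If the market price exceeds the strike, the buyer receives
    the difference (a saving); if the strike exceeds the market price, the
    buyer pays the difference.
    """
    return (market_price - strike_price) * volume_mwh

# Hypothetical contract: 50 USD/MWh strike price for 10,000 MWh in the period.
strike = 50.0
volume = 10_000.0

print(vppa_settlement(strike, market_price=65.0, volume_mwh=volume))  # +150,000: buyer realises a saving
print(vppa_settlement(strike, market_price=40.0, volume_mwh=volume))  # -100,000: buyer pays the difference
```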
Hydro or nuclear power could provide an alternative to the reliance on fossil fuels. But there are geographical constraints with hydropower. Nuclear plants have additional issues, ranging from the long time it takes to build them to public resistance to nuclear sites. For now, natural gas offers the most viable option for supplementing renewable sources, given that it can deliver energy on demand and is much cleaner than coal-fired plants.
Multiple bottlenecks to the build-out of additional capacity
Increasing power generation and transmission capacity in a timely way, while also managing the broader stability of electrical grids, is a challenge that could slow the build-out of data centres and the proliferation of AI-enabled solutions. Multiple additional bottlenecks have emerged.
First, the existing build-out of data centres is having a negative impact on grid networks. That has caused some data centre operators to pause new additions. In Ireland, where data centres now use 18% of the electricity generated in the country, no new centres can be connected to the power grid until 2028. The Netherlands has restricted the construction of new centres to two locations, and Singapore has put a four-year moratorium on new data centre construction.
Second, scaling supply chains to match the hyper-scalers’ lofty ambitions is proving to be challenging. There is a shortage of transformers, the large, complex pieces of equipment that adjust the voltage of electricity so that it can be transmitted over long distances and be used at levels that are safe for data centres.
Wood Mackenzie, a provider of data analytics for the renewable energy sector, estimates that it now takes two years to obtain a transformer, up from only one year at the start of 2022. Given that this challenge requires a scaling up of production rather than a technological breakthrough, it may prove to be only a near-term bottleneck.
Third, connecting renewable power generation to the electrical grid is also taking longer because of growing grid connection queues. In the US, for example, it now takes four years to assess a new renewable power plant’s impact on the grid. New plants also require new power lines to carry electricity from where it is generated to where it is used.
The timelines for adding transmission lines are lengthy, as well. In total, given the three to four years needed for siting and permitting a new project and three or four more years for construction, the entire process of bringing a renewable power plant online can take as much as six to eight years.
Unlike supply chain issues, longstanding bureaucratic delays can be resolved only with government-led action. Given the time it takes to realise change on that front, it seems likely these bottlenecks will persist and continue to put limits on capacity growth.
In response to these challenges, hyper-scalers are finding alternative solutions. One option is to acquire a captive “off-grid” source of power. Amazon recently did exactly this when it bought a data centre in Pennsylvania that gets its energy from a nearby nuclear power station.
AI may help solve the problem it’s creating
Perhaps not surprisingly, AI could help solve many of the challenges associated with delivering the increased energy it requires. With AI in the early stages of development, it is too early to predict exactly how this scenario will play out. Still, it seems highly likely that AI will help with the discovery of ways to manage and use power more efficiently and effectively.
Disclaimer: The views expressed in this article are those of the writers and are not necessarily shared by Moonstone Information Refinery or its sister companies.