
The rise of AI and its impact on the demand for electricity, particularly within data centers, has ushered in a significant challenge for utilities in the United States and the rest of the world. For many years, electricity demand in the U.S. remained relatively stagnant, with utilities seeing little to no growth in consumption. This stagnation was attributed to the mature state of the economy, which could expand without a corresponding increase in energy usage. However, the landscape is rapidly changing due to the explosive growth in data centers, especially those dedicated to AI workloads, which require significantly more power than traditional data centers.

Why Does It Matter?

The International Energy Agency (IEA) estimates that data centers accounted for 1–1.3% of global electricity demand in 2022 and expects this share to rise to 1.5–3% by 2026. The IEA also notes that data center growth is contributing to local grid connection bottlenecks and to growing water consumption for cooling. In the U.S., data centers could consume between 4.6% and 9.1% of electricity by 2030, according to an analysis by the Electric Power Research Institute (EPRI).
A traditional Google search uses about 0.3 watt-hours (Wh), while a query to ChatGPT, the chatbot developed by OpenAI, requires around 2.9 Wh. ChatGPT consumes more than half a million kilowatt-hours of electricity each day to serve roughly 200 million requests. That daily consumption is roughly equal to that of 17,000 U.S. households, each using about 29 kilowatt-hours per day. A single ChatGPT conversation also uses about half a liter of water, roughly one plastic bottle.
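
Taken together, these rough figures can be sanity-checked with a few lines of arithmetic. The sketch below simply re-derives the daily total and household equivalent from the per-query and per-household estimates quoted above; all inputs are rounded public estimates, so the outputs only agree to within rounding.

```python
# Back-of-the-envelope check of the figures cited above.
# All inputs are rounded public estimates, not measurements.

WH_PER_GOOGLE_SEARCH = 0.3      # Wh per traditional Google search
WH_PER_CHATGPT_QUERY = 2.9      # Wh per ChatGPT query
REQUESTS_PER_DAY = 200_000_000  # approximate daily ChatGPT requests
HOUSEHOLD_KWH_PER_DAY = 29      # average daily U.S. household use (kWh)

# Per-query energy ratio: a ChatGPT query vs. a traditional search.
ratio = WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH
print(f"ChatGPT query uses ~{ratio:.0f}x the energy of a search")   # ~10x

# Daily ChatGPT energy implied by the per-query figure (Wh -> kWh).
daily_kwh = WH_PER_CHATGPT_QUERY * REQUESTS_PER_DAY / 1_000
print(f"Implied daily energy: ~{daily_kwh:,.0f} kWh")               # ~580,000 kWh

# Household equivalent of "over half a million kWh per day".
households = 500_000 / HOUSEHOLD_KWH_PER_DAY
print(f"Household equivalent: ~{households:,.0f} households")       # ~17,000
```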

The AI Data Center Boom

The sudden surge in demand for electricity can be largely attributed to the advent of generative AI technologies like ChatGPT. This shift has accelerated since the release of ChatGPT in late 2022, which brought AI to the forefront of public consciousness. AI data centers are particularly power-intensive, far exceeding the requirements of their predecessors. The rapid pace at which AI technologies are evolving has left utilities struggling to keep up, as the energy industry traditionally operates on much longer timelines for infrastructure development.

Utilities’ Response to Load Growth

The traditional response from utilities to rising demand has been to build more fossil fuel-based power plants, such as gas plants, or to keep existing ones operational longer than planned. This approach is largely driven by the urgent need to meet the growing energy requirements of data centers. Utilities like Dominion Energy, which serves the largest concentration of data centers in the U.S., have had to drastically revise their load growth projections. For example, in 2021 Dominion was forecasting single-digit percentage growth over 15 years; by 2023, the forecast called for demand to roughly double over the same period, a shift that has put significant pressure on the utility's planning and infrastructure.
The implications of this sudden demand surge are profound. Utilities operate on long planning cycles, often spanning decades, due to the extensive lead times required for building new infrastructure, obtaining rights of way, and securing regulatory approvals. The rapid growth in electricity demand driven by AI data centers has caught many utilities off guard, forcing them to pivot quickly to avoid potential shortages. However, the construction of new gas plants or the extension of fossil fuel resources contradicts the carbon neutrality commitments made by major tech companies, creating a tension between meeting immediate energy needs and adhering to long-term sustainability goals.

Data Centers and GHG Reporting

Data center usage falls under different scopes depending on how a company uses the data center and who owns or operates the facility. Under greenhouse gas (GHG) accounting frameworks such as the GHG Protocol, companies increasingly have to report the emissions associated with their AI and data center workloads (a short classification sketch follows the list below).

  • Scope 1: This includes direct emissions from sources that are owned or controlled by the company. Data center usage typically doesn’t fall under Scope 1 unless the company owns the data center and the associated infrastructure, including power generation on-site (e.g., backup diesel generators).
  • Scope 2: This includes indirect emissions from the consumption of purchased electricity, steam, heating, and cooling. If a company operates its own data centers and purchases electricity to power them, the emissions from that electricity use would be classified as Scope 2 emissions.
  • Scope 3: This includes all other indirect emissions that occur in the value chain of the reporting company, including both upstream and downstream emissions. If a company uses third-party hyperscalers or cloud services (like AWS, Google Cloud, or Microsoft Azure), the emissions associated with the electricity used to power these data centers would be classified as Scope 3 emissions for the company, since the data centers are owned and operated by another entity.
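
As a rough illustration of how these rules might be applied in an emissions inventory, the sketch below assigns a company's data center electricity use to a scope based on who owns the facility and whether power is generated on-site or purchased. The function and its inputs are hypothetical conveniences for illustration, not part of any reporting standard or API.

```python
# Illustrative sketch: assigning data center energy use to GHG Protocol scopes
# based on ownership and power source. The rules mirror the list above; the
# function and parameter names are hypothetical.

def classify_data_center_emissions(owns_facility: bool,
                                   onsite_generation: bool,
                                   purchased_electricity: bool) -> list:
    """Return the scopes under which data center energy use is typically reported."""
    scopes = []
    if owns_facility and onsite_generation:
        # Fuel burned in equipment the company owns (e.g. backup diesel
        # generators) is a direct, Scope 1 emission.
        scopes.append("Scope 1")
    if owns_facility and purchased_electricity:
        # Grid electricity bought to run the company's own data center
        # is an indirect, Scope 2 emission.
        scopes.append("Scope 2")
    if not owns_facility:
        # Cloud or colocation usage (e.g. AWS, Google Cloud, Microsoft Azure)
        # sits in the value chain and is reported as Scope 3.
        scopes.append("Scope 3")
    return scopes

# Example: a company running AI workloads entirely on a third-party cloud.
print(classify_data_center_emissions(owns_facility=False,
                                     onsite_generation=False,
                                     purchased_electricity=False))  # ['Scope 3']
```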


The Politics and Economics of AI Data Centers

The politics surrounding AI data centers are complex, particularly regarding their impact on net-zero commitments. Tech companies have been at the forefront of decarbonization efforts, investing heavily in renewable energy projects. However, the recent growth in AI workloads has outpaced the expansion of green energy infrastructure. As a result, utilities are increasingly relying on fossil fuels to meet the new demand, which undermines the tech companies’ environmental commitments.

The competition for energy resources is also intensifying, raising concerns about a potential “zero-sum game” where the growing power needs of AI data centers could lead to shortages or higher costs for other consumers. This situation has already played out in various parts of the world, such as Dublin, Singapore, and Amsterdam, where governments have imposed restrictions on data center expansion to preserve energy resources for other uses. In the U.S., this scenario has yet to fully materialize, but it remains a looming threat as the demand for AI-driven computing power continues to grow.

Moreover, the competitive pressure within the tech industry to develop and deploy AI technologies rapidly has led to a situation where companies are prioritizing immediate energy needs over long-term sustainability. This urgency is driven by the fear of being outpaced by rivals, which could lead to massive financial losses or even obsolescence. As a result, some companies may resort to less sustainable energy sources, such as coal, to ensure they can continue to operate at full capacity.


Technological Solutions and Industry Adaptation

Despite the challenges, there are potential solutions that could help balance the growing energy demands of AI data centers with the need for sustainability. One approach is to optimize the use of existing infrastructure through grid-enhancing technologies, such as dynamic line ratings and advanced sensors, which can increase the capacity of power lines and improve overall grid efficiency. Additionally, energy storage solutions, such as batteries, can help manage peak demand periods, reducing the need for new fossil fuel-based power plants.
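
To make the peak-shaving idea concrete, here is a minimal sketch of how a battery might be dispatched against a daily load curve: it charges during off-peak hours and discharges whenever demand climbs above a threshold, flattening the peak that would otherwise have to be met by additional generation. The load profile, battery sizing, and threshold are invented for illustration.

```python
# Minimal peak-shaving sketch with invented numbers: a battery discharges
# when hourly load exceeds a threshold and recharges during off-peak hours.

hourly_load_mw = [40, 38, 37, 36, 38, 45, 60, 75, 85, 90, 95, 100,
                  98, 96, 97, 99, 102, 105, 100, 90, 75, 60, 50, 45]

PEAK_THRESHOLD_MW = 90    # shave anything above this level
BATTERY_ENERGY_MWH = 90   # usable storage capacity
BATTERY_POWER_MW = 20     # maximum charge/discharge rate

state_of_charge = BATTERY_ENERGY_MWH  # start the day fully charged
shaved_load = []

for load in hourly_load_mw:
    if load > PEAK_THRESHOLD_MW and state_of_charge > 0:
        # Discharge to pull the net load back toward the threshold.
        discharge = min(load - PEAK_THRESHOLD_MW, BATTERY_POWER_MW, state_of_charge)
        state_of_charge -= discharge
        shaved_load.append(load - discharge)
    elif load < 50 and state_of_charge < BATTERY_ENERGY_MWH:
        # Recharge during off-peak hours (this adds to the net load).
        charge = min(BATTERY_POWER_MW, BATTERY_ENERGY_MWH - state_of_charge)
        state_of_charge += charge
        shaved_load.append(load + charge)
    else:
        shaved_load.append(load)

print(f"Original peak: {max(hourly_load_mw)} MW, shaved peak: {max(shaved_load)} MW")
```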

Data centers themselves can also play a role in addressing these challenges by becoming more flexible in their energy usage. Many data centers are already equipped with backup generation and storage capabilities, which could be leveraged to support the grid during peak times. By optimizing software and hardware operations, data centers can reduce their overall energy consumption and contribute to grid stability.
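
One concrete form this flexibility can take is shifting deferrable work, such as batch training or re-indexing jobs, into the hours when the grid is least stressed. The sketch below is a hypothetical greedy scheduler that places such jobs into the cheapest hours of a day-ahead price signal; the job names, prices, and approach are illustrative rather than a description of how any particular operator does this.

```python
# Hypothetical sketch of demand flexibility: place deferrable batch jobs into
# the lowest-priced hours of a day-ahead signal ($/MWh). Interactive inference
# traffic, which cannot be deferred, is left out. All numbers are invented.

day_ahead_price = [32, 30, 28, 27, 29, 35, 48, 62, 70, 74, 78, 82,
                   80, 79, 81, 84, 90, 95, 88, 72, 58, 45, 38, 34]

deferrable_jobs = [            # (job name, hours of runtime needed)
    ("nightly-training-run", 4),
    ("embedding-reindex", 2),
    ("log-compaction", 1),
]

# Rank hours from cheapest to most expensive and hand them out greedily,
# giving each job its own block of hours.
hours_by_price = sorted(range(24), key=lambda h: day_ahead_price[h])
schedule = {}
cursor = 0
for name, hours_needed in deferrable_jobs:
    schedule[name] = sorted(hours_by_price[cursor:cursor + hours_needed])
    cursor += hours_needed

for name, hours in schedule.items():
    avg_price = sum(day_ahead_price[h] for h in hours) / len(hours)
    print(f"{name}: run during hours {hours} (avg ${avg_price:.0f}/MWh)")
```

The same greedy placement works if the signal is grid carbon intensity rather than price, which is how such scheduling is usually framed when the goal is emissions rather than cost.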

Another potential avenue for addressing the energy needs of AI data centers is the development of new power generation technologies, such as small modular reactors (SMRs) or even fusion energy. However, these technologies are still in the experimental stage and are unlikely to provide a solution within the next decade. In the short term, the focus will need to be on making the most of existing resources and infrastructure while continuing to invest in the development of cleaner energy sources.


The Future of AI Data Centers and Utilities

The future of AI data centers and their impact on utilities is uncertain, with many variables at play. The speed at which AI technologies are advancing has created a situation where both tech companies and utilities are struggling to keep pace. This has led to a disconnect between the rapid growth in demand for electricity and the slower-moving infrastructure development needed to support it.

The potential for overbuilding in the data center market is another concern. As tech companies rush to secure the resources they need to support AI workloads, there is a risk that they may invest heavily in infrastructure that may not be fully utilized if demand projections are overstated. This could lead to inefficiencies and financial losses, particularly if advances in semiconductor technology reduce the power requirements of AI servers in the future.

Furthermore, the reliance on third-party data centers by hyperscalers like Microsoft, Amazon, and Google has shifted the landscape of the industry. These companies are now more dependent on external providers for their data center needs, which adds another layer of complexity to the equation. The location of these data centers is increasingly determined by access to power rather than proximity to major communication hubs, further complicating the planning and development process.


Conclusion

The intersection of AI, data centers, and utilities presents a complex and rapidly evolving challenge. The explosive growth in demand for AI-driven computing power has put unprecedented pressure on the energy sector, forcing utilities to rethink their strategies and adapt to a new era of load growth. While there are potential solutions, such as optimizing existing infrastructure and investing in new technologies, the path forward is fraught with uncertainty.

Tech companies will need to balance their ambitions for AI development with their commitments to sustainability, while utilities must find ways to meet the growing demand for electricity without compromising the stability of the grid or the environment. As the industry navigates this new landscape, the decisions made today will have far-reaching implications for the future of both AI and the energy sector.
