AI’s Energy Demand Reshapes U.S. Power Landscape

The rise of artificial intelligence (AI) is not merely reshaping industries; it is also redefining the energy landscape. The exponential growth in computing power driving AI advancements has sparked an unprecedented demand for reliable and affordable electricity. As the U.S. strives to maintain technological leadership, meeting the energy needs of AI data centers has become a critical challenge for utilities, regulators, and policymakers.

Just a decade ago, a typical large data center required around 30 MW. Today, new hyperscale facilities regularly demand 100 to 200 MW, with leading operators designing facilities capable of consuming 500 MW to several gigawatts (GW). The scale and speed of this growth are staggering, and the impact on our energy system is difficult to predict. According to the Electric Power Research Institute (EPRI), U.S. data center electricity demand is projected to grow at 3.7% to 15% annually through 2030, a fourfold range that underscores the uncertainty of the landscape. Complicating matters further, it’s unclear whether the next generation of AI chips will ultimately dampen or accelerate overall energy demand.
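To put that spread in concrete terms, compounding the two ends of EPRI’s range gives very different 2030 outcomes. The quick sketch below assumes, purely for illustration, a 2024 baseline and a six-year horizon; neither figure is an EPRI projection of absolute demand.

```python
# Illustrative compounding of EPRI's projected annual growth range for U.S.
# data center electricity demand. The 2024 baseline and 2030 horizon are
# assumptions made for the arithmetic, not EPRI figures.

def compound(rate: float, years: int) -> float:
    """Total growth multiple after compounding an annual rate for `years` years."""
    return (1 + rate) ** years

YEARS = 6  # assumed horizon: 2024 through 2030

for label, rate in [("low (3.7%/yr)", 0.037), ("high (15%/yr)", 0.15)]:
    print(f"{label}: demand multiplies by roughly {compound(rate, YEARS):.2f}x by 2030")

# low (3.7%/yr): demand multiplies by roughly 1.24x by 2030
# high (15%/yr): demand multiplies by roughly 2.31x by 2030
```

In other words, the same official range spans everything from modest growth to more than a doubling of data center demand by decade’s end.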

While the demand curve remains uncertain, evolving energy markets are already adopting mechanisms to balance load and supply. Organized wholesale markets optimize resource dispatch across large footprints, generally keeping costs lower than isolated bilateral arrangements. These markets are eager to tap any and all potential sources of energy, regardless of where they are generated, and expanding the size of a market typically brings more flexibility in balancing load and supply.

However, markets alone aren’t enough. The grid itself must evolve to meet AI-driven load growth. That means major investment in advanced transmission conductors capable of moving more power through existing corridors, grid-enhancing technologies (GETs) such as dynamic line rating, and voltage management tools that squeeze more capacity out of existing lines. Modern switchgear and power electronics are also crucial for rapidly switching between power sources and grid connections, and AI-driven grid applications for predictive planning, asset-health assessment, faster decision support, and efficiency gains are essential. On-site or nearby generation, including hybrid renewable plants, geothermal, natural gas, and small modular nuclear reactors, is part of the solution, as is load flexibility that shifts non-critical AI tasks to off-peak hours or to regions with available capacity.
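Of these levers, load flexibility is the most software-shaped. A minimal sketch of the idea follows, assuming a hypothetical hourly price curve and a set of deferrable batch jobs; real schedulers would also need to respect job contiguity, service-level commitments, and regional transfer limits.

```python
# A minimal sketch of load flexibility: deferring non-critical AI batch jobs
# into off-peak hours. The price curve, job list, and greedy policy are all
# hypothetical placeholders, not any operator's actual scheduling logic.

hourly_price = {h: (120 if 8 <= h < 22 else 45) for h in range(24)}  # assumed $/MWh

deferrable_jobs = [  # (job name, duration in hours) -- hypothetical workloads
    ("checkpoint-training-run", 4),
    ("embedding-reindex", 2),
    ("nightly-eval-suite", 3),
]

# Greedy assignment: fill the cheapest (off-peak) hours first.
hours_by_price = sorted(range(24), key=lambda h: hourly_price[h])
schedule, cursor = {}, 0
for name, duration in deferrable_jobs:
    schedule[name] = sorted(hours_by_price[cursor:cursor + duration])
    cursor += duration

for name, hours in schedule.items():
    print(f"{name}: run during hours {hours}")
```

Even this toy version captures the core trade: work that does not need to finish immediately can move to the hours when the grid has headroom.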

Large AI data centers cannot absorb resources at the expense of local communities. Regulators and operators are working to ensure that costs are distributed fairly while enabling necessary infrastructure investment. A landmark settlement in Indiana earlier this year illustrates the emerging framework. It requires new large loads exceeding 70 MW to sign long-term contracts (up to 12 years, with a ramp-in period), pay significant minimum charges, provide upfront collateral, and adhere to transparent modification terms. The approach ensures utilities can recover investments without burdening existing customers, while providing data center operators with the reliability they need.
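The economics behind such provisions are straightforward to illustrate. The sketch below shows how a generic minimum-charge ("minimum take") clause with a ramp-in period might bill a large load; the capacity, rate, billing floor, and ramp schedule are hypothetical and are not the actual Indiana settlement terms.

```python
# A hedged sketch of a generic minimum-charge provision with a ramp-in period.
# The 300 MW contract, $15/kW-month rate, 80% floor, and ramp schedule are
# hypothetical illustrations, not the terms of the Indiana settlement.

CONTRACTED_MW = 300                      # assumed contracted capacity
DEMAND_RATE = 15.0                       # assumed $/kW-month demand charge
RAMP_IN = {1: 0.40, 2: 0.60, 3: 0.80}    # assumed contract year -> billing floor

def monthly_demand_charge(year: int, actual_mw: float) -> float:
    """Bill the greater of actual demand or the contract-year billing floor."""
    floor_fraction = RAMP_IN.get(year, 0.80)       # assumed 80% floor after ramp-in
    billed_mw = max(actual_mw, CONTRACTED_MW * floor_fraction)
    return billed_mw * 1_000 * DEMAND_RATE         # convert MW to kW

# Example: in contract year 2 the facility draws only 100 MW but is billed on 180 MW.
print(f"${monthly_demand_charge(2, 100):,.0f} per month")  # $2,700,000 per month
```

A structure like this lets the utility recover its fixed investment even if the data center ramps more slowly than planned, which is precisely the protection existing customers need.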

Historically, data centers were built near urban hubs or telecom exchanges, with power extended to them. Today, advances in fiber and satellite bandwidth mean that facilities can instead be built where robust power infrastructure already exists, often in less populated areas with easier access to land and water. This siting strategy reduces the burden on constrained urban grids while helping balance regional supply.

Even with good siting, the interconnection process remains a choke point. Bringing new loads onto the grid requires extensive studies as well as upgrades to transmission and substation assets, efforts that can take years. Reforms are underway: FERC Order 2023 has prompted transmission providers, such as SPP, MISO, and CAISO, to streamline processes through initiatives including the Consolidated Planning Process and Expedited Resource Addition Studies. AI-driven simulation tools are being piloted to accelerate study timelines and improve accuracy in load forecasting, cost estimation, and readiness assessments. Still, lead times for critical equipment remain daunting: distribution transformers now take about 30 to 50 weeks to arrive, large power transformers can take up to 150 weeks, and switchgear often takes a year or more to deliver. Building a more resilient domestic supply chain, with long-term incentives, standardized protocols, and stable demand signals, is crucial.
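Taken together, those lead times set a hard floor on project schedules. A simple critical-path view makes the point; the low end for large power transformers and the high end for switchgear are assumptions added for illustration, while the other figures come from the ranges cited above.

```python
# A critical-path view of the equipment lead times cited above: even if orders
# are placed in parallel, energization waits on the slowest item. Some range
# endpoints are assumed, as noted below.

lead_times_weeks = {
    "distribution transformers": (30, 50),    # cited: about 30 to 50 weeks
    "large power transformers": (100, 150),   # cited: up to 150 weeks (low end assumed)
    "switchgear": (52, 78),                   # cited: a year or more (high end assumed)
}

# With parallel procurement, the slowest item sets the schedule floor.
item = max(lead_times_weeks, key=lambda k: lead_times_weeks[k][1])
low, high = lead_times_weeks[item]
print(f"Schedule floor: {item}, up to {high} weeks (~{high / 52:.1f} years)")
# Schedule floor: large power transformers, up to 150 weeks (~2.9 years)
```

Nearly three years of waiting on a single transformer order explains why supply chain resilience has become a grid planning issue rather than a procurement detail.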

AI data centers bring not just new load but mission-critical applications, and many are landing in territories served by smaller utilities with limited cybersecurity staff and expertise. Expanding federal cyber programs, such as DOE’s Rural and Municipal Utility Cybersecurity grants, will be essential to closing these gaps. In parallel, utilities must scrutinize procurement of grid equipment from foreign suppliers to guard against espionage and sabotage risks. As AI becomes integral to defense, finance, and healthcare, securing both physical and digital infrastructure is a matter of national security.

