AI Efficiency Surge Echoes Lighting Revolution, Set to Boost Power Demand

The adoption of efficient lighting in the late 20th century offers a compelling blueprint for understanding the potential trajectory of AI. When energy-efficient lightbulbs first hit the market, conventional wisdom expected a dramatic reduction in electricity consumption. The reality was more complex: the increased efficiency and lower cost of these bulbs spurred widespread adoption, unlocking new applications and ultimately increasing power consumption and infrastructure needs. Today, we stand on the precipice of a similar revolution with AI.

Startups like China’s DeepSeek are achieving breakthroughs in AI efficiency, leading some to anticipate a decline in infrastructure needs. But history suggests a different outcome. As AI becomes more affordable and efficient, adoption is likely to surge, generating novel use cases and driving unprecedented demand for computing power and infrastructure. This rapidly unfolding demand for AI is poised to drive a fundamental shift in power infrastructure planning, compelling utilities to accelerate grid modernization and power generation capacity.

This transformation has only begun to take shape, creating opportunities across the ecosystem. Data center developers, power producers, and suppliers of electrical and cooling components are all poised to benefit. For investors, ETFs such as the Global X Artificial Intelligence and Technology ETF (AIQ), the Global X Data Center and Digital Infrastructure ETF (DTCR), and the Global X U.S. Electrification ETF (ZAP) present potential avenues to capture this transformation.

The AI boom is putting U.S. data centers on track to potentially consume 12% of U.S. electricity by 2028. After two decades of flat electricity growth, utilities could be tasked with meeting up to 47% higher demand by 2040, underscoring the urgency of grid upgrades and new generation capacity. Decarbonizing the grid while modernizing power infrastructure will require substantial investment, spurring innovation in areas such as nuclear power and energy storage systems.

The global economy, led by the U.S. technology industry, is rapidly transitioning from the Information Age to the Automation Age. In this new era, machines, software, and systems no longer just process data but act on it autonomously. The Automation Age promises a boom in efficiency and productivity, powered by smart, agile, and accessible AI. However, this progress comes with a significant energy cost. The technology industry is rushing to train, test, and deploy AI, and to manufacture the semiconductors necessary for AI.

Training foundational AI models exemplifies the power cost of AI. OpenAI used approximately 50 gigawatt hours (GWh) of electricity to train GPT-4—enough to power 6,000 U.S. homes for an entire year and fifty times more electricity than it took to train GPT-3. Since GPT-4’s public release in March 2023, infrastructure demands have intensified as companies deploy increasingly larger AI-GPU clusters to train next-generation models. Meta Platforms plans to invest at least $60 billion in capital expenditure in 2025, expecting to operate a total of 1.3 million GPUs by the end of the year. xAI plans to spend $35–40 billion to grow its Colossus supercomputer to operate on 1 million GPUs. Microsoft plans to spend $80 billion on AI infrastructure in fiscal 2025.
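A quick back-of-envelope calculation shows these training figures are internally consistent. The inputs below are the article's own numbers plus one outside assumption: average U.S. household electricity use is roughly 10,500 kWh per year (per EIA data, not stated in the article).

```python
# Sanity check of the training-energy comparisons cited above.
GPT4_TRAINING_GWH = 50      # article's estimate for GPT-4 training
HOMES_POWERED = 6_000       # article's household comparison
GPT3_MULTIPLE = 50          # GPT-4 reportedly used ~50x GPT-3's energy

kwh_total = GPT4_TRAINING_GWH * 1_000_000          # 1 GWh = 1,000,000 kWh
kwh_per_home_implied = kwh_total / HOMES_POWERED   # annual use per home
print(f"Implied annual use per home: {kwh_per_home_implied:,.0f} kWh")  # ~8,333 kWh

gpt3_training_gwh = GPT4_TRAINING_GWH / GPT3_MULTIPLE
print(f"Implied GPT-3 training energy: {gpt3_training_gwh:.1f} GWh")    # ~1 GWh
```

The implied ~8,300 kWh per home is slightly below the EIA's reported U.S. average, so the "6,000 homes for a year" comparison is a conservative, order-of-magnitude claim rather than a precise one.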

Individual GPUs are becoming more power-hungry as well. Nvidia’s Blackwell (GB200) chip, although significantly more power-efficient per computation, is designed for nearly seven times the power draw of the A100 chips used to train GPT-3. By 2030, U.S. data centers could house millions of such advanced GPUs, each also requiring significant energy for cooling.
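The "nearly seven times" figure checks out against commonly cited specifications. The wattages below are public estimates, not figures from the article: roughly 400 W thermal design power for an A100 SXM GPU and roughly 2,700 W for a GB200 superchip (two Blackwell GPUs plus a Grace CPU).

```python
# Rough check of the GB200-vs-A100 power-draw ratio.
A100_WATTS = 400      # A100 SXM TDP (assumed public estimate)
GB200_WATTS = 2_700   # GB200 superchip TDP (assumed public estimate)

ratio = GB200_WATTS / A100_WATTS
print(f"GB200 vs. A100 power draw: ~{ratio:.2f}x")  # ~6.75x
```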

Using AI applications is also energy-intensive. A single ChatGPT query can consume 10 times more energy than a Google Search, enough electricity to power a light bulb for 20 minutes. More complex tasks like generating videos or high-quality AI images can require hundreds of times more energy, and one minute of interaction with an AI voice assistant could consume up to 20 times the energy of a traditional phone call. ChatGPT’s 180 million monthly users showcase today’s AI power demands, but the real surge is expected to come from agentic AI models interacting with each other. By 2030, the agentic AI market is projected to reach $47 billion, with billions of AI agents working autonomously on human-directed tasks.
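The per-query comparisons above can be sanity-checked with widely published estimates. The inputs below are assumptions, not figures from the article: roughly 2.9 Wh per ChatGPT query, roughly 0.3 Wh per Google search, and a typical ~9 W LED bulb.

```python
# Rough consistency check of the per-query energy comparisons.
CHATGPT_QUERY_WH = 2.9   # common published estimate (assumption)
GOOGLE_SEARCH_WH = 0.3   # common published estimate (assumption)
LED_BULB_WATTS = 9       # typical LED bulb wattage (assumption)

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"ChatGPT query vs. Google search: ~{ratio:.0f}x")           # ~10x

bulb_minutes = CHATGPT_QUERY_WH / LED_BULB_WATTS * 60
print(f"Equivalent LED bulb runtime: ~{bulb_minutes:.0f} minutes")  # ~19 min
```

Both results land close to the article's round figures of "10 times" and "20 minutes," consistent with these being order-of-magnitude comparisons.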
