AI Infra: Demand Siting, Capacity Planning, and Governance
- Sakshi Pawar
Relevant articles referenced:
- https://think.ing.com/articles/how-data-centres-can-be-better-integrated-into-the-energy-ecosystem/
- https://nicholasinstitute.duke.edu/publications/rethinking-load-growth
- https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
- https://e360.yale.edu/features/artificial-intelligence-climate-energy-emissions
- Podcast: https://www.eli.org/podcasts ("Data Centers, AI, and the Grid: Can Load Flexibility Unlock New Capacity?")
Artificial intelligence usage has increased rapidly since 2022 following the deployment of large language models and other foundation models across consumer-facing and enterprise applications. These systems are now integrated into search, productivity software, customer service tools, logistics, data analysis, and public-sector workflows. Unlike earlier digital applications, many AI systems operate continuously rather than intermittently.
This expansion has increased demand for data center services. AI workloads are hosted in large data centers designed to operate at high utilization and high reliability. While data centers support many digital services beyond AI, AI-related workloads are a growing contributor to additional electricity demand, particularly through inference workloads that must remain available at all times.
Global data center electricity consumption is projected to approach 1,000 terawatt-hours per year, a level comparable to the total annual electricity consumption of Japan. At this scale, data center electricity use intersects with electricity system planning, capacity adequacy, transmission investment, and resource regulation. Electricity demand of this magnitude is typically addressed through regional or national planning frameworks rather than treated as marginal commercial load.
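The scale comparison above can be checked with simple arithmetic. The 1,000 TWh/yr projection comes from the text; the conversion to an implied average continuous power draw is standard unit conversion, sketched here for orientation:

```python
# Back-of-the-envelope check of the scale figure cited above.
# 1,000 TWh/yr is the projection from the text; the conversion to
# average continuous power is straightforward unit arithmetic.

HOURS_PER_YEAR = 8760

def average_power_gw(annual_twh: float) -> float:
    """Average continuous power (GW) implied by an annual energy total (TWh)."""
    return annual_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then divide by hours

print(f"{average_power_gw(1000):.0f} GW average draw")  # ~114 GW
```

An average draw on the order of 114 GW, sustained around the clock, is why this load is addressed through regional or national planning rather than treated as marginal commercial demand.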
Energy efficiency has improved through advances in hardware, software, and data center design. Electricity use per computation has declined. However, total electricity use continues to rise as AI deployment expands across more services and users. For governance and regulation, the relevant measure is total electricity use rather than efficiency at the level of individual tasks.
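The distinction between per-task efficiency and total use can be made concrete. The numbers below are hypothetical, chosen only to illustrate how a 2x efficiency gain is outpaced by a 5x growth in usage:

```python
# Illustrative only: per-task efficiency can improve while total use rises,
# if deployment grows faster. The 2x efficiency gain and 5x usage growth
# below are hypothetical numbers, not measured values.

energy_per_task_wh = 3.0      # hypothetical baseline per-query energy
tasks_per_day = 1_000_000     # hypothetical baseline usage

baseline_total = energy_per_task_wh * tasks_per_day

# After a 2x efficiency improvement and 5x growth in usage:
new_total = (energy_per_task_wh / 2) * (tasks_per_day * 5)

print(new_total / baseline_total)  # 2.5 -> total use more than doubles
```

This is why the governance-relevant measure is total electricity use, not efficiency per task.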

Claimed benefits and potential upside
AI is frequently described as an efficiency-enabling technology. Identified applications include electricity load forecasting, building energy management, logistics optimization, industrial process control, and climate and weather modeling. In these applications, AI can improve operational efficiency and reduce resource waste.
Data center infrastructure has also become more energy-efficient. Large-scale facilities typically achieve lower power usage effectiveness than smaller data centers. Specialized processors deliver higher computational output per unit of electricity than general-purpose hardware. Despite rapid growth in internet traffic over the past decade, the share of global electricity consumption attributable to data centers has increased more slowly than earlier projections anticipated.
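Power usage effectiveness (PUE), the metric referenced above, is defined as total facility energy divided by IT equipment energy, so a value of 1.0 would mean all power reaches computation. The figures below are illustrative, not reported values from any specific operator:

```python
# PUE = total facility energy / IT equipment energy.
# The kWh figures here are hypothetical, chosen to contrast a large
# efficient facility with a smaller, less optimized one.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: lower is better; 1.0 is the floor."""
    return total_facility_kwh / it_equipment_kwh

print(pue(1_100_000, 1_000_000))  # 1.1 -- in the range reported for hyperscale sites
print(pue(1_800_000, 1_000_000))  # 1.8 -- closer to older or smaller facilities
```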
A related financial argument concerns electricity procurement. Large technology firms, including Google and Microsoft, increasingly use long-term power purchase agreements, often with renewable generators, as part of their data-center electricity sourcing strategies. These contracts provide revenue certainty for generation projects and influence investment decisions in clean energy markets. At the same time, procurement is increasingly structured through virtual PPAs, which allow firms to meet sustainability targets without necessarily changing where load is drawn from the grid.
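The virtual PPA structure mentioned above typically settles as a contract for difference: the buyer and generator exchange the difference between a fixed strike price and the wholesale market price, while physical power flows are unchanged. A minimal sketch, with hypothetical prices and volumes:

```python
# Sketch of virtual PPA settlement under a contract-for-difference
# structure. All prices and volumes below are hypothetical.

def vppa_settlement(strike_usd_mwh: float, market_usd_mwh: float,
                    volume_mwh: float) -> float:
    """Positive = buyer pays generator; negative = generator pays buyer."""
    return (strike_usd_mwh - market_usd_mwh) * volume_mwh

# Market clears below the strike: the buyer tops up the generator.
print(vppa_settlement(45.0, 30.0, 10_000))   # 150000.0
# Market clears above the strike: payment flows the other way.
print(vppa_settlement(45.0, 60.0, 10_000))   # -150000.0
```

Because settlement is purely financial, the contract can support a wind farm in one region while the firm's data center draws grid power in another, which is exactly the decoupling of sustainability accounting from physical load that the text describes.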
Electricity use, water use, and system impacts
Total electricity use increases as AI deployment expands, even as energy use per computation declines. This reflects more frequent use, broader deployment, and continuous operation of AI-enabled services. The first concern this raises is capacity. For example, supplying a hyperscale facility drawing roughly 50-100 MW from rooftop solar alone would require on the order of ten thousand residential rooftops, or many hundreds of commercial ones. Some forms of clean energy, even setting aside intermittency, are therefore poorly matched to AI infrastructure. Interconnection of renewables to the grid was already a point of concern: the grid was designed for stable, centralized power sources, not for variable and decentralized inputs. The arrival of AI hyperscale infrastructure requires a transformation of grid infrastructure to accommodate renewable energy, including upgraded transmission lines and enhanced grid management technologies.
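The rooftop-solar comparison above is a simple capacity ratio. Typical residential systems are on the order of 5-10 kW; the 50-100 MW facility range comes from the text. Capacity factors are ignored here, which makes this an underestimate of the real gap:

```python
# Rough sizing check for the rooftop-solar comparison. The 7 kW
# per-rooftop figure is an assumed typical residential system size;
# ignoring solar capacity factor understates the true shortfall.

def rooftops_needed(facility_mw: float, rooftop_kw: float) -> int:
    """Number of rooftop systems whose nameplate capacity matches the facility."""
    return round(facility_mw * 1000 / rooftop_kw)

print(rooftops_needed(50, 7))    # 7143 homes for the low end
print(rooftops_needed(100, 7))   # 14286 homes for the high end
```

Since rooftop solar produces only a fraction of nameplate capacity on average and nothing at night, the effective number of rooftops needed to serve a continuous load would be several times higher still.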
These new facilities also place demand on the grid unevenly. Large data centers draw significant electricity within specific grid regions. These facilities require high reliability and operate continuously, which affects how demand appears within regional electricity systems. In response to grid access constraints and interconnection timelines, some operators increasingly rely on onsite generation or hybrid grid–onsite configurations, particularly for interim capacity. These arrangements do not eliminate reliance on the grid but change how and when grid power is used.
Water use is a separate but related issue. Many data centers rely on freshwater for cooling. Water withdrawals are local and depend on regional water availability. In some locations, data center water use has overlapped with municipal and agricultural demand, leading to public concern and legal disputes. Changes in water use are more immediately visible at the local level than changes in electricity supply.
Across both electricity and water, disclosure is limited. Companies do not consistently report AI-specific electricity or water consumption. This limits the ability of regulators, utilities, and communities to assess impacts in advance.
Location of use and location of load
AI-enabled services are used broadly across households, offices, and institutions. The electricity required to operate large data centers is drawn from the grid in the locations where those facilities are sited. This creates a separation between where services are consumed and where electricity demand is physically concentrated.
Electricity systems are regional and regulated. When large data centers are located within a specific service territory, electricity demand appears within that region’s load and pricing structure.
For example, New Jersey, part of PJM Interconnection, the regional grid operator serving much of the eastern United States and a major data-center region, saw retail electricity prices rise by approximately 20 percent year over year. Electricity price movements reflect multiple factors, so the increase cannot be attributed to any single cause. Still, the example illustrates how price changes surface within the regional electricity markets where large energy-intensive facilities are located, even though AI services are used well beyond state boundaries.
Law, finance, and policy considerations
From a policy perspective, electricity planning frameworks are generally based on assumptions of incremental load growth and long planning horizons. Large data centers can introduce significant new electricity demand within short timeframes, driven by private investment decisions.
From a finance perspective, electricity infrastructure is typically financed through regulated rates or market mechanisms that recover costs from customers within a service territory. Power purchase agreements allow firms to manage electricity price exposure but do not determine how local grid infrastructure costs are allocated. As electricity procurement strategies evolve toward long-term and virtual contracting, the relationship between financial responsibility and physical system impact becomes less direct.
From a law and governance perspective, authority is fragmented across sectors and jurisdictions. Electricity regulation, water regulation, land-use permitting, and digital governance are handled by different institutions. In the European Union, the AI Act introduces lifecycle reporting requirements for high-risk AI systems, including energy and resource use. In the United States, proposed federal legislation would require government assessment of AI’s environmental footprint, while electricity regulation remains primarily state-based. Separately, international standards bodies are developing criteria for “sustainable AI,” including metrics for energy efficiency and water use. These legal and regulatory efforts signal growing attention but remain only partially connected to electricity system governance.
Addressing the issues
The need is for integration rather than limitation. Standardized reporting of AI-related electricity and water use would improve transparency for regulators and planners.
Clearer cost-recovery rules and planning processes could improve alignment between private infrastructure investment and regulated electricity systems. Greater attention to how onsite generation, hybrid supply models, and long-term contracts interact with grid planning would reduce uncertainty for both utilities and investors.
Early disclosure and engagement with local stakeholders are especially important in regions where water use or land impacts are significant.
Artificial intelligence now operates at a scale that directly intersects with electricity systems, water resources, and regulated infrastructure. Its electricity use approaches levels associated with national systems, while services are consumed broadly across jurisdictions. At the same time, procurement strategies, onsite generation, and evolving legal frameworks are reshaping how AI infrastructure interacts with public systems. Addressing these challenges requires clear separation of issues, accurate reporting, and coordination across law, policy, and finance without relying on assumptions about causality or intent.