
Powering the US Data Center Boom: Why Forecasting Can Be So Tricky
How much electricity will the U.S. need to support new data centers popping up around the country? Spurred by the rise of advanced artificial intelligence technologies (including generative AI models like ChatGPT), an ever-increasing demand for cloud computing services and a surge of investment worth hundreds of billions of dollars, the data center industry is only expected to grow. But answering this question is not so easy.
We already know that data centers need a lot of electricity to operate. Data centers can be single rooms or massive facilities spanning hundreds of acres to house the physical computing and network equipment needed to provide data processing and storage capacity. The centers also include resource-intensive cooling systems, backup power generation, fire suppression and security. The more power capacity the data center has, the more data the center can process simultaneously.
Many articles and reports published over the past few years estimate that the amount of electricity needed could overwhelm our current power system. For example, one review by Rystad Energy found that the U.S. has over 100 gigawatts (GW) of data center demand coming online between 2024 and 2035. For comparison, that’s about 10 times New York City’s summer peak demand in 2023, when air conditioners were operating at full blast. Other reports, however, paint a hazier picture. An Electric Power Research Institute paper from 2024, for instance, found that data centers could consume anywhere between 4.6% and 9.1% of all U.S. electricity by 2030. The difference between those figures, around 200 terawatt-hours (TWh), is equivalent to the annual energy consumption of almost 11 million homes.
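Part of why these headline figures are hard to compare is that some are quoted in gigawatts of power capacity while others are quoted in terawatt-hours of annual energy. A minimal sketch of the conversion, assuming a hypothetical utilization (load) factor, since data centers rarely draw their full rated capacity around the clock:

```python
HOURS_PER_YEAR = 8760

def capacity_gw_to_annual_twh(capacity_gw: float, load_factor: float) -> float:
    """Convert rated capacity (GW) into annual energy use (TWh/year).

    load_factor is the assumed average draw as a fraction of rated
    capacity; real-world values vary widely, so this is an assumption.
    """
    return capacity_gw * HOURS_PER_YEAR * load_factor / 1000  # GWh -> TWh

# 100 GW of new capacity at an assumed 80% utilization:
print(round(capacity_gw_to_annual_twh(100, 0.8), 1))  # 700.8 TWh/year
```

The same 100 GW figure implies roughly 438 TWh/year at 50% utilization but nearly 876 TWh/year at full load, which is one reason capacity-based and energy-based estimates can look so different.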

The consequences of this uncertainty could be massive. If not managed properly, this unfettered growth could lead to higher energy bills for consumers, increased greenhouse gas emissions and a less reliable energy system. Utilities across the country are already seeking rate increases in response to new data center demand coupled with more frequent extreme weather events. While some efforts are being made to isolate data center-related costs, homes and businesses could end up stuck with extra costs from overbuilt, unnecessary or underutilized infrastructure. Furthermore, plans to expand natural gas generation and delay coal plant retirements to support data center demand could lock in greenhouse gas emissions for decades, even if demand fails to materialize.
Here we dive deeper into what’s causing these extreme variations in projections and how local policymakers, regulators and utilities can prepare for uncertainties in electricity demand.
Data Centers: Past and Future Growth
The data center industry is no stranger to rapid growth in demand. During the 2000s, the rise of Internet services caused explosive growth in development, with data center electricity use rising 90% between 2000 and 2005 and 24% between 2005 and 2009. Between 2010 and 2018, however, global data center electricity use was basically flat, as more efficient technologies were introduced even as computing demand soared across the world. In hindsight, electricity demand had been overestimated, and that era's predictions of power shortages echo current concerns.
Today, there is nearly universal agreement that the era of flat data center energy use is over. But there is significantly less consensus on how much data center electricity demand will increase over the next decade. Modeled energy use projections through 2030 range from 200 TWh/year to over 1,050 TWh/year. That highest figure, published by Boston Consulting Group, would represent about a quarter of all U.S. electricity generation in 2023 (which was 4,178 TWh). Conversely, one study found no evidence of national-level electricity demand growth from data centers, though specific regional and utility-level demand is still expected to increase.
Many estimates, however, put data center energy use between 300 TWh/year and 400 TWh/year by 2030 (a significant figure, equivalent to 53% to 71% of all of Texas’ net electricity generation for 2024). Modeling sensitivities can also cause estimates to vary within a single study. The Lawrence Berkeley National Laboratory’s study on data center demand, for example, estimates that data centers will consume anywhere between 325 TWh and 580 TWh of electricity per year by 2030, representing 6.7% to 12% of all U.S. electricity consumption.
The Difficulty Calculating Future Data Center Energy Usage
Why do these forecasts differ so wildly? There are several reasons to be discerning when considering estimates of future data center demand. They include:
Market Forces and Impacts
Some experts believe that because of the rapidly growing market, utilities are being flooded with “speculative” interconnection requests from data center developers. Developers may submit requests for early-phase projects that are unlikely to be completed, put in multiple requests for the same facility with one utility, or file requests for the same project in different utility territories. This is fueled by the low cost of requesting a grid connection and the desire to gauge how quickly a project could connect to the grid and begin operating. According to experts, this practice is leading to both double counting of projects and “phantom” load that will never be built, distorting load forecasts and utility resource planning processes.
Supply side issues may also undercut projections. Power availability has already emerged as a limiting factor for many data center developers, with one analysis finding that power constraints were extending data center construction timelines by 24 to 72 months. Shortages of infrastructure components like transformers, switchgear and gas turbines are only compounding this issue. On the computing end, a study from London Economics found that the U.S. would need to purchase 90% of global semiconductor manufacturing output over the next five years to support all of the data center load announced to be online by 2030.
Larger market factors make the medium- to long-term future of data center investment uncertain. The market potential for AI applications driving much of the current data center boom is still unknown. While many businesses have reported increased use and integration of generative AI technologies, other analysts and business leaders have seen signs of a bubble emerging in AI applications and associated data center construction. Some investments have already seen setbacks — OpenAI and SoftBank’s Project Stargate, announced in January and set to spend $500 billion in new AI-focused data centers by 2029, has seen a slower than expected start and has only announced plans for one small data center to open by the end of this year. Large economic shocks like tariffs, conflict in the Middle East and rollbacks in clean energy and manufacturing investments also threaten wider economic stability underpinning corporate data center investment.
The Promise of More Efficiencies
Changes to data center technologies also drive variability in electricity demand estimates. Just as efficiency improvements kept data center energy use flat through most of the 2010s, the rate of new efficiency gains in both hardware and software could cause faster or slower energy growth than current predictions. Manufacturers like Nvidia and Arm have been racing to improve their hardware’s power efficiency and keep up with new demands for AI-optimized infrastructure. Meanwhile, new developments in software and algorithmic performance hold significant promise for maximizing performance from existing hardware. Certain AI models like DeepSeek and z.ai have already proven that major software efficiency gains are possible.
Additionally, improvements in efficiency of non-computing systems can lead to lower overall energy consumption. Cooling, in particular, can account for 40% of a data center’s energy usage, making it a prime target for greater efficiency. However, it’s important to note that these efficiency gains are far from guaranteed, and experts have warned that even with improvements, the industry might experience a “rebound effect” that ultimately drives demand higher.
Modeling and Forecasting Differences
At least some of the variation among national forecasts stems from organizations using different statistical models and methodologies at different times (and these national forecasts are distinct from individual utility forecasts). Rigorous models take a “bottom-up” approach, building an electricity demand estimate from data on individual components, such as the power draw of servers and a facility’s overall efficiency. Combining this with the currently installed base of servers yields an estimate of current electricity use; future demand is then projected by adding expected equipment shipments and technology trends. Even so, researchers using bottom-up modeling can choose different inputs and assumptions, and are not always transparent about them, producing significantly varied results.
“Top-down” approaches, which are used less frequently, instead combine estimates of energy consumption within a geography with demand for data center services to project the amount of physical infrastructure needed to meet that demand.
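The bottom-up approach described above can be sketched in a few lines. All of the numbers here are illustrative placeholders, not figures from any cited study, but the structure shows why small changes in inputs (server count, per-server power, cooling overhead) compound into large differences in projected demand:

```python
def bottom_up_demand_twh(servers: int, avg_server_watts: float, pue: float) -> float:
    """Estimate annual data center electricity demand (TWh/year).

    servers: assumed installed base of servers
    avg_server_watts: assumed average IT power draw per server (W)
    pue: power usage effectiveness, i.e. total facility power divided by
         IT power, capturing cooling and other overhead (>= 1.0)
    """
    it_power_w = servers * avg_server_watts
    facility_power_w = it_power_w * pue
    return facility_power_w * 8760 / 1e12  # watt-hours/year -> TWh/year

# Illustrative only: 20 million servers at 500 W average, PUE of 1.4
print(round(bottom_up_demand_twh(20_000_000, 500, 1.4), 1))  # 122.6 TWh/year
```

Projections then layer assumed future server shipments and efficiency trends on top of this base, which is exactly where studies' inputs begin to diverge.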
However, despite modeling differences, estimates of future energy demand appear to be rising over time. The consulting firm Grid Strategies found that the sum of utilities' published five-year summer peak demand growth forecasts grew from 38 GW in 2023 to 128 GW in 2024.
How Can Policymakers Manage Uncertainties in Their Jurisdictions?
It’s clear that the future of the U.S. electric system will be shaped in part by data center growth. But, while the estimates are often discussed nationally, it is regional authorities, states and local governments that will be responsible for most of the oversight and management of data center impacts on the ground. Planning for data center energy demand at these levels is tied to specific load interconnection requests from data centers, which are analyzed using internal models and assumptions. Decisions must be made now on building generation, transmission and distribution infrastructure, which are expected to endure for many decades (potentially longer than the useful life of new data centers) and will ultimately be paid for by customers.
The ways these policymakers respond will have huge effects on climate, the grid and household energy bills. We’ve already seen at the federal level how the Trump administration has used national estimates of AI data center demand to justify increased investment in coal production and force multiple fossil fuel plants to continue operating after their planned retirement dates.
As the frontlines of energy system planning, regional, state and local decisionmakers can take lessons from these nationally focused data center load forecasts to better prepare for energy demand growth from data centers in their jurisdictions.
Here are three ways they can navigate new data center development:
1) Improve Transparency in Reporting on Large Load Interconnection Requests
For all the discourse around data center growth, concrete information on both actual and expected energy use from data centers can be hard to come by. Requests to interconnect a facility to the electricity system are considered proprietary information and not made public. This is especially true for data centers where energy use is tied directly to their computing power and business development. But policymakers can take steps to make information on data center energy use and requests more transparent.
One option would be to require utilities to regularly report large loads in their interconnection queue. Since the first quarter of 2024, Georgia Power has published regular quarterly Large Load Economic Development Reports as part of its compliance agreement under its 2023 Integrated Resource Plan Update. The reports require Georgia Power to publicly disclose the size, review status, proposed service date, and load ramp schedule of all the commercial large load projects over 115 megawatts (MW) seeking to contract with Georgia Power. In doing so, Georgia regulators, stakeholders and policymakers have access to accurate and timely information about large amounts of energy demand in their state.
Another policy option would be to institute a mechanism for proactive stakeholder engagement. In Virginia, House Bill 1601 would have required utilities to inform local governments about any new or existing substations and the anticipated transmission voltage needed to serve any proposed large load facilities over 100 MW.
2) Require Utilities and Grid Planners to Adopt Electricity Demand Forecasting Methods that Account for Uncertainty
Accurate forecasting of future energy demand is critical to making informed investment, infrastructure and policy decisions. While the rapid growth of data centers across the country has thrown a wrench into recent load growth projections, utilities and grid planners have no option but to continue making energy demand predictions. Fortunately, methods for accounting for uncertainty in large electricity load and data center demand requests have begun to emerge. For example, in response to massive year-over-year increases in utility projections, Texas grid operator ERCOT developed a new Adjusted Large Load Forecast methodology that discounts utility load projections using observed information on service delays, load ramping schedules and the actual share of previously expected data center load currently in operation.
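The discounting idea behind approaches like ERCOT's can be sketched as a chain of adjustment factors applied to the raw interconnection queue. The structure mirrors the description above, but the factor names and values here are hypothetical, not ERCOT's actual methodology:

```python
def adjusted_load_forecast_mw(queued_mw: float,
                              materialization_rate: float,
                              ramp_fraction: float) -> float:
    """Discount a raw interconnection-queue total into a planning forecast.

    materialization_rate: historically observed share of queued load that
        actually comes online (hypothetical input)
    ramp_fraction: share of materialized load expected to be drawing power
        by the forecast year, reflecting service delays and ramp schedules
    """
    return queued_mw * materialization_rate * ramp_fraction

# 10,000 MW in the queue, 40% expected to materialize,
# 60% of that ramped up by the target year:
print(adjusted_load_forecast_mw(10_000, 0.4, 0.6))  # 2400.0 MW
```

The value of this structure is that each discount factor is tied to observable history (completion rates, ramp schedules) rather than to developers' self-reported requests, which helps filter out the speculative and duplicate requests described earlier.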
Using examples like this, policies can be created to encourage or require planning authorities to develop and use similar methods. In states with regulated, vertically integrated utilities, policymakers could require utilities to develop and evaluate multiple large load growth scenarios as part of their integrated resource planning process. At a regional and national level, FERC Order 1920 mandates that transmission planners use at least three distinct long-term scenarios in their long-term regional transmission planning processes. As these scenarios and subsequent long-term transmission plans are being developed, advocates and state representatives can argue for careful analysis and modeling of load growth that accounts for various sources of uncertainty.
Policymakers should ensure that utilities and modelers use best practices for this type of modeling to improve the effectiveness of how regulators guide system planning. Efforts to share learnings and establish new standards are already well underway, with groups like NARUC and the Energy Systems Integration Group’s “Large Loads Task Force” actively convening on large load forecasting challenges and solutions.
3) Structure Electricity Tariffs with Provisions to Ensure Long-Term Commitments and Fair Cost Allocation
One of the largest concerns stemming from uncertainty in data center loads is the risk that other ratepayers could end up footing the bill for unneeded infrastructure designed for data centers. Some experts have already raised the alarm that current rate structures and cost allocation methods could leave residential or commercial consumers paying for infrastructure used only to serve new data centers. If data center energy demand turns out lower than expected, leaves the system earlier than anticipated or never materializes, consumers and businesses could be stuck with higher bills.
To address some of these issues, regulators have approved several new and innovative large load tariffs governing data centers that address this risk. While these tariffs are settled at the regulatory level, state legislators can encourage or require the adoption of specific tariff rules that shield other customers from bill impacts, whether data centers come online or not.
In Oregon, for example, the recently passed POWER Act (HB 3546) creates a new, separate customer category for large load facilities over 20 MW. Rates for customers in this new category are specifically required to directly assign costs associated with serving a large load customer and to mitigate any risk of those costs being shifted to other customers. Furthermore, all electric supply contracts under these tariffs must be for 10 years or longer and require the customer to pay a minimum amount or percentage to the utility based on its projected electricity usage. A similar bill passed in Minnesota also requires the creation of a new customer category for very large energy users and protects against cost shifting to other ratepayers but does not include contract provisions.
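The minimum-payment provision works like a take-or-pay clause: the customer is billed for the greater of its actual usage and a floor tied to its projected usage. A minimal sketch, with the 80% floor chosen purely for illustration (the source does not specify a percentage):

```python
def billable_mwh(actual_mwh: float, projected_mwh: float,
                 minimum_fraction: float = 0.8) -> float:
    """Billable energy under a hypothetical minimum-take tariff clause.

    The customer pays for at least minimum_fraction of its projected
    usage, shielding other ratepayers if the load fails to materialize.
    """
    return max(actual_mwh, projected_mwh * minimum_fraction)

# A facility projected at 50,000 MWh/month that only draws 10,000 MWh
# is still billed for 40,000 MWh under an 80% floor:
print(billable_mwh(10_000, 50_000))  # 40000.0
```

Because the floor is set against the customer's own projection, a developer that overstates its expected load bears the cost of that overstatement itself rather than passing it on to other ratepayers.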
Despite Uncertainty, a Clean, Affordable and Reliable Energy System is Possible

Predictions of rapid growth have catapulted data centers to the center of a national conversation on rising energy demand. Looking at the projections, however, reveals that there is significant uncertainty contained both within model estimates themselves and from larger, economy-wide factors. As policymakers at the state and local level are tasked with responding to new data center developments, they should take steps to hedge against this uncertainty to ensure a clean, affordable and reliable energy system for all.