Lawmakers across the political spectrum agree that the energy grid is not ready for the increased demand brought by data centers that power artificial intelligence. A new study from nonprofit Western Resource Advocates looks at how that demand could impact Western states.
The study examined utility filings across five states: Colorado, Utah, Nevada, Arizona, and New Mexico. Those utilities are forecasting a 4.5% increase in energy demand each year through 2035, largely due to potential new data centers. The study’s authors estimate that growth would be equivalent to adding the energy usage of two and a half Phoenix metro areas over the next ten years.
In addition, AI data centers use large amounts of water to cool their computers and equipment. Based on the energy demand forecasts from utilities, and assuming the centers use standard water-based cooling technologies, the study estimates they would consume 7 billion gallons of water in 2035, or about 21,600 acre-feet. That’s roughly the amount of water used by 63,000 single-family Colorado households in a year.
Deborah Kapiloff, a policy advisor with Western Resource Advocates and one of the study’s authors, said the report is meant to help lawmakers and regulators understand the resources these data centers require as they shape the regulatory framework for these operations.
For example, Kapiloff said, many of the concerns in the West and the Colorado River Basin would likely center on water.
“What kind of new usage are we able to squeeze out of a very stretched-thin system?” she said. “And is that usage the best and highest use of the water that we have available?”
She said that at the moment, data centers and AI are like a gold rush, and companies are trying to sign agreements with utilities to connect wherever they can.
“A lot of it is speculative,” she said. “(Utilities) are trying to figure out, in this new paradigm, how much of that load is going to show up.”
Kapiloff said the report is also intended to help these decision makers understand some of the technology being used in these data centers, and the trade-offs they present when it comes to resources.
“For example, using a less water-intensive cooling system is inherently going to use more electricity,” she said. “Those two factors are extremely intertwined and you can't separate them out.”
The report also includes several policy recommendations for lawmakers and regulators: transparency and reporting requirements for data centers’ water usage, protections for electric customers against rate increases driven by the new demand, and policies that encourage renewable energy adoption as demand grows.
The Trump administration recently announced a policy plan designed to speed up permitting of AI data centers on federal lands. The order is designed to “ease federal regulatory burdens,” mainly by sidestepping the National Environmental Policy Act, or NEPA. It asks agencies to identify categorical exclusions — a designation that an activity would have no significant environmental impact — that would allow AI data center projects to go ahead.
In addition, the federal Department of Energy announced four sites where it will team up with private-sector partners to “develop cutting edge AI data center and energy generation projects.” One of those sites is in the West: the Idaho National Laboratory. In a statement, Energy Secretary Chris Wright called it “the next Manhattan Project.”
Copyright 2025 Rocky Mountain Community Radio. This story was shared via Rocky Mountain Community Radio, a network of public media stations in Colorado, Wyoming, Utah, and New Mexico, including Aspen Public Radio.