The artificial intelligence industry's infrastructure boom is generating an environmental cost that is becoming impossible to ignore. Natural gas power plants being built across the United States specifically to supply data centers operated by OpenAI, Meta, xAI, and Microsoft have the potential to emit more than 129 million tons of greenhouse gases annually — a figure that exceeds the total annual emissions of Morocco, according to emissions estimates derived from air permit documents examined by WIRED.
The disclosure puts concrete numbers on a tension that has been building for two years: the AI industry's public commitments to sustainability versus the reality of its expanding physical footprint.

The Scale of the Buildout
The major AI companies are not simply plugging data centers into existing power grids. They are commissioning new natural gas power plants — sometimes multiple plants in the same region — to guarantee reliable electricity for compute clusters that are growing at rates that would have seemed implausible three years ago.
OpenAI's Stargate program, Meta's infrastructure expansion, xAI's Memphis supercluster, and Microsoft's data center buildout all require electricity at a scale that wind and solar cannot currently provide reliably enough for always-on AI workloads. The result is a deliberate turn toward natural gas as a bridge fuel — a choice that is explicitly being made by companies whose public positions on climate change have been a significant part of their brand.
The 129 million ton figure represents the upper bound of projected emissions from these plants based on the permit applications. Actual emissions will depend on utilization rates, which depend in turn on how quickly the AI companies can build out the compute capacity to use the power they are buying.
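The relationship between the permitted ceiling and actual emissions can be sketched with simple arithmetic. The only figure below taken from the permit documents is the 129-million-ton upper bound; the capacity factors are hypothetical placeholders, chosen to illustrate how utilization scales the number down.

```python
# Back-of-envelope sketch: permitted maximum emissions scaled by a
# plant's capacity factor (fraction of the year it runs at full output).
# Only PERMITTED_MAX_TONS comes from the article; the capacity factors
# below are hypothetical.

PERMITTED_MAX_TONS = 129_000_000  # upper bound from the permit applications

def actual_emissions(capacity_factor: float) -> float:
    """Estimate annual emissions if the plants run at the given
    fraction of their permitted maximum (0.0 to 1.0)."""
    if not 0.0 <= capacity_factor <= 1.0:
        raise ValueError("capacity factor must be between 0 and 1")
    return PERMITTED_MAX_TONS * capacity_factor

# Gas plants serving always-on AI workloads would likely run near
# baseload; even a hypothetical 60% capacity factor implies
# tens of millions of tons per year.
for cf in (0.4, 0.6, 0.9):
    tons = actual_emissions(cf)
    print(f"capacity factor {cf:.0%}: {tons / 1e6:.1f}M tons/year")
```

The sketch makes the article's point concrete: even well below the permitted ceiling, the emissions remain nation-scale.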
The Math Does Not Add Up Yet
For all the talk of AI transforming industries and solving problems, the immediate environmental impact of the AI infrastructure boom is significant and measurable. A single large language model training run can consume as much electricity as a small town uses in a year. Inference — running the model to answer queries — typically consumes more electricity over a model's lifetime than its training did.
As AI features proliferate across consumer products and enterprise software, the electricity demand compounds. Every AI-powered search query, every AI-generated document, every AI assistant interaction requires compute that requires electricity.
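The compounding effect described above is easy to see with placeholder numbers. Both figures in this sketch are hypothetical assumptions, not measured values from the article; the point is only that small per-query energy costs multiply into grid-scale demand at high query volumes.

```python
# Illustrative sketch of how per-query energy compounds at scale.
# Both input numbers are hypothetical placeholders.

WH_PER_QUERY = 0.3               # assumed watt-hours per AI query
QUERIES_PER_DAY = 1_000_000_000  # assumed daily query volume

# Annual energy: Wh/query * queries/day * days/year, converted to kWh.
annual_kwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1000
annual_gwh = annual_kwh / 1e6

print(f"{annual_gwh:,.0f} GWh/year at these assumed rates")
```

Under these assumptions the total comes to roughly 110 GWh a year, and every new AI feature layered onto consumer products adds another such stream of demand.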
The industry has been relying on three arguments to deflect criticism: that AI will help solve climate problems, that the electricity grids supplying data centers will get cleaner over time as renewable capacity grows, and that the economic and social value of AI justifies the environmental cost. None of these arguments address the immediate question of whether building new natural gas plants specifically to power AI is the right choice right now.
A Regional Story With Global Implications
The concentration of new data center construction in specific regions of the United States — particularly the American South and Southwest — means that the environmental impact is not evenly distributed. Communities near these new facilities bear the local costs: increased truck traffic, noise, water consumption for cooling, and strain on local electrical grids.
But the emissions consequences are global. Carbon dioxide from a natural gas plant in rural Virginia contributes to atmospheric concentrations regardless of where it is emitted. The AI industry's infrastructure expansion is effectively a global bet that AI's societal value will exceed the environmental cost of the infrastructure needed to deliver it — a bet that is difficult to evaluate in either direction.
The Water Problem
Electricity generation is not the only environmental pressure point. Data centers consume enormous amounts of water for cooling systems. As natural gas plants are built to supply AI facilities, the water demands of those plants — for steam generation and cooling — add to the draw on local water supplies.
In regions already facing water stress, the combination of data center water consumption and the water needs of natural gas extraction and power generation creates compound environmental pressure. The major tech companies have published water consumption targets, but the pace of new data center construction has in many cases outrun those commitments.
What Comes Next
The emissions data from WIRED's examination of permit documents will likely become a reference point in ongoing policy debates about how to regulate the AI industry's environmental impact. Several states are considering legislation that would require data centers to meet sustainability standards or pay into environmental mitigation funds.
The Federal Energy Regulatory Commission (FERC) has begun looking at whether the surge in electricity demand from AI data centers requires changes to how new power plants are permitted and how grid connections are prioritized.
For the AI companies themselves, the environmental narrative is becoming increasingly difficult to separate from the story of AI's societal role. Companies that have made sustainability part of their brand are finding that the physical reality of building AI infrastructure is not always consistent with those commitments.
The question of whether AI's benefits outweigh its environmental costs is a legitimate one that will not be resolved by ignoring the costs. The 129 million ton figure from the permit documents is not a projection that can be revised away. It is a consequence of decisions that have already been made.



