
Somewhere between lines of code and server farms, a hidden cost has been accumulating. Most people who type a prompt into ChatGPT or ask Gemini to summarize an article never consider what happens beyond their screen. But behind every AI response lies a network of machines humming in warehouses around the world, drawing power and water at rates that would alarm most city planners.

New research now puts a number on that cost, and it rivals the output of one of America’s largest metropolitan areas.

A Year of AI Equals New York City’s Carbon Output

Alex de Vries-Gao, a Dutch researcher and founder of Digiconomist, has spent years tracking the unintended consequences of digital technology. His latest study, published in the peer-reviewed journal Patterns, attempts something no previous research has accomplished. Rather than measuring data centers as a whole, de Vries-Gao sought to isolate the environmental burden of AI systems alone.

His findings paint a striking picture. AI systems may have generated between 32.6 and 79.7 million tons of CO2 in 2025. At the upper end, that figure exceeds the annual carbon output of Chile, Czechia, and Romania. It also brackets New York City’s 52.2 million tons from 2023, meaning AI’s single-year carbon footprint could match or surpass one of the world’s most iconic urban centers.

“The environmental cost of this is pretty huge in absolute terms,” de Vries-Gao said. “At the moment society is paying for these costs, not the tech companies. The question is: is that fair? If they are reaping the benefits of this technology, why should they not be paying some of the costs?”

Estimated greenhouse gas emissions from AI use now equal more than eight percent of global aviation emissions. Yet unlike airplanes leaving visible contrails across the sky, AI’s carbon output remains largely invisible to those who benefit from it most.

Power Demand Approaching That of an Entire Nation

Understanding AI’s environmental impact requires grasping the sheer scale of its electricity appetite. According to de Vries-Gao’s research, AI hardware power demand stood at 9.4 gigawatts at the end of 2024. By the close of 2025, that figure could reach 23 gigawatts.

For comparison, the United Kingdom averaged 30.7 gigawatts of power demand in 2023. AI systems alone now approach the electricity needs of a mid-sized developed nation.
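
As a rough illustration of what those gigawatt figures imply over a year, the sketch below converts sustained power demand into annual electricity consumption. It assumes round-the-clock operation at the reported demand level, which is a simplification for comparison purposes, not a figure from the study itself.

```python
# Back-of-envelope: convert sustained power demand (GW) into annual
# electricity consumption (TWh), assuming constant round-the-clock
# operation -- an illustrative simplification only.

HOURS_PER_YEAR = 8760  # 365 days * 24 hours

def annual_twh(gigawatts: float) -> float:
    """Annual electricity consumption in TWh at constant demand."""
    return gigawatts * HOURS_PER_YEAR / 1000  # GW * h = GWh; /1000 -> TWh

ai_2025 = annual_twh(23)    # projected AI hardware demand, end of 2025
uk_2023 = annual_twh(30.7)  # average UK power demand, 2023

print(f"AI at 23 GW:   {ai_2025:.0f} TWh/year")   # ~201 TWh
print(f"UK at 30.7 GW: {uk_2023:.0f} TWh/year")   # ~269 TWh
```

On these simplified assumptions, AI hardware at 23 gigawatts would draw roughly 200 terawatt-hours a year, about three-quarters of the UK's total.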

International Energy Agency estimates suggest AI accounted for roughly 15 to 20 percent of data center electricity demand in 2024. But data centers themselves are growing at a staggering pace. IEA projections indicate that data center electricity consumption will more than double by 2030, with AI-focused facilities drawing power at rates comparable to aluminum smelters.

Individual facilities can be enormous. According to IEA reports, the largest AI-focused data centers being built today will each consume as much electricity as two million households. The United States accounts for 45 percent of global data center electricity consumption, followed by China at 25 percent and Europe at 15 percent.

Construction shows no signs of slowing. In the United Kingdom, an estimated 100 to 200 hyperscale data centers are working their way through the planning system. One facility planned for a former coal power station in Blyth, Northumberland, would emit more than 180,000 tonnes of CO2 annually when fully operational, equivalent to the output of 24,000 homes.

Water Use Surpasses Global Bottled Water Consumption


Carbon emissions tell only part of the story. De Vries-Gao’s research also examined water consumption, producing what he claims is the first estimate of AI’s specific water footprint.

His calculations suggest AI systems consumed between 312.5 and 764.6 billion liters of water in 2025. At the higher end, that figure exceeds the 446 billion liters of bottled water humanity drinks each year. AI’s thirst, in other words, may now surpass the entire global bottled water industry.

Water enters the AI equation in two ways. Direct water consumption occurs when data centers use water-based cooling systems to prevent their machines from overheating. While some facilities recycle cooling water, much of it evaporates and leaves the local ecosystem permanently.

Indirect water consumption proves even larger. Power plants generating electricity for data centers consume vast quantities of water themselves. Meta, one of the few companies reporting both metrics, revealed that its indirect water consumption was roughly four times higher than its direct use.

Local communities have begun feeling the pressure. After Meta built a data center in Newton County, Georgia, residents have reportedly faced rising water prices and damaged wells, and the county is projected to run a water deficit by 2030. In Phoenix, Arizona, one report found that water stress would rise by 32 percent if all planned data centers are constructed.

Why Exact Numbers Remain Elusive

Perhaps the most troubling finding from de Vries-Gao’s research concerns not what we know, but what remains hidden. No major technology company reports AI-specific environmental metrics. Researchers must instead approximate AI’s impact through general data center performance, a method that introduces substantial uncertainty.

Corporate disclosure practices vary wildly. Meta provides the most detailed breakdown, reporting electricity consumption and carbon emissions for individual data centers. Apple offers similar location-specific data but omits location-based carbon emissions. Google discloses water consumption per facility but not electricity use at that level.

Meanwhile, Amazon reports the largest location-based scope-2 emissions of any company examined, but fails to disclose its total electricity consumption. ByteDance and CoreWeave publish no environmental reports at all.

Even companies that do report face no requirements to separate AI workloads from other computing tasks. Without such distinctions, researchers can only estimate AI’s contribution by analyzing broader trends in facility performance.

“Despite AI system power demand approaching that of a country the size of the United Kingdom, the environmental impacts of this growth remain unclear,” de Vries-Gao wrote. He noted that without transparent data, identifying opportunities to reduce AI’s climate impact becomes nearly impossible, and the success of any intervention cannot be measured.

Carbon Intensity Varies Wildly by Location

Location matters enormously when calculating environmental impact. A data center powered by renewable energy in Oregon carries a vastly different carbon footprint than one running on coal-fired electricity in the Midwest.

De Vries-Gao’s analysis found carbon intensity ranging from 0.17 to 0.46 tCO2/MWh across U.S. data center locations operated by major tech companies. American and European companies generally reported lower carbon intensities, between 0.32 and 0.35 tCO2/MWh, reflecting the relatively cleaner power grids in those regions.

Chinese companies told a different story. Baidu reported a carbon intensity of 0.64 tCO2/MWh, while Tencent came in at 0.57 tCO2/MWh. As AI adoption accelerates in regions with dirtier power grids, global carbon emissions could rise faster than current estimates suggest.

Water intensity shows even greater variation. The water intensity of power grids supplying data center locations for Apple, Google, and Meta ranged from 0.68 to 11.98 liters per kilowatt-hour. A data center in Prineville, Oregon, could consume nearly 18 times more water per unit of electricity than one in Fort Worth, Texas, simply because of differences in how local power plants generate electricity.
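
The intensity figures above can be combined with an electricity total to see how much location alone shifts a footprint. The sketch below uses the article's reported intensity extremes; the 100 GWh workload is an arbitrary assumption chosen purely for illustration.

```python
# Illustrative only: the same electricity use, evaluated at the
# cleanest and dirtiest intensities reported in the article.
# The 100 GWh workload figure is a hypothetical assumption.

def footprint(energy_mwh: float, tco2_per_mwh: float,
              litres_per_kwh: float) -> tuple[float, float]:
    """Return (carbon in tonnes CO2, water in litres) for a workload."""
    carbon_t = energy_mwh * tco2_per_mwh
    water_l = energy_mwh * 1000 * litres_per_kwh  # MWh -> kWh
    return carbon_t, water_l

energy = 100_000  # MWh (100 GWh), hypothetical annual AI workload

low_c, low_w = footprint(energy, 0.17, 0.68)     # cleanest grid reported
high_c, high_w = footprint(energy, 0.46, 11.98)  # dirtiest grid reported

print(f"Carbon: {low_c:,.0f} to {high_c:,.0f} tCO2")
print(f"Water:  {low_w/1e6:,.1f} to {high_w/1e6:,.1f} million litres")
print(f"Water ratio: {high_w/low_w:.1f}x")  # ~17.6, i.e. nearly 18 times
```

The ratio of the two water intensities, 11.98 over 0.68, is where the article's "nearly 18 times" comparison comes from.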

Without location-specific disclosure from AI operators, researchers cannot determine where systems are running or how much they are consuming. Averages provide useful starting points, but they obscure enormous regional differences.

Google, Microsoft, and Meta Acknowledge AI’s Growing Appetite

Major technology companies have begun acknowledging AI’s role in their rising resource consumption, even as they resist disclosing specific metrics.

Google’s environmental report attributed electricity growth to AI, stating that demand for digital services has grown rapidly, which in turn creates demand for data centers requiring increased energy for operations and water for cooling. Microsoft noted that AI workloads drive increased computing resource needs. Meta acknowledged that the challenge of reaching sustainability goals, given increased demand for energy and resources driven by AI, is not unique to its company.

Yet acknowledgment has not translated into transparency. When Google recently reported on the environmental impact of its Gemini AI model, it omitted indirect water consumption entirely. According to a separate report, Google chose not to disclose embedded water use because it does not fully control the water consumption in electricity generation.

De Vries-Gao dismissed this reasoning, noting that indirect water consumption results directly from Google’s electricity demand. Greenhouse Gas Protocol standards already mandate disclosure of indirect emissions from purchased electricity. Similar logic should apply to water.

Google did report some progress, noting a 12 percent reduction in data center energy emissions in 2024 due to new clean energy sources. But the company acknowledged that achieving its climate goals is now more complex and challenging across every level, from local to global.

A Call for Mandatory Disclosure

De Vries-Gao argues that voluntary disclosure has proven insufficient. He advocates for new policies requiring technology companies to report AI-specific environmental metrics, including electricity consumption, carbon emissions, and both direct and indirect water use at the facility level.

Environmental groups have grown increasingly vocal. More than 230 organizations recently urged Congress to impose an immediate national moratorium on new data center construction. Critics argue that the public bears the environmental burden while some of the world’s richest companies reap the benefits.

“This is yet more evidence that the public is footing the environmental bill for some of the richest companies on Earth,” said Donald Campbell, director of advocacy at Foxglove, a UK nonprofit campaigning for fairness in technology. “Worse, it is likely just the tip of the iceberg.”

Some data centers have begun experimenting with more sustainable designs. Microsoft’s 2025 environmental report mentioned launching a new data center design that optimizes AI workloads and uses zero water for cooling. But without broader transparency requirements, such innovations remain exceptions rather than industry standards.

What Our Hunger for Intelligence Costs the Planet

Beyond the statistics lies a deeper question about human ambition and its consequences. Every civilization has sought to extend its reach, to build tools that amplify human capability. Fire, writing, electricity, and computing each transformed what our species could accomplish. Artificial intelligence represents the latest chapter in that long story.

Yet AI differs strikingly from previous technologies. Its environmental toll accumulates invisibly, hidden behind screens and dispersed across server farms in locations most users will never see. When we ask a chatbot to write an email or generate an image, we experience only the magic of instant response. We do not see the water evaporating from cooling towers or the power plants burning fuel to keep servers running.

De Vries-Gao’s research forces a reckoning with that hidden cost. When our most advanced tools consume resources at the scale of major cities, we must ask what boundaries should guide their deployment. Progress has never been free, but awareness of its price changes how we weigh its value.

Perhaps the most important lesson lies not in the specific numbers but in what they reveal about accountability. Technology companies have built systems that now rival nations in their demand for Earth’s resources. Yet they resist disclosing the full extent of that demand, leaving society to absorb costs it cannot measure.

If humanity intends to keep pushing boundaries, we must also accept responsibility for what those boundaries cost. Intelligence, whether human or artificial, carries an obligation to confront uncomfortable truths. Our hunger for smarter machines has created a footprint we can no longer ignore.
