
The Hidden Cost of AI: How Data Centers Are Draining Urban Energy and Water

  • Writer: theconvergencys

By Jiweon Kim Oct. 31, 2025



When people consider issues raised by the growth of artificial intelligence (AI), the topics that come to mind include intellectual (dis)honesty, increased social alienation, loss of privacy, and potential job losses with their resulting economic impact. Lost in this conversation is a matter that affects every community hosting an AI data center: the consequences of soaring energy consumption. In recent years, cities worldwide have embraced compute-intensive data centers in a bid to attract investment from corporations seeking digital innovation engines. While those investments set servers running and algorithms humming, small cracks of infrastructural strain are becoming apparent, foreshadowing larger problems to come.

By 2030, data centers propelled by the rapid growth of AI are projected to roughly quadruple their electricity usage, devouring some 945 terawatt-hours (TWh) globally. That amounts to more than the combined annual consumption of Germany and Australia, and nearly 1.5 times that of the entire United Kingdom. However, the crux of the problem lies not in the staggering consumption figure itself, but in the inadequate capacity of the urban infrastructure tasked with sustaining these facilities. When AI demand spikes at inopportune hours or seasons, local residents bear the repercussions: strained energy grids, parched reservoirs, and heat dumped into neighborhoods already too hot to bear.
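To get a feel for the scale of the 945 TWh projection, a quick back-of-the-envelope conversion turns the annual figure into continuous power. (This is purely arithmetic on the figure cited above; the comparison to power-plant output assumes the common rule of thumb that a large power station produces on the order of 1 GW.)

```python
# Convert the projected 945 TWh/year into average continuous power draw.
projected_twh = 945
hours_per_year = 24 * 365  # 8,760 hours

# TWh -> GWh (x1000), then divide by hours to get GW of continuous draw.
avg_power_gw = projected_twh * 1000 / hours_per_year
print(f"Average continuous draw: {avg_power_gw:.0f} GW")  # ~108 GW
```

Roughly 108 GW of nonstop demand is on the order of a hundred large power stations running flat out, year-round, devoted solely to data centers.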


Hidden Peaks and Gridlock: How AI Strains Electricity Networks

Most existing urban electricity networks were not designed to support the relentless demand of AI clusters. AI workloads draw constant, high-density loads, and total demand often peaks in the evenings, when residential consumption is also highest. Such intense spikes stress local grids beyond their intended limits, which can mean unreliable service and extra generation from fossil-fuel "peaker" plants that degrade air quality. While some city grids can technically withstand the demands of large data centers, this often comes at the cost of grid capacity for other needs, such as clean-energy projects. A grid-access approval system that rarely considers grid resilience or access to renewable energy compounds the problem. Corporations have greater resources and quicker access to licensing procedures, so they often obtain grid connections before sustainable or public projects have a chance. As a result, public needs and sustainable solutions are delayed, while energy-intensive AI operations move ahead with little regard for time, place, or environment.
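The coincidence problem described above can be sketched with a toy load model. All numbers here are hypothetical, chosen only to illustrate the mechanism: a data center's near-flat load sits on top of the residential evening peak, pushing the combined load past a grid limit that either profile alone stays well under.

```python
# Hypothetical load profiles in MW across eight 3-hour blocks of one day.
residential_mw = [300, 280, 270, 320, 450, 620, 700, 560]  # evening peak
data_center_mw = [180] * 8  # AI clusters run a near-constant high load

grid_limit_mw = 800  # assumed local capacity, for illustration

combined = [r + d for r, d in zip(residential_mw, data_center_mw)]
overloaded_blocks = [i for i, mw in enumerate(combined) if mw > grid_limit_mw]
print(overloaded_blocks)  # blocks where the combined load exceeds the limit
```

Neither profile alone reaches 800 MW, yet the combined load breaches it exactly during the evening block: the strain comes from *when* loads coincide, not from either load in isolation.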


Water and Heat, the Invisible Costs of Cooling AI

Cooling AI clusters is a water-intensive endeavor. The processors in these clusters consume enormous amounts of electricity and emit a corresponding quantity of heat, and the systems that manage that heat require vast amounts of water. Furthermore, the problem is not just how much water is used, but when and where: cooling systems draw most heavily during hot, dry periods, precisely when local water supplies are already strained. The effects of otherwise identical data centers can vary greatly based on local hydrology, competing users, and water-reuse infrastructure. Since most jurisdictions lack reliable water usage effectiveness (WUE) reporting or seasonal and hourly usage limits, urban water planners can do little to prevent shortages. As a result, residents may face water-use restrictions, farmers may see crops damaged, and first responders risk being unable to handle fires and other emergencies. Beyond the water itself, the heat is also problematic: data centers eject immense volumes of low-grade heat into urban areas, exacerbating heat-island effects. This, in turn, spurs residents to run air-conditioning harder, further straining the electrical grid and increasing fossil-fuel use.
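WUE, as defined by The Green Grid, is annual site water use in liters divided by IT equipment energy in kWh. A minimal sketch of the metric, using made-up facility figures, also shows why an annual number alone is insufficient: the same total can be drawn evenly or concentrated in the driest months, and annual WUE cannot tell the two apart.

```python
# WUE = annual site water use (L) / annual IT equipment energy (kWh).
# All facility figures below are hypothetical, for illustration only.
annual_water_liters = 550_000_000   # assumed annual cooling-water draw
annual_it_energy_kwh = 300_000_000  # assumed annual IT equipment energy

wue = annual_water_liters / annual_it_energy_kwh
print(f"WUE: {wue:.2f} L/kWh")

# The annual metric hides seasonality: here we assume 55% of the draw
# lands in the three driest months, yet WUE is unchanged either way.
summer_share = 0.55
summer_liters = annual_water_liters * summer_share
print(f"Drawn in driest quarter: {summer_liters:,.0f} L")
```

This is why the article argues for seasonal and hourly limits on top of WUE reporting: the headline ratio says nothing about timing.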


The Governance Gap: Why Permits Fail and Standards Fall Short

Government responses to these problems are often inadequate. Many jurisdictions cap electrical usage, but the way such limits are measured does not account for peak usage times. For example, ESG reporting focuses on annual consumption and ignores the crucial hourly spikes that cause system instability; moreover, some corporations' ESG-aligned image rests on paltering. Similarly, power permits assess site efficiency, water regulators consider average draw, and utilities examine technical feasibility, but none of these captures the full picture of a facility's impact. As a result, AI projects are approved without considering how their real-time operations interact with local needs. Mere lists of best practices or voluntary standards cannot close this gap. Cities must implement enforceable frameworks that account for when and where strain occurs, not just how much. The EU's Code of Conduct on Data Centre Energy Efficiency is a good place to start, but such a code needs requirements that tie location- and time-sensitive planning into permitting rules and grid connections.
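The masking effect of annual-only reporting is easy to demonstrate with two hypothetical facilities: identical annual consumption, radically different hourly behavior. Annual ESG-style figures cannot distinguish them, while the grid experiences them very differently.

```python
# Two hypothetical facilities with equal annual energy but different shapes.
HOURS_PER_DAY, DAYS = 24, 365

flat = [100.0] * (HOURS_PER_DAY * DAYS)  # facility A: steady 100 MW

spiky = []                               # facility B: 60 MW base load
for _ in range(DAYS):                    # with a 300 MW evening spike
    day = [60.0] * HOURS_PER_DAY
    for h in range(18, 22):              # 4-hour spike, 18:00-22:00
        day[h] = 300.0
    spiky.extend(day)

print(sum(flat) == sum(spiky))  # annual MWh identical -> same ESG figure
print(max(flat), max(spiky))    # peak draw differs threefold
```

Per day, facility B draws 60 MW for 20 hours plus 300 MW for 4 hours, totaling 2,400 MWh, exactly matching A's steady 100 MW. An annual-consumption cap treats them as equals; a grid operator at 19:00 does not.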


Principles for Resilient Urban Compute

What can cities do to encourage the construction of AI data centers while mitigating their harmful impacts? Dublin, Singapore, and Odense offer some insight.

In Dublin, Ireland, data centers' share of metered electricity grew from 5% to 22% in just ten years. In response, the government tightened its siting and interconnection rules, giving preference to projects that genuinely matched grid capacity and system resilience. In Singapore, the government introduced a "gate-open with conditions" strategy to account for limits on land, grid, and water: new data-center capacity is restricted until operators demonstrably improve transparency and efficiency, and the Green Data Centre Roadmap aims to unlock 300 MW of additional capacity. In Odense, Denmark, municipal policy tackled the problem of excess heat: large ammonia heat pumps transfer data-center server heat into the district heating network, redirecting roughly 100,000 MWh per year, enough to heat thousands of homes.

These examples suggest concrete principles for mitigating the impact of AI data centers. First, cities must treat time and place, not just volume, as the unit of account. Second, facilities should match their hourly load with clean local resources, or use load shifting to show comparably grid-friendly behavior. Third, the most water-intensive processes must be scheduled outside drought seasons to preserve reserves. Additionally, waste heat should be absorbed into local systems wherever applicable. Last and most crucially, cities must make these expectations binding through rules on infrastructure expenditure and permit approvals.
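A rough sanity check of the Odense figure supports the "thousands of homes" claim. The per-home heat demand below is an assumption for illustration (a typical Danish household's annual heat demand is on the order of 15 to 20 MWh); the recovered-heat total is the figure cited above.

```python
# Sanity check: how many homes could 100,000 MWh/year of recovered heat serve?
recovered_mwh = 100_000  # annual heat redirected into district heating (cited)
per_home_mwh = 18        # assumed annual heat demand per home (illustrative)

homes = recovered_mwh / per_home_mwh
print(f"~{homes:,.0f} homes")
```

Under that assumption the recovered heat covers roughly 5,500 homes, consistent with the article's "thousands of homes" and a meaningful dent in a mid-sized city's heating load.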


Conclusion

As artificial intelligence integrates into daily life as a revolution rivaling the internet, natural concerns and disputes surface. Alongside worries about honesty, employment, privacy, and information saturation, greater focus on environmental impact is warranted. Mitigating the environmental burden of AI requires attention to when and where energy is used, and to its cost to surrounding systems. Cities that understand their unique vulnerabilities and assets will set the standard for a resilient digital future. Those that do not may find their streets, skies, and reservoirs paying the price for progress that arrived too fast and unchecked.
