AI's Hidden Thirst: Healthcare, Data Centres and the Communities Paying the Price

AI can ease healthcare strain but brings hidden costs: energy, water, noise, local stress. Leaders need guardrails, First Nations water equity, and clear impact metrics.

Categorized in: AI News Healthcare
Published on: Sep 15, 2025

AI in healthcare has an environmental blind spot

AI is being promoted as a fix for workforce pressure, access, and patient flow. But its environmental footprint (energy, water, noise, and local infrastructure strain) rarely makes it into the decision process.

When communities push back on data centres, they are flagging costs that healthcare ends up feeling: higher energy bills, water stress, and equity impacts. If AI is going to sit inside essential services, its resource use becomes a clinical risk and a public health concern.

Why this matters to healthcare leaders

Healthcare is already responsible for roughly seven percent of Australia's emissions. Adding compute-intensive AI without strict guardrails can move hospitals and health networks further from decarbonisation targets.

Water use is the sleeper issue. Many AI workloads depend on cooling systems that consume large volumes of potable water. For regions already under stress, this is not a technicality; it is a supply and equity problem.

First Nations water equity must be part of AI planning

For many First Nations communities, water scarcity is a daily reality. Research across the Daly and Fitzroy catchments shows that river flows are tied directly to Indigenous livelihoods, governance, and culture.

Deploying AI-powered services without accounting for water and energy demand shifts the burden onto communities least consulted. That is a trust issue as much as a sustainability issue.

  • Engage Traditional Owners early for any AI-linked infrastructure or major cloud migrations that could affect local water or energy systems.
  • Include cultural water values and place-based risks in business cases, privacy impact assessments, and digital transformation plans.
  • Disclose the expected energy and water profile of AI services used in Indigenous health programs.

The resource bill behind common AI use cases

Data centres used about 460 TWh of electricity in 2022, with demand expected to more than double by 2026 as AI workloads grow. Evaporative cooling can drive heavy potable water consumption, and higher server densities for AI increase that demand.

Training a single large model has a meaningful footprint. One study estimated that training GPT-3 evaporated roughly 700,000 litres of fresh water. UC Riverside researchers suggest that running 20-50 ChatGPT queries consumes around half a litre. Google's US data centre cooling water reportedly rose from 12.7 billion to over 30 billion litres in three years, and US data centres may grow from roughly 4.4% of national electricity demand in 2023 to nearly 12% by 2028.
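The per-query figure above can be turned into a rough planning number. A minimal back-of-envelope sketch, assuming the midpoint of the cited 20-50 queries per half litre (the midpoint is our assumption, not part of the study):

```python
# Rough water estimate from the cited UC Riverside figure:
# ~0.5 L of water per 20-50 ChatGPT queries. We assume the midpoint.
QUERIES_PER_HALF_LITRE = (20 + 50) / 2  # assumption: 35 queries

litres_per_query = 0.5 / QUERIES_PER_HALF_LITRE
litres_per_thousand = litres_per_query * 1000

print(f"~{litres_per_query:.3f} L per query")        # ~0.014 L
print(f"~{litres_per_thousand:.1f} L per 1,000 queries")  # ~14.3 L
```

At that rate, a health service running a million assistant queries a month would be indirectly drawing on the order of 14,000 litres, before training and idle overheads are counted.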

AI can help with climate goals, but the net effect is not guaranteed. The UN Environment Programme has stated plainly that AI has an environmental problem and must be integrated with clear sustainability guardrails, not just transparency pledges. UNEP guidance and the IEA's data centre analysis are useful starting points for due diligence. See: IEA: Data centres and networks.

What current policy misses

Government reports highlight productivity and service gains but often treat higher energy demand as business-as-usual. Regulatory consultations on AI in medical software have focused on safety and accountability while omitting environmental impacts.

Internationally, water use by data centres is drawing tighter scrutiny. The EU's Water Resilience Strategy is moving toward usage limits, and UN-backed initiatives call for sustainability embedded into AI strategies, not bolted on.

A practical checklist for hospitals, PHNs, and vendors

  • Require environmental disclosures for all AI tools and hosting: location, grid mix, cooling method, water stress level, and targets.
  • Demand audited metrics: PUE, WUE, CUE; hourly renewable matching claims; and site-level water sourcing (potable vs recycled).
  • Prefer operators using water-free or low-water cooling in stressed basins, on-site heat recovery, and firmed renewables (PPAs plus storage).
  • Set default "small-first" configurations: smaller models for routine tasks, token limits, caching, and scheduled batch runs during low grid stress.
  • Track unit impacts: kWh, litres, and kgCO2e per 1,000 inferences; report alongside clinical KPIs.
  • Include an environmental impact statement in business cases for clinical AI; approve only when clinical benefit clearly outweighs the resource cost.
  • Build green SLAs: annual caps for energy and water per user, improvement targets, and penalties for breaches.
  • Guide staff use: triage queries, avoid speculative prompts, prefer on-device or local models for simple tasks where clinically safe.
  • For on-prem or colocated builds: account for noise, backup diesel emissions, and heat; consult neighbours early and disclose expected operating profiles.
  • Formalise Indigenous engagement protocols for any AI service likely to draw on water or land-constrained infrastructure.
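The "track unit impacts" item above can be operationalised with a small calculation. A minimal sketch, assuming illustrative inputs (per-inference IT energy, facility PUE, site WUE, and a grid emissions factor; all values are examples, not vendor data):

```python
def unit_impacts(it_kwh_per_inference: float, pue: float,
                 wue_l_per_kwh: float, grid_kgco2e_per_kwh: float,
                 n: int = 1000) -> dict:
    """Estimate facility energy, water, and emissions for n inferences.

    it_kwh_per_inference: IT (server) energy per inference
    pue:  Power Usage Effectiveness (total facility energy / IT energy)
    wue_l_per_kwh: Water Usage Effectiveness, litres per kWh of IT energy
    grid_kgco2e_per_kwh: grid emissions factor for the hosting region
    """
    it_kwh = it_kwh_per_inference * n
    facility_kwh = it_kwh * pue           # includes cooling/overhead energy
    litres = it_kwh * wue_l_per_kwh       # WUE is defined per IT kWh
    kgco2e = facility_kwh * grid_kgco2e_per_kwh
    return {"kWh": facility_kwh, "L": litres, "kgCO2e": kgco2e}

# Illustrative values only: 0.0003 kWh/inference, PUE 1.2, WUE 0.2 L/kWh,
# grid factor 0.7 kgCO2e/kWh.
impacts = unit_impacts(0.0003, 1.2, 0.2, 0.7)
print(impacts)  # per 1,000 inferences
```

Reporting these three numbers alongside clinical KPIs makes the "litres and kgCO2e per 1,000 inferences" requirement auditable rather than aspirational.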

Metrics that matter

  • PUE: aim below 1.2 where feasible.
  • WUE: near 0.2 L/kWh or lower in low-stress regions; avoid potable water cooling in stressed basins.
  • CUE: disclose calculation method and grid factors; require third-party verification.
  • Unit reporting: litres and kgCO2e per 1,000 inferences, and per patient served for major clinical systems.
  • Transparency: publish an annual AI environmental report as part of clinical safety and quality governance.
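The three headline metrics can be verified from meter-level figures a vendor should be able to supply. A minimal sketch of the standard definitions, with invented annual readings for illustration:

```python
# Standard data-centre efficiency metrics (The Green Grid definitions):
#   PUE = total facility energy / IT equipment energy
#   WUE = site water usage / IT equipment energy (L/kWh)
#   CUE = total CO2e emissions / IT equipment energy (kgCO2e/kWh)
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    return total_facility_kwh / it_kwh

def wue(water_litres: float, it_kwh: float) -> float:
    return water_litres / it_kwh

def cue(kgco2e: float, it_kwh: float) -> float:
    return kgco2e / it_kwh

# Illustrative annual readings, not real site data.
it = 10_000_000  # kWh of IT load
print(round(pue(11_800_000, it), 2))  # 1.18 -> under the 1.2 target
print(round(wue(1_900_000, it), 2))   # 0.19 L/kWh -> near the 0.2 target
```

Because all three ratios share the IT-energy denominator, asking vendors for the underlying meter readings (facility kWh, IT kWh, litres, tCO2e) lets procurement teams recompute and cross-check the claimed values.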

For regulators and funders

  • Require environmental disclosure for AI-enabled medical software used in public services, including hosting region, WUE/PUE, and energy sourcing.
  • Make environmental impact assessments standard for large health AI deployments and data migrations receiving public funding.
  • Adopt nationally consistent reporting of AI energy and water use; align with international best practice and water-resilience policies.
  • Tie funding to measurable reductions in litres and kWh per clinical outcome, not generic IT efficiency claims.
  • Support research on water-sparing AI methods and clinical effectiveness per unit of resource use.

What to do next week

  • Inventory all AI features in your stack, including "hidden" assistants in EHRs and dictation tools. Map where they run and what they consume.
  • Ask vendors for current PUE, WUE, grid mix, and location-specific water risk. If they cannot provide it, pause expansion.
  • Flip defaults to smaller models for admin tasks; batch non-urgent jobs; set monthly caps on tokens and GPU-hours per department.
  • Add a one-page environmental appendix to every AI-related governance paper and procurement.
  • Engage local communities and Traditional Owners before committing to new data centre footprints or region moves.

If you are setting up internal training on safe, efficient AI use for clinical and admin teams, see our curated options by role: Complete AI Training: Courses by Job.