Challenging the Narrative
OpenAI CEO Sam Altman recently addressed concerns regarding the significant water and energy consumption associated with artificial intelligence, particularly
in the context of large language models like ChatGPT. During a visit to India, Altman strongly disputed claims that individual AI queries consume vast amounts of water, labeling such assertions as "completely untrue" and "insane." He emphasized that these widely circulated figures lack any grounding in reality. While his comments were met with approval from some who applauded the call-out of misinformation, they also drew considerable criticism from others who felt he was downplaying a serious issue. The surge in demand for AI technologies has led to an exponential increase in the energy and water requirements of data centers worldwide, making sustainability a pressing concern for the tech industry and beyond.
Unpacking AI's Water Footprint
The core of the debate lies in understanding how water is utilized in AI operations. Advanced AI models, such as ChatGPT and Google's Gemini, are powered by vast arrays of servers housed in data centers. These servers generate considerable heat, necessitating robust cooling systems to maintain optimal performance. Water is a primary cooling medium, and although it is indispensable, its use results in substantial withdrawals from local water resources. Altman's stance suggests that the per-query water usage is often exaggerated. Independent analyses offer a more nuanced perspective: a moderately lengthy GPT-5 response (around 150-200 words) is estimated to consume approximately 19.3 watt-hours of electricity, translating to roughly 25 to 39 milliliters of water, contingent on the cooling system's efficiency. For comparison, GPT-4o uses significantly less, around 1.75 watt-hours of energy and about 3.5 milliliters of water. Google's Gemini model reportedly requires a mere 0.26 milliliters of water per text prompt. Although these individual figures appear small, the cumulative impact across billions of daily interactions globally is substantial, with projections indicating that water usage for data center cooling could triple by 2050 due to accelerating AI adoption.
Calculating AI's Thirst
The water footprint of AI can be estimated with a straightforward three-step calculation. First, it's essential to identify reliable data on the energy consumption of AI models per response. For instance, a medium-length GPT-5 output is cited as using 19.3 watt-hours, while a GPT-4o response uses 1.75 watt-hours. The second step involves applying a 'water factor,' which industry studies suggest ranges from 1.3 to 2.0 milliliters of water per watt-hour. The lower end of this spectrum reflects highly efficient data centers with advanced cooling mechanisms, whereas the higher end accounts for average facilities. The final step is simple multiplication: Energy (Wh) × Water factor (ml/Wh) = Water used per query (ml). Applying this to GPT-5, using the higher water factor of 2.0 ml/Wh, results in roughly 39 milliliters per response. For GPT-4o, with the same factor, it's 3.5 milliliters. Even for the most efficient data centers, the figures are nontrivial: approximately 25 ml for GPT-5 and 2.3 ml for GPT-4o per query. When this is multiplied by the billions of queries processed daily, the environmental implications become substantial.
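The three-step estimate above can be sketched in a few lines of code. The energy figures and water factors are those cited in this article; the daily query volume at the end is a purely illustrative assumption, not a reported statistic.

```python
# Three-step estimate from the article: energy per response,
# a water factor (ml of cooling water per Wh), then multiplication.
ENERGY_WH = {
    "GPT-5 (medium response)": 19.3,
    "GPT-4o": 1.75,
}

# Range cited from industry studies: 1.3 ml/Wh for highly
# efficient data centers, 2.0 ml/Wh for average facilities.
WATER_FACTORS = {"efficient": 1.3, "average": 2.0}

def water_per_query_ml(energy_wh: float, factor_ml_per_wh: float) -> float:
    """Energy (Wh) x water factor (ml/Wh) = water used per query (ml)."""
    return energy_wh * factor_ml_per_wh

for model, wh in ENERGY_WH.items():
    low = water_per_query_ml(wh, WATER_FACTORS["efficient"])
    high = water_per_query_ml(wh, WATER_FACTORS["average"])
    print(f"{model}: {low:.1f}-{high:.1f} ml per query")

# Illustrative aggregate only: assuming (hypothetically) one billion
# GPT-4o-class queries per day at the average water factor.
assumed_daily_queries = 1_000_000_000
daily_liters = water_per_query_ml(1.75, 2.0) * assumed_daily_queries / 1000
print(f"Hypothetical daily total: {daily_liters:,.0f} liters")
```

Per query the numbers look negligible, which is the point of the final multiplication: scale, not individual usage, drives the cumulative footprint.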
Energy: The Core Challenge
While the specifics of water usage in AI may be subject to exaggeration, Sam Altman readily acknowledged that energy consumption represents the more profound environmental concern. He stated that the issue is not about individual queries but the aggregate energy demand as AI becomes more widely adopted globally. Altman stressed the urgent need for the industry to transition rapidly towards cleaner energy sources like nuclear power or renewable energy from wind and solar. He controversially compared the energy required to train an AI model to the energy expended in human development over 20 years, including lifelong sustenance. This analogy, equating technological training with human growth, drew sharp criticism from figures within the tech community, who rejected the comparison. However, Altman's underlying point highlighted that once AI models are trained, their subsequent operational energy requirements for generating responses (inference) can be relatively low, potentially rivaling human cognitive efficiency in terms of energy per task.
The Sustainability Paradox
Altman's statements have brought renewed attention to a fundamental question: can the rapid advancement of AI truly align with global climate objectives? Currently, data centers are significant energy consumers, reportedly using as much power as entire countries like Germany or France, and this demand is on an upward trajectory. Governments face the dual challenge of expanding clean energy infrastructure while also navigating public opposition to large-scale projects that could strain resources. A notable example is the San Marcos City Council in Texas, which recently rejected a $1.5 billion data center proposal following public outcry over its anticipated electricity demand and water usage. In response to these environmental pressures, technology companies are exploring innovative cooling solutions. Microsoft, for instance, is testing zero-water cooling systems that directly circulate liquid through chips, drastically reducing water evaporation. Other methods, such as immersion cooling, where servers are submerged in non-conductive fluids, are also under development. However, these advanced technologies are currently expensive and pose significant challenges for widespread implementation.
AI's Climate Equation
Sam Altman's defensive posture, while perhaps an effort to combat misinformation, inadvertently underscored a critical reality: the environmental footprint of AI remains poorly understood by the public. Even if the water and energy consumed per individual AI interaction seems minimal, the sheer scale of billions of daily operations results in an enormous cumulative impact. AI possesses significant potential to aid in climate change mitigation, offering tools for optimizing power grids, improving weather forecasting, and enhancing carbon capture technologies. Paradoxically, its own substantial demand for energy and water could undermine these very benefits. The ultimate challenge for Altman and the broader technology sector is not merely to address the misinformation surrounding AI's resource use, but to fundamentally develop and deploy AI systems that operate with greater efficiency and sustainability. This imperative grows more urgent as global resources dwindle and public patience wears thin.














