In the quest to develop powerful artificial intelligence (AI) algorithms, the energy consumption of these systems has long been a concern. However, a study by researchers from the University of California, Riverside and the University of Texas at Arlington sheds light on a lesser-known aspect of AI's environmental impact: its staggering water consumption.
In a preprint titled “Making AI Less Thirsty,” which has not yet been peer-reviewed, the researchers examine the water required to cool the data centers where AI models are trained. The study highlights the immense quantities of water needed to support AI work by industry giants such as OpenAI and Google.
When the researchers investigated the water used to cool data centers during the training of GPT-3, striking figures emerged: Microsoft, in collaboration with OpenAI, consumed roughly 185,000 gallons of water training GPT-3 alone. To put this into perspective, the authors liken it to the amount of water needed to cool a nuclear reactor.
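For scale, the gallon figure can be converted to litres with a simple unit conversion (the conversion factor is standard; the rounded total below is my own arithmetic, not a figure from the paper):

```python
# Convert the reported training water use from US gallons to litres.
GALLONS_TRAINING = 185_000
LITERS_PER_GALLON = 3.78541  # exact definition of the US gallon

liters = GALLONS_TRAINING * LITERS_PER_GALLON
print(f"{liters:,.0f} litres")  # roughly 700,000 litres
```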
The paper offers further comparisons: the water Microsoft used to cool its U.S.-based data centers during GPT-3’s training could have been used to manufacture “370 BMW cars or 320 Tesla electric vehicles.” These numbers would have tripled had the training taken place in the company’s larger data centers in Asia.
The water consumption does not end with training. Even during the inference phase of models like ChatGPT, the paper notes, the system requires the equivalent of a 500 ml bottle of water for a simple conversation of approximately 20-50 questions and answers. A single 500 ml bottle may seem inconsequential, but the cumulative footprint across ChatGPT’s enormous user base is undeniably significant.
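The scale of that cumulative footprint can be sketched with back-of-envelope arithmetic from the paper's per-conversation figure. The daily conversation volume below is a hypothetical assumption for illustration only, not a number from the paper:

```python
# Rough estimate of ChatGPT's inference water footprint, using the
# paper's figure of ~500 ml per 20-50 question conversation.
ML_PER_CONVERSATION = 500            # from the paper
QUESTIONS_PER_CONVERSATION = (20, 50)

def water_per_question_ml():
    """Return the (low, high) per-question water estimate in millilitres."""
    lo_q, hi_q = QUESTIONS_PER_CONVERSATION
    # More questions per conversation means less water per question.
    return ML_PER_CONVERSATION / hi_q, ML_PER_CONVERSATION / lo_q

def yearly_footprint_liters(daily_conversations, days=365):
    """Total litres consumed for a given daily conversation volume."""
    return daily_conversations * days * ML_PER_CONVERSATION / 1000

low, high = water_per_question_ml()
print(f"Per question: {low:.0f}-{high:.0f} ml")

# Hypothetical scale: 10 million conversations per day for a year.
print(f"Yearly total: {yearly_footprint_liters(10_000_000):,.0f} litres")
```

Even under this modest assumed volume, the annual total runs into the billions of litres, which is why the per-bottle framing understates the aggregate impact.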
Given the mounting concerns regarding water scarcity and the urgent need to address this issue, the researchers urge companies like Google and OpenAI to assume social responsibility and take the lead in reducing their water footprints. This entails a proactive approach to tackling the insatiable “thirst” of AI algorithms.
Solutions to this challenge remain elusive, however, and the researchers offer only limited guidance on how to navigate it. Nonetheless, it is vital for industry leaders to take stock and develop comprehensive strategies to curb the water consumption associated with AI systems.
As the world witnesses the ever-growing capabilities of AI, it becomes increasingly apparent that technological progress must be accompanied by sustainable practices. The development of groundbreaking AI algorithms, such as ChatGPT, should not come at the cost of depleting our precious water resources. By acknowledging the water footprint of AI and proactively addressing this issue, companies can pave the way for a more responsible and sustainable future.