This Is How The AI Tech Behind ChatGPT Was Built – And It Used Up A Lot Of Water


Building advanced AI models like ChatGPT comes with hidden costs that tech giants such as Microsoft, OpenAI, and Google rarely disclose. These companies are racing to meet the demand for generative AI, but they seldom reveal the specifics of what that race consumes. The following are some of the costs faced by companies building advanced AI models such as ChatGPT:

Water Usage

One essential but often overlooked cost is water consumption. For instance, OpenAI, backed by Microsoft, required a substantial amount of water from the Raccoon and Des Moines rivers in central Iowa to cool a supercomputer used to teach its AI systems human-like writing skills. Without that cooling water, the machine could not have sustained the workload.

Referring to Microsoft's rising water use, “It’s fair to say the majority of the growth is due to AI,” including “its heavy investment in generative AI and partnership with OpenAI,” said Shaolei Ren, a researcher at the University of California, Riverside, who has been trying to calculate the environmental impact of generative AI products such as ChatGPT.

Electricity and Heat Generation

Developing large language models involves analyzing vast amounts of text data, which demands massive computing power. This process consumes a significant amount of electricity and generates substantial heat. Data centers often rely on water-cooling systems to prevent overheating, increasing their water usage.

“Most people are not aware of the resource usage underlying ChatGPT,” Ren said. “If you’re not aware of the resource usage, then there’s no way that we can help conserve the resources.”

Environmental Impact

Microsoft’s latest environmental report revealed a 34% surge in its global water consumption from 2021 to 2022, a spike attributed primarily to its AI work. Ren’s research estimates that ChatGPT consumes around 500 milliliters of water per interaction, with the figure varying by server location and season. This estimate includes indirect water usage, such as the water used to cool the power plants that supply electricity to data centers.
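The per-interaction figure above invites a back-of-envelope calculation. The sketch below is illustrative only, assuming the cited estimate of roughly 500 milliliters per interaction and treating an "interaction" as a short series of prompts (a range of 5 to 50 prompts per interaction is an assumption for this example, not a figure from the article):

```python
# Back-of-envelope sketch, not an official figure: estimate the water
# footprint of a batch of ChatGPT prompts.
ML_PER_INTERACTION = 500           # milliliters per interaction (cited estimate)
PROMPTS_PER_INTERACTION = (5, 50)  # assumed range of prompts per interaction

def water_footprint_liters(num_prompts: int) -> tuple[float, float]:
    """Return a (low, high) water estimate in liters for `num_prompts` prompts."""
    lo_prompts, hi_prompts = PROMPTS_PER_INTERACTION
    # Fewer prompts per interaction means more interactions, hence more water.
    high = (num_prompts / lo_prompts) * ML_PER_INTERACTION / 1000
    low = (num_prompts / hi_prompts) * ML_PER_INTERACTION / 1000
    return (low, high)

low, high = water_footprint_liters(1000)
print(f"1,000 prompts: roughly {low:.0f} to {high:.0f} liters")  # 10 to 100
```

Even at the low end, the numbers add up quickly once multiplied across millions of daily users, which is why researchers like Ren argue the figure deserves public scrutiny.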

Google also reported a 20% increase in water use, mainly linked to its AI efforts. Google’s water consumption varied by location, with noticeable spikes in Oregon, Iowa, and outside Las Vegas.

Efforts Toward Sustainability

In response to environmental concerns, Microsoft and OpenAI expressed their commitment to research and improve AI’s energy and carbon footprint. They aim to enhance efficiency in both training and application processes, reduce emissions, and increase their use of clean energy to power data centers.

The Role of Location

The choice of location plays a crucial role in managing the environmental impact of AI infrastructure. Microsoft selected West Des Moines, Iowa, for its AI supercomputing data center largely because of the climate: Iowa is cool enough for most of the year that the facility can be cooled with outside air, sharply reducing water consumption. Only when the outside temperature exceeds a certain threshold does the center draw water for cooling.
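The free-cooling strategy described above amounts to a simple decision rule. The sketch below is purely illustrative, not Microsoft's actual control logic; the switchover temperature is an assumed value chosen for the example:

```python
# Illustrative sketch of an outside-air ("free") cooling decision rule.
# The threshold is an assumption for illustration, not a reported figure.
FREE_COOLING_MAX_C = 29.0  # assumed switchover temperature in Celsius

def cooling_mode(outside_temp_c: float) -> str:
    """Choose a cooling mode based on the outside air temperature."""
    if outside_temp_c <= FREE_COOLING_MAX_C:
        return "outside-air"  # cool with ambient air; no water draw
    return "evaporative"      # hotter days fall back to water-based cooling

print(cooling_mode(18.0))  # a mild Iowa day: outside-air
print(cooling_mode(33.0))  # a hot summer day: evaporative
```

In a climate where most days fall below the threshold, water is drawn only during the hottest stretches of the year, which is the trade-off that makes a cooler location attractive.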

The race to develop AI models like ChatGPT carries substantial hidden costs, including increased water consumption and electricity usage. Companies like Microsoft and OpenAI are working to address these environmental concerns and reduce their impact while continuing to advance AI technology. As the demand for AI tools continues to grow, tech developers need to prioritize sustainability and minimize their environmental footprint.

