
The Dark Side of LLMs: Rising Energy and Water Demands Spark Sustainability Fears


Drew Robb

Training AI models and running inference consume vast amounts of energy and water. How can this usage be reduced?

Image generated by Google’s Nano Banana

Modern large language models (LLMs) such as ChatGPT and its variants crunch through hundreds of billions, if not trillions, of parameters as they search for answers.

As these models scale, the strain on the underlying infrastructure becomes impossible to ignore.

“With the rise of AI, demands on IT infrastructure, including storage, power, and cooling, are rapidly intensifying,” Rich Gadomski, director of Channel Sales and New Business Development at FUJIFILM North America Corp., told TechRepublic. “These demands create pressure to control costs, reduce energy consumption, and minimize carbon footprints.”

This has raised serious concerns about energy and water usage. OpenAI CEO Sam Altman recently tried to allay such fears by saying the average query uses 10 times less energy than previously ...


Copyright of this story solely belongs to techrepublic.com.