OpenAI’s CEO, Sam Altman, has addressed concerns about the energy consumption of artificial intelligence, suggesting that training a human also demands significant energy resources.
“There’s often an unfair comparison,” Altman explained, “where people look at the energy required to train an AI model versus the energy needed for a human to perform a similar task.”
According to the International Energy Agency (IEA), data centers worldwide consumed approximately 460 terawatt-hours (TWh) in 2022 alone.
Altman dismissed worries about water usage as “completely unfounded,” though he acknowledged past practices, saying, “We used to rely on evaporative cooling in data centers.”
“We no longer do that,” he clarified. “There’s a lot of misinformation online, like claims saying ‘don’t use ChatGPT because it consumes 17 gallons of water per query,’ which are simply not true.”
“What is fair, though, is the energy consumption — not per query, but in total, because the world is now using so much AI,” he said.
“It is real, and we need to move towards nuclear, or wind and solar, very quickly.”