“Our collaboration with OpenAI is likely to generate tens of billions in revenue for AMD, while also speeding up OpenAI’s development of AI infrastructure,” stated AMD’s chief financial officer, Jean Hu.
The deal was announced just days after OpenAI CEO Sam Altman disclosed plans to build AI data centers around the world. These centers could consume up to 10 gigawatts of power, on top of roughly another 17 gigawatts OpenAI has already announced.
That amount is comparable to the 10 gigawatts New York City uses in the summer, and double the five gigawatts San Diego used during its 2024 heat wave.
Experts have also likened OpenAI's energy consumption to the amount required by entire countries.
“Ten gigawatts is more than the peak electrical demand of countries like Switzerland or Portugal, and seventeen gigawatts could essentially power both nations at once,” explained Cornell University professor Fengqi You.
Andrew Chien, a professor of computer science at the University of Chicago, told Fortune that the energy usage might match what “the entire economy consumes.”
“I’ve been a computer scientist for 40 years, and for most of that time, computing was the tiniest piece of our economy’s power use,” Chien said. “A year and a half ago, they were talking about five gigawatts. Now they’ve upped the ante to 10, 15, even 17. There’s an ongoing escalation.”
“While there are beneficial applications of this technology, it is frequently used to substitute human labor without providing real benefits to individuals,” she commented.
“Additionally, more generative AI doesn’t necessarily mean more accurate or safer technology.” 
Altman recently claimed ChatGPT has surpassed 800 million weekly active users.
“More than 800 million people use ChatGPT every week, and we process over 6 billion tokens per minute on the API,” he said. “Thanks to all of you, AI has gone from something people play with to something people build with every day.”