Photo by Andrew Harnik / Getty Images

OpenAI CEO offers a truly bizarre comparison to justify AI’s enormous energy demands, but it’s not what you think

They don't see human life as sacrosanct.

OpenAI CEO Sam Altman offered a truly wild comparison this week to tackle concerns about AI’s massive energy demands, suggesting that training a human takes far more energy than training an AI. Altman, who was speaking at a major AI summit in India, openly addressed the ongoing debate about artificial intelligence’s environmental footprint.


First up, Altman tackled the buzz around AI’s water usage, outright calling concerns “totally fake.” He did acknowledge that water consumption used to be a real issue when data centers relied on evaporative cooling, but that’s apparently not the case anymore. He directly pushed back against claims floating around online, stating that ideas like “Don’t use ChatGPT, it’s 17 gallons of water for each query” are “completely untrue, totally insane, no connection to reality.”

However, Altman was quick to concede that the overall energy consumption of AI is a legitimate concern, though not necessarily on a per-query basis. He pointed out that the world is now using so much AI that it necessitates a swift global shift towards sustainable energy sources like nuclear, wind, and solar.

Altman’s argument lays bare Silicon Valley’s deranged outlook on humanity

When asked about a previous conversation with Bill Gates, where it was suggested that a single ChatGPT query uses the equivalent of 1.5 iPhone battery charges, Altman was quick to dismiss it. He stated there’s “no way it’s anything close to that much.”

Altman then voiced his frustration, claiming that many discussions about ChatGPT’s energy usage are “unfair.” He specifically called out comparisons that pit the energy needed to train an AI model against the energy a human uses to perform a single “inference query.” And this is where his argument really took a turn.

Altman argued that it also takes a significant amount of energy to “train a human.” He elaborated on this, saying it requires “like 20 years of life and all of the food you eat during that time before you get smart.” He didn’t stop there, either. He even included the “very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you.”

In Altman’s view, a truly “fair” comparison would be to measure how much energy it takes for a trained AI model to answer a question versus a trained human. He believes that, measured this way, AI has likely “already caught up on an energy efficiency basis.”

What makes this debate so hard to settle is that tech companies currently aren’t legally required to disclose how much energy and water their data centers consume, leaving scientists to estimate those impacts independently from the outside. It’s also worth noting that data centers have been linked to rising electricity prices in some regions, adding another layer to this complex conversation.


Attack of the Fanboy is supported by our audience. When you purchase through links on our site, we may earn a small affiliate commission. Learn more about our Affiliate Policy