Sam Altman Says Training a Human Takes Energy Too
The OpenAI CEO’s defense of AI’s environmental footprint reveals more about Silicon Valley’s relationship with humanity than it does about energy efficiency.
Sam Altman has a new talking point, and it’s a doozy.
Speaking at the India AI Impact Summit this week, the OpenAI CEO was asked about AI’s environmental footprint - the electricity, the water, the carbon. His response? It’s unfair to compare AI’s energy consumption to a human’s, because humans are inefficient too.
“It also takes a lot of energy to train a human,” Altman explained. “It takes like 20 years of life, and all the food you eat before that time, before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you.”
“Science and whatever”… sheesh.
Altman’s argument is that the “fair comparison” isn’t training cost; rather, it’s per-query efficiency once the system is running. “If you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human? And probably AI has already caught up on an energy efficiency basis, measured that way.”
This framing is revealing in ways Altman probably didn’t intend.
The Numbers Don’t Actually Help His Case
Let’s take Altman’s preferred comparison seriously. The human brain consumes roughly 20 watts, about the same as a dim light bulb. It does this while simultaneously regulating your heartbeat, processing sensory input, maintaining consciousness, and answering questions.
A single high-end GPU like NVIDIA’s H100 draws around 700 watts. A data center rack full of them draws enough power to run dozens of homes. And unlike a human brain, it can only do one thing: compute.
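To see why the per-query framing is slipperier than it sounds, here’s a rough back-of-envelope sketch. Every number in it is an illustrative assumption, not a measured figure: a person spending a minute answering at 20 watts, versus a single 700-watt H100 spending a couple of seconds on an inference request, with a modest allowance for data center overhead and nothing at all for training.

```python
# Back-of-envelope per-query energy comparison.
# All figures below are illustrative assumptions, not measured values.

BRAIN_WATTS = 20       # rough resting power draw of a human brain
HUMAN_SECONDS = 60     # assume a person spends a minute answering a question

GPU_WATTS = 700        # NVIDIA H100 peak board power
GPU_SECONDS = 2        # assumed GPU-seconds of inference per query
PUE = 1.3              # assumed data center overhead (cooling, power conversion)

human_joules = BRAIN_WATTS * HUMAN_SECONDS        # 1,200 J, about 0.33 Wh
gpu_joules = GPU_WATTS * GPU_SECONDS * PUE        # 1,820 J, about 0.51 Wh

print(f"Human answer:  {human_joules:,.0f} J ({human_joules / 3600:.2f} Wh)")
print(f"GPU inference: {gpu_joules:,.0f} J ({gpu_joules / 3600:.2f} Wh)")
```

Under those assumptions the two land in the same ballpark, which is roughly Altman’s point. The catch is that every input is contestable, the sketch ignores training runs and idle capacity entirely, and the human is doing a great deal more with those 20 watts than answering the question.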
The IEA estimates global data center electricity consumption at 415 TWh in 2024, about 1.5% of global electricity and roughly equivalent to France's total consumption. By 2030, that’s projected to more than double to 945 TWh under its base case scenario, with electricity demand from accelerated AI servers growing at around 30% annually.
In the United States alone, data centers consumed 183 TWh in 2024, which is more than 4% of total U.S. electricity consumption, and equivalent to Pakistan’s entire national demand. By 2030, that figure is projected to grow 133% to 426 TWh.
These aren’t numbers that suggest AI has “caught up” in terms of efficiency. They suggest AI is consuming resources at a pace that’s straining power grids, raising household electricity bills, and (despite Altman’s dismissals) using significant amounts of water for cooling.
The Comparison Is Philosophically Grotesque
But set aside the math for a moment. There’s something deeper going on in Altman’s framing.
When you reduce human development to its energy inputs - ”20 years of life and all the food you eat” - you’re adopting a very specific worldview. You’re treating humans as production systems. You’re measuring childhood, education, growth, relationships, and lived experience as costs to be optimized rather than as... life.
Sridhar Vembu, co-founder of Zoho Corporation, was at the summit and publicly pushed back, saying, “I do not want to see a world where we equate a piece of technology to a human being.”
Neither do I. And neither should marketing leaders who are trying to build trust with actual humans.
The comparison also conveniently ignores what humans produce beyond “inference queries.” A human who consumes resources for 20 years also creates art, builds relationships, raises children, provides care, participates in communities, and contributes to society in ways that can’t be measured in joules per response. A data center does none of these things. It answers questions and generates content. That’s it.
To say “it takes energy to train a human, too,” as a defense of AI’s environmental impact is like saying “it costs money to raise a child” as a defense of expensive furniture. The categories aren’t comparable because the value propositions aren’t comparable.
The Accountability Dodge
What Altman’s framing does is (unfairly) shift responsibility.
If AI’s resource consumption is comparable to human resource consumption, then AI companies aren’t doing anything unusual. They’re just participating in the normal business of existence. No special accountability required.
But AI companies are making choices. They’re choosing to train ever-larger models. They’re choosing to deploy data centers in locations with carbon-intensive power grids. They’re choosing to prioritize speed and capability over efficiency. These are business decisions with environmental consequences, and those consequences should be owned.
Altman acknowledged that total energy consumption is a legitimate concern, then immediately pivoted to saying the solution is for the energy sector to “move towards nuclear or wind and solar very quickly.” So, OpenAI gets to keep scaling, and if the environmental impact is bad, that’s someone else’s problem to solve. Convenient!
The IEA notes that data centers are one of the few sectors where emissions are projected to grow through 2030, even as most other sectors decarbonize. And that’s the result of choices made by companies that could choose differently.
The Water Question
Altman was particularly dismissive about water consumption, calling the concerns “totally fake” and describing claims that ChatGPT uses “17 gallons of water per query” as “completely untrue, totally insane.”
He’s right that the specific viral numbers are exaggerated. But the underlying concern isn’t fake at all.
Water companies have warned that water drawn for data center cooling is expected to triple over the next 25 years. A study published on ScienceDirect estimates AI’s water footprint could reach 312 to 765 billion liters in 2025, comparable to global annual bottled water consumption. The Berkeley Lab found U.S. data centers were responsible for 800 billion liters of indirect water consumption in 2023.
Altman noting that OpenAI has moved away from evaporative cooling doesn’t make water concerns “fake” for an industry that largely hasn’t. And it doesn’t address the water embedded in electricity generation, which varies dramatically by grid mix.
What This Means for Brand Leaders
If you’re a CMO or marketing executive, Altman’s comments matter for reasons beyond environmental ethics.
Consumer sentiment is shifting. The backlash to Altman’s comments was immediate and widespread. People don’t like being reduced to their resource consumption. They don’t like being told their concerns about AI’s environmental impact are “fake.” And they remember which companies dismiss those concerns versus which ones take them seriously.
Transparency gaps are closing. The lack of disclosure requirements for data center energy and water consumption is a policy gap that will eventually close. Companies building AI infrastructure today should assume that environmental reporting requirements are coming, and position accordingly.
The “efficiency” framing cuts both ways. If AI companies want to be measured on per-query efficiency, they should be prepared to disclose those numbers. Currently, most don’t. Google’s environmental report showed a 48% increase in greenhouse gas emissions since 2019, largely due to AI workloads. Transparency invites scrutiny.
Your AI vendor choices have environmental implications. Not all AI providers are equally efficient or equally powered by renewable energy. As environmental disclosure improves, these differences will become competitive differentiators. The questions you should be asking your AI vendors now: What’s your data center PUE (Power Usage Effectiveness)? What percentage of your power comes from renewable sources? What’s your water intensity per query?
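If PUE is unfamiliar, it’s simply the ratio of a facility’s total energy use to the energy that actually reaches the IT equipment; values near 1.0 mean almost no cooling or power-conversion overhead. A minimal sketch, with invented facility numbers purely for illustration:

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# The kWh figures below are invented purely for illustration.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return PUE; closer to 1.0 means less cooling and power-conversion overhead."""
    return total_facility_kwh / it_equipment_kwh

print(pue(total_facility_kwh=1_200_000, it_equipment_kwh=1_000_000))  # 1.2 - efficient
print(pue(total_facility_kwh=1_800_000, it_equipment_kwh=1_000_000))  # 1.8 - heavy overhead
```

A vendor that can’t answer the PUE question, or won’t, is telling you something too.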
The Deeper Issue
There’s a pattern in how AI leaders talk about externalities. Job displacement? AI will create new jobs. Environmental impact? Humans are inefficient too. Safety concerns? We’ll figure it out as we go.
Each of these responses shares a common structure: acknowledge the concern exists, then redirect responsibility elsewhere. The energy sector needs to decarbonize faster. Workers need to reskill. Governments need to regulate better. Meanwhile, the AI companies continue scaling.
Unfortunately, this is a familiar playbook across industries facing accountability for their impacts. But it’s worth noticing, and worth pushing back on, because the stakes keep rising.
Altman is right about one thing: the world probably does need to accelerate the transition to clean energy. But that’s an argument for all of us to make better choices, not an argument that AI companies get a pass on theirs.
Someone on Reddit responded to Altman’s comments with: “Sam watches the Terminator movies and roots for the Terminators.” Maybe a little harsh, but when you defend your product by explaining how inefficient humans are, you might want to think about how that sounds.
Ethicore Advisors Author’s Note
What stuck with me wasn’t the energy math; it was “science and whatever.” That throwaway phrase captures something about how casually Altman treats the accumulated knowledge and effort of human civilization. It’s all just inputs to him. Training data for the next model. Resources consumed on the way to something better.
I’ve spent thirty years in marketing, and I’ve watched a lot of executives defend indefensible positions. The tell is always when they start comparing their product to something sacred (like family, health, or human potential) as if the comparison itself grants legitimacy. The moment you hear “it takes a lot of energy to train a human,” you’re not listening to an argument anymore. You’re listening to someone who’s forgotten what humans are for.


