Image by UMA media on Pexels.

GPT-5.5 launched this week and the company says it can finish your job for you, but scientists say the hidden cost is something you drink every day

OpenAI’s latest model, GPT-5.5, launched this week with the company billing it as its most capable release to date. As detailed by LADbible, GPT-5.5 is designed to handle multi-step tasks autonomously, researching online, operating software, and moving across tools until a job is completed. What receives far less attention is the resource cost attached to every interaction users have with systems like this one.

OpenAI has positioned GPT-5.5 as a partner for computer-based work, capable of navigating ambiguity and executing complex tasks without step-by-step instruction. The company describes it as its smartest and most intuitive model yet. That level of capability, however, depends on an infrastructure that consumes resources at a scale most users never see.

The most immediate environmental cost is water. Reports indicate that for every 10 to 50 queries sent to a system like ChatGPT, roughly 500ml of water is consumed indirectly through the cooling systems required to manage hardware heat inside data centers. Chilled water is pumped continuously through these facilities to absorb the intense thermal output generated by the computing equipment processing each request.
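The reported figure works out to a small but non-trivial amount per query. A quick back-of-envelope sketch, using only the 500ml-per-10-to-50-queries range cited above (the 40-queries-per-day usage figure is an illustrative assumption, not reported data):

```python
# Per-query water estimate derived from the reported range:
# roughly 500 ml of cooling water per 10-50 queries.
ML_PER_BATCH = 500            # ml of water per batch of queries (reported)
QUERIES_PER_BATCH = (10, 50)  # low and high end of the reported batch size

def water_per_query_ml():
    """Return (min, max) ml of water consumed per single query."""
    lo_queries, hi_queries = QUERIES_PER_BATCH
    # More queries sharing the same 500 ml means less water per query,
    # so the high query count yields the minimum per-query estimate.
    return ML_PER_BATCH / hi_queries, ML_PER_BATCH / lo_queries

def daily_water_litres(queries_per_day):
    """Estimated (min, max) litres of water for a day's usage."""
    per_min, per_max = water_per_query_ml()
    return queries_per_day * per_min / 1000, queries_per_day * per_max / 1000

lo, hi = water_per_query_ml()
print(f"Per query: {lo:.0f}-{hi:.0f} ml")            # 10-50 ml per query
d_lo, d_hi = daily_water_litres(40)                  # hypothetical 40 queries/day
print(f"40 queries/day: {d_lo:.1f}-{d_hi:.1f} L")    # 0.4-2.0 litres
```

At the high end, a single active user's daily queries can indirectly consume about as much water as a large glass of it, which is the comparison driving the headline.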

The electricity demands of AI data centers are growing faster than the grid can accommodate them

Electricity consumption is the larger concern. Data centers globally consumed an estimated 460 terawatt-hours in 2022, placing them 11th among the world’s largest electricity consumers. Projections cited by MIT suggest that figure could approach 1,050 terawatt-hours by 2026, which would vault data centers into fifth place globally, between Japan and Russia. The power density required for generative AI workloads is estimated to be seven to eight times higher than conventional computing, and much of that energy still comes from fossil fuel-based power plants.
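To put the projected jump in perspective, the two cited figures imply a steep compound growth rate. A minimal illustration using only the 460 TWh (2022) and roughly 1,050 TWh (2026) numbers above; this is simple compound-growth arithmetic, not a forecast model:

```python
# Implied growth rate between the cited data-center electricity figures.
TWH_2022 = 460              # global data-center consumption, 2022 (cited)
TWH_2026_PROJECTED = 1050   # projection cited by MIT for 2026
YEARS = 2026 - 2022

growth_factor = TWH_2026_PROJECTED / TWH_2022    # ~2.28x over four years
annual_rate = growth_factor ** (1 / YEARS) - 1   # implied compound annual rate

print(f"Total growth: {growth_factor:.2f}x")
print(f"Implied annual growth: {annual_rate:.1%}")   # roughly 23% per year
```

A sustained growth rate in that range is far faster than typical grid capacity expansion, which is why the mismatch in the subheading above is a concern.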

Individual query costs compound the problem. A single prompt to a tool like ChatGPT is estimated to consume roughly five times more electricity than a standard web search. Each new model release also renders the energy spent training previous versions effectively obsolete, contributing to a cycle of waste that accelerates with every competitive product launch.

There are also indirect costs tied to hardware. The high-performance GPUs required to run these models demand complex fabrication processes, toxic chemicals, and mining operations that carry their own ecological footprint. Millions of these units are shipped to data centers annually, and the emissions from manufacturing and transport add to the cumulative impact.

OpenAI and other companies have argued that AI holds the potential to accelerate climate solutions, pointing to partnerships with institutions such as US National Laboratories focused on energy breakthroughs. As Elsa A. Olivetti, a professor at MIT, notes, the environmental consequences of generative AI are systemic and persistent, extending well beyond the electricity consumed at the point of use. The gap between that stated potential and the current reality of daily, large-scale usage remains significant.


Attack of the Fanboy is supported by our audience. When you purchase through links on our site, we may earn a small affiliate commission. Learn more about our Affiliate Policy
Author
Saqib Soomro
Politics & Culture Writer
Saqib Soomro is a writer covering politics, entertainment, and internet culture. He spends most of his time following trending stories, online discourse, and the moments that take over social media. He is an LLB student at the University of London. When he’s not writing, he’s usually gaming, watching anime, or digging through law cases.