
When GPT-5 landed on the scene in August 2025, AI enthusiasts were awestruck by its leap in intelligence, its subtler writing, and its multimodal capabilities. From writing complex code to solving graduate-level science questions, the model pushed the boundaries of what AI can accomplish. And yet, in the wings, a high-stakes battle raged, not over what GPT-5 could do, but over what it requires to do it.
OpenAI, widely recognized as a leader in artificial intelligence, made a contentious decision: it would not disclose GPT-5's energy consumption figures, a departure from openness that is alarming many researchers and environmentalists. Independent benchmarking by the University of Rhode Island's AI Lab indicates that a typical medium-length response may require as much as 40 watt-hours of electricity, several times more than its predecessor, GPT-4o, and as much as 20 times the consumption of earlier versions.
This increase in power usage is no technical aside. With ChatGPT projected to handle 2.5 billion requests a day, GPT-5's daily electricity appetite may match that of 1.5 million US homes. The power draw and associated carbon footprint of high-end AI models are quickly eclipsing those of most other consumer electronics, and the data centres housing these models are straining local energy grids.
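The household comparison above is a back-of-envelope calculation. A minimal sketch of the arithmetic, assuming the benchmark's 40 watt-hour upper bound per response and an average US household draw of roughly 29 kWh per day (both assumptions drawn from public benchmarks and utility averages, not OpenAI figures):

```python
# Back-of-envelope check of the article's figures.
# Assumptions (not OpenAI disclosures): 40 Wh is the URI benchmark's
# upper bound per medium-length response; an average US home uses
# about 29 kWh of electricity per day (~10,500 kWh/year).

QUERIES_PER_DAY = 2.5e9        # projected ChatGPT daily requests
WH_PER_QUERY_UPPER = 40.0      # upper-bound watt-hours per response
HOME_KWH_PER_DAY = 29.0        # assumed average US household draw

daily_kwh = QUERIES_PER_DAY * WH_PER_QUERY_UPPER / 1000.0
daily_gwh = daily_kwh / 1e6
homes_equiv = daily_kwh / HOME_KWH_PER_DAY

print(f"Daily energy at the upper bound: {daily_gwh:.0f} GWh")
print(f"Equivalent US homes: {homes_equiv / 1e6:.1f} million")
# Note: the article's 1.5-million-home figure implies a lower blended
# average per query than the 40 Wh upper bound used here.
```

At the upper bound this works out to roughly 100 GWh a day; the article's more conservative 1.5-million-home comparison presumably reflects a lower average energy cost across the full mix of short and long queries.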
What Powers the Boom?
Why such dramatic growth? GPT-5's sophisticated reasoning demands heavy computation: it draws on an enormous number of neural parameters and processes text, images, and video in a single multimodal system. Even with streamlined hardware and newer "mixture-of-experts" architectures that activate only selected sections of the model for each query, sheer scale sends resource usage through the roof. Researchers consistently find that larger AI models carry greater energy costs, and OpenAI itself hasn't published definitive parameter counts for years.

The Call for Accountability
OpenAI's refusal to release GPT-5's energy figures resonates throughout the sector: transparency is lagging behind innovation. With AI increasingly woven into daily life, supporting physicians, coders, students, and creatives, society faces pressing questions: How do we weigh AI's value against its carbon impact? What regulations or standards should govern energy disclosure? Can AI design reconcile capability with sustainability?

Learning for the Future
The tale of GPT-5 is not so much one of technological advancement as one of responsible innovation. It teaches us that each step forward for artificial intelligence carries trade-offs, seen and unseen. If the AI community is to build a more sustainable future, energy transparency may soon matter as much as model performance.

Let's keep asking not just "how smart is our AI?" but also "how green is it?" As the next generation of language models emerges, those questions could set the course for this revolutionary technology.
Disclaimer Statement: This content is authored by a 3rd party. The views expressed here are those of the respective authors/entities and do not represent the views of Economic Times (ET). ET does not guarantee, vouch for or endorse any of its contents nor is responsible for them in any manner whatsoever. Please take all steps necessary to ascertain that any information and content provided is correct, updated, and verified. ET hereby disclaims any and all warranties, express or implied, relating to the report and any content therein.