The Price of Progress: GPT-5’s Energy Demands Under Scrutiny

by admin477351

OpenAI’s GPT-5 has been hailed as a major leap forward in AI, but the celebration is being tempered by concerns over its immense energy consumption. While the company has provided no official data on the model’s resource use, independent researchers are stepping in to fill the information gap. Their findings suggest that the new model’s enhanced capabilities come at a steep environmental price, raising critical questions about the long-term sustainability of the AI industry.

The energy figures are eye-opening. A team at the University of Rhode Island’s AI lab found that a medium-length GPT-5 response consumes an average of 18 watt-hours, a substantial increase over previous models and “significantly more energy than GPT-4o,” according to one researcher. To put this in perspective, 18 watt-hours is enough to run a 60-watt incandescent light bulb for 18 minutes. With ChatGPT reportedly handling 2.5 billion requests a day, the cumulative daily consumption could match the daily electricity use of roughly 1.5 million US homes, a figure that brings the environmental cost of AI into sharp focus.
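The back-of-envelope arithmetic behind those comparisons is straightforward. The sketch below reproduces it in Python; the 60-watt bulb rating and the roughly 30 kWh-per-day average US household consumption are assumptions typical of such comparisons, not figures stated by the researchers.

```python
# Back-of-envelope check of the reported comparisons.
# Assumed values (not from the URI team's statement): a 60 W incandescent bulb
# and an average US household consumption of ~30 kWh per day.

ENERGY_PER_RESPONSE_WH = 18           # reported average for a medium-length GPT-5 reply
REQUESTS_PER_DAY = 2.5e9              # reported daily ChatGPT request volume
BULB_WATTS = 60                       # assumed incandescent bulb rating
HOUSEHOLD_KWH_PER_DAY = 30            # assumed average US household daily use

# How long one 18 Wh response could keep the bulb lit
bulb_minutes = ENERGY_PER_RESPONSE_WH / BULB_WATTS * 60
print(f"Bulb runtime: {bulb_minutes:.0f} minutes")            # ~18 minutes

# Total daily energy if every request cost 18 Wh
total_gwh_per_day = ENERGY_PER_RESPONSE_WH * REQUESTS_PER_DAY / 1e9
print(f"Total: {total_gwh_per_day:.0f} GWh per day")          # ~45 GWh

# Equivalent number of average US homes powered for a day
homes_millions = total_gwh_per_day * 1e6 / HOUSEHOLD_KWH_PER_DAY / 1e6
print(f"Equivalent homes: {homes_millions:.1f} million")      # ~1.5 million
```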

This surge in power consumption is a direct result of the model’s size and complexity. While OpenAI has not released a parameter count for GPT-5, experts believe it is “several times larger than GPT-4.” That belief is reinforced by a study from the French AI company Mistral, which established a “strong correlation” between a model’s size and its energy consumption: a model ten times larger produces impacts roughly an order of magnitude greater per query. This suggests that the trend of building ever-larger AI models, championed by many in the industry, will continue to drive up resource usage.
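Mistral’s reported finding amounts to a roughly proportional relationship between model size and per-query impact. A minimal sketch of that scaling assumption follows; the baseline value is an illustrative placeholder, not a measured figure.

```python
# Sketch of the proportional-scaling relationship attributed to Mistral's study:
# a model N times larger has roughly N times the per-query impact.

def estimated_impact_wh(baseline_wh: float, size_multiplier: float) -> float:
    """Scale per-query energy linearly with model size (the reported 'strong correlation')."""
    return baseline_wh * size_multiplier

BASELINE_WH = 2.0  # hypothetical per-query energy of a reference model, in Wh
for multiplier in (1, 10, 100):
    wh = estimated_impact_wh(BASELINE_WH, multiplier)
    print(f"{multiplier:>3}x larger model -> ~{wh:.0f} Wh per query")
```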

The new features of GPT-5 further contribute to its high energy demands. Its “reasoning mode” and its ability to process video and images require far more computation than simple text generation. A professor studying the resource footprint of AI models noted that using the reasoning mode could increase resource usage by a factor of “five to 10.” So even though a “mixture-of-experts” architecture, which activates only a subset of a model’s parameters for each query, recovers some efficiency, these more demanding tasks are pushing the overall energy footprint to new heights. The growing calls for transparency from AI developers are a direct response to this environmental concern.
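If a reasoning pass really does multiply resource usage by five to ten times, its effect on the earlier per-response figure is easy to sketch. The baseline below reuses the URI team’s 18 Wh average; whether that measurement already includes reasoning-mode traffic is not stated, so treat the result as illustrative.

```python
# Rough sketch of the quoted "five to 10" reasoning-mode multiplier applied to
# the reported 18 Wh average. Illustrative only: the 18 Wh figure may or may
# not already reflect reasoning-mode responses.

BASELINE_RESPONSE_WH = 18             # reported average per medium-length response
REASONING_MULTIPLIER_RANGE = (5, 10)  # professor's estimated increase factor

low = BASELINE_RESPONSE_WH * REASONING_MULTIPLIER_RANGE[0]
high = BASELINE_RESPONSE_WH * REASONING_MULTIPLIER_RANGE[1]
print(f"Reasoning-mode response: roughly {low}-{high} Wh")   # 90-180 Wh
```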
