
Decoding GPT-5’s Energy Drain: A Wake-Up Call for Sustainable AI


The release of OpenAI’s GPT-5 is a monumental step forward in AI, but it is also a sobering reminder of the industry’s growing environmental impact. With the company tight-lipped about the model’s resource usage, experts are raising serious concerns. They argue that GPT-5’s enhanced capabilities, from building websites to answering complex academic questions, come with a steep and unprecedented energy cost. This lack of transparency from a major AI developer is prompting a crucial discussion about the industry’s commitment to building a sustainable future.
A key piece of evidence comes from a study by the University of Rhode Island’s AI lab, which found that generating a single medium-length response of about 1,000 tokens with GPT-5 consumes an average of 18 watt-hours. This is a dramatic increase over previous models. To put it in perspective, 18 watt-hours is the equivalent of a 60-watt incandescent light bulb burning for 18 minutes. Given that a service like ChatGPT handles billions of requests daily, the aggregate consumption could be staggering, potentially reaching the daily electricity demand of millions of homes.
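The arithmetic behind that "millions of homes" claim can be sketched in a few lines. The 18 Wh figure is from the study cited above; the daily request volume and per-home consumption are illustrative assumptions, since OpenAI does not publish exact traffic numbers:

```python
# Back-of-envelope estimate of aggregate energy use.
# 18 Wh per ~1,000-token response is the reported study figure;
# the request volume and household figure are assumptions.
WH_PER_RESPONSE = 18          # watt-hours per medium-length reply
REQUESTS_PER_DAY = 2.5e9      # assumed daily request volume

daily_wh = WH_PER_RESPONSE * REQUESTS_PER_DAY
daily_gwh = daily_wh / 1e9    # watt-hours -> gigawatt-hours

# A typical U.S. home uses roughly 29 kWh of electricity per day.
HOME_KWH_PER_DAY = 29
homes_equivalent = (daily_wh / 1000) / HOME_KWH_PER_DAY

print(f"{daily_gwh:.0f} GWh/day, roughly {homes_equivalent / 1e6:.1f} million homes")
```

Under these assumptions the service would draw about 45 GWh per day, on the order of 1.5 million homes, which is why the aggregate figure dwarfs the modest-sounding 18 Wh per reply.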
The increase in energy use is directly linked to the model’s size and complexity. Experts believe GPT-5 is significantly larger than its predecessors, with a greater number of parameters. This aligns with a study by French AI company Mistral, which found a strong correlation between a model’s size and its energy consumption. A model 10 times bigger, the study concluded, will have an impact that is an order of magnitude larger. This principle seems to be holding true for GPT-5, with some experts suggesting its resource use could be “orders of magnitude higher” than even GPT-3.
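The size-energy correlation described above can be illustrated with a simple linear scaling model. Both the parameter counts and the per-parameter cost below are hypothetical placeholders, not disclosed figures for any real model:

```python
# Illustrative sketch of the size-energy correlation: if energy per
# response scales roughly linearly with parameter count, a model ten
# times bigger costs about an order of magnitude more per response.
# All numbers here are assumptions for illustration.
def estimated_wh(params_billions, wh_per_billion_params=0.01):
    """Energy per response under a simple linear scaling assumption."""
    return params_billions * wh_per_billion_params

small = estimated_wh(100)    # hypothetical 100B-parameter model
large = estimated_wh(1000)   # hypothetical 1T-parameter model
print(large / small)         # ratio of per-response energy costs
```

The ratio comes out to exactly 10 under linear scaling, matching the study's conclusion that a tenfold size increase yields an order-of-magnitude larger impact.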
Compounding the issue is the new model’s architecture. While it does use a “mixture-of-experts” system to improve efficiency, its reasoning capabilities and ability to handle video and images likely counteract these gains. The “reasoning mode,” which involves the model computing for a longer time before generating a response, could make its resource footprint several times greater than text-only operations. This combination of size, complexity, and advanced features paints a clear picture of an AI system with a voracious appetite for power, leading to urgent calls for greater transparency from OpenAI and the broader AI industry.
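The reasoning-mode multiplier can be made concrete with a token-counting sketch: if energy scales with the total number of tokens the model processes, then hidden reasoning tokens add directly to the cost of each reply. The 4,000-token reasoning budget below is an illustrative assumption:

```python
# Rough sketch of why "reasoning mode" multiplies the energy footprint,
# assuming cost is proportional to total tokens processed.
WH_PER_1K_TOKENS = 18  # reported average for a ~1,000-token reply

def response_energy_wh(visible_tokens, reasoning_tokens=0):
    """Energy estimate: proportional to visible plus hidden tokens."""
    total_tokens = visible_tokens + reasoning_tokens
    return (total_tokens / 1000) * WH_PER_1K_TOKENS

plain = response_energy_wh(1000)                           # text-only reply
reasoned = response_energy_wh(1000, reasoning_tokens=4000) # with reasoning
print(reasoned / plain)
```

With a 4,000-token hidden reasoning pass, the same visible reply costs five times as much energy, consistent with the "several times greater" footprint experts describe.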
