The annual energy ChatGPT uses to generate responses is the equivalent of charging 47.87 million smartphones every day for a full year
The enormous energy demands of large language models (LLMs) have been laid bare in a new report.
The research reveals that ChatGPT’s electricity demands are so high that they are already putting pressure on power grids, raising carbon emissions, and exacerbating environmental problems.
A single ChatGPT query uses 2.9 watt-hours of energy, roughly ten times the 0.3 watt-hours required for a typical Google search.
According to the report, OpenAI’s chatbot consumes approximately 226.82 million kilowatt-hours of electricity per year solely to answer user prompts.
ChatGPT has 100 million active weekly users asking an average of 15 questions each week. That translates to more than 214 million requests per day, consuming over half a million kilowatt-hours of energy daily.
That figure is 21,602 times the daily electricity usage of an average US household, which consumes around 29 kWh per day.
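The report’s headline figures follow directly from its own inputs. A quick sanity-check sketch (the constant names are mine; the values are the estimates quoted in the article, not independent measurements):

```python
# Reproduce the report's arithmetic from the quoted inputs.
WH_PER_QUERY = 2.9            # watt-hours per ChatGPT query (report's estimate)
WEEKLY_USERS = 100_000_000    # active weekly users
QUERIES_PER_USER_WEEK = 15    # average questions per user per week

daily_requests = WEEKLY_USERS * QUERIES_PER_USER_WEEK / 7
daily_kwh = daily_requests * WH_PER_QUERY / 1000   # convert Wh to kWh
annual_kwh = daily_kwh * 365

print(f"{daily_requests:,.0f} requests per day")        # more than 214 million
print(f"{daily_kwh:,.0f} kWh per day")                  # over half a million
print(f"{annual_kwh / 1e6:.2f} million kWh per year")   # ~226.82 million
```

Running this recovers the article’s daily and annual totals almost exactly, which suggests the report derived its yearly figure by simple extrapolation from the per-query estimate.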
ChatGPT’s energy use for processing requests surpasses the total electricity consumption of 12 small countries and territories, and could power the entirety of Finland or Belgium for a full day.
The research was distributed by US-based agency NewsworthyData.
The issue of how much energy Generative AI services consume was also investigated in an episode of the BBC Sounds programme, The Artificial Human.
The episode, called How green is my AI?, reveals that each request to generate an image using GenAI consumes as much energy as charging a mobile phone’s battery to 50%.