One of the most significant negative impacts of GenAI is the staggering amount of energy it consumes every time you use it.
Every day, ChatGPT uses as much electricity as roughly 100,000 typical US households.
According to OpenAI, ChatGPT has around 300 million weekly active users, and these users generate 1 billion queries per day.
A ChatGPT query uses about 0.0029 kilowatt-hours of electricity (compared with roughly 0.0003 kilowatt-hours for a Google search), so processing those queries and generating answers consumes around 2.9 million kilowatt-hours of energy every day.
The average US household uses about 29 kilowatt-hours of electricity per day, so in a single day ChatGPT consumes as much electricity as 100,000 such households.
Over a full year of use at that rate, ChatGPT's total energy consumption comes to about 1,058.5 GWh.
This is equivalent to the power consumption of a small country – in 2022, Barbados consumed around a thousand gigawatt-hours, a little less than a year of ChatGPT use.
The annual amount of energy used by ChatGPT is the equivalent of fully charging all the electric vehicles in the USA around 4.5 times.
Or you could power over 100,000 homes for a year. Or you could fully charge 223 million iPhones every day for the entire year.
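The arithmetic behind these figures can be checked with a quick back-of-envelope script. All the inputs below are the article's own estimates (queries per day, kWh per query, household usage), not independently measured values:

```python
# Back-of-envelope check of the article's energy estimates.
# All inputs are the article's stated figures, not measured data.
QUERIES_PER_DAY = 1_000_000_000    # ~1 billion ChatGPT queries per day
KWH_PER_QUERY = 0.0029             # estimated kWh per ChatGPT query
HOUSEHOLD_KWH_PER_DAY = 29         # average US household, kWh per day

daily_kwh = QUERIES_PER_DAY * KWH_PER_QUERY
household_multiple = daily_kwh / HOUSEHOLD_KWH_PER_DAY
annual_gwh = daily_kwh * 365 / 1_000_000  # kWh -> GWh

print(f"Daily use:          {daily_kwh:,.0f} kWh")       # 2,900,000 kWh
print(f"Household multiple: {household_multiple:,.0f}")  # 100,000
print(f"Annual use:         {annual_gwh:,.1f} GWh")      # 1,058.5 GWh
```

The numbers line up with the article: 2.9 million kWh per day, a 100,000-household equivalent, and roughly 1,058.5 GWh per year.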
And as OpenAI continues to roll out audio, video, and image-generation features, these figures are likely to ramp up even higher, and quickly.
Furthermore, OpenAI has developed new reasoning AI models, which are designed to spend more time thinking before responding, inevitably leading to even higher energy consumption per use.
The energy consumption for training GenAI models is also incredibly costly, and the computational power needed for AI’s growth is doubling roughly every 100 days.
In the meantime, the rush is on to develop more energy-efficient hardware and processors to reduce the energy consumption of AI models.
The information used in this article was based on research by BestBrokers.