AI’s Huge Energy Bill: ChatGPT Uses Way More Power Than Your House

Harvansh

AI is everywhere these days, making things smarter and more automated.

But there’s a downside we don’t often hear about: AI, especially popular tools like ChatGPT from OpenAI, uses a ton of electricity. A report from The New Yorker highlights just how big this issue is.

ChatGPT, handling roughly 200 million requests a day, consumes about 500,000 kilowatt-hours daily.

That’s more than 17,000 times the electricity an average American home uses in a day. Imagine the electricity bill!
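As a quick sanity check on that comparison, here's a back-of-the-envelope calculation in Python. The ~29 kWh/day figure for an average U.S. home (about 10,600 kWh per year) is an assumption based on commonly cited U.S. averages, not a number from the report:

```python
# Back-of-the-envelope check of the "17,000 homes" comparison.
# Assumption: an average U.S. home uses ~29 kWh/day (~10,600 kWh/year).
CHATGPT_KWH_PER_DAY = 500_000
HOME_KWH_PER_DAY = 29

ratio = CHATGPT_KWH_PER_DAY / HOME_KWH_PER_DAY
print(f"ChatGPT uses roughly {ratio:,.0f}x the daily electricity of one home")
```

Run it and the ratio lands just above 17,000, which matches the claim.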

The Problem Gets Bigger

This is just the tip of the iceberg. If AI keeps getting woven into our daily digital lives, the amount of electricity needed is going to skyrocket.

For example, if Google decided to use AI like this in every search, it would need more electricity in a year than countries like Kenya or Guatemala do.

Alex de Vries, a data scientist, put it simply: AI servers can gulp down as much power as a whole block of houses in the UK.

Training AI: A Power-Hungry Process

When we talk about training AI like ChatGPT, the numbers get even more eye-opening. The energy used to train these models can match the annual electricity consumption of over 1,000 U.S. households. And every day, running these AIs equates to the energy use of about 33,000 U.S. homes. (Source: https://www.washington.edu/news/2023/07/27/how-much-energy-does-chatgpt-use/)

Solutions on the Horizon

The tech industry is starting to think green. Ideas like “following the sun” aim to use solar energy more efficiently for AI training.

Such strategies, along with AI’s potential to help in the fight against climate change, show a silver lining.

AI could help predict extreme weather or make industrial processes less carbon-intensive.

The Role of Renewable Energy

Building data centers in places with abundant renewable energy, like Norway and Iceland, offers another path forward.

These locations can provide green power at lower costs, helping to mitigate the environmental impact of AI’s energy needs.

What’s the Future Look Like?

Predicting how much power the AI industry will use is tough. Every AI model is different, and tech companies don’t like sharing their electricity bills.

But using Nvidia’s data, Alex de Vries estimated that by 2027, AI could be using up to 134 terawatt-hours a year. That’s about half a percent of the entire world’s electricity use.

To put that in perspective, it’s considerably more electricity than a giant like Samsung uses across all of its operations.
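The "half a percent" figure is easy to verify. The ~27,000 TWh estimate for annual global electricity consumption below is an assumption based on recent worldwide figures, not a number from de Vries's analysis:

```python
# Sanity check of de Vries's 2027 projection against world consumption.
# Assumption: global electricity use is roughly 27,000 TWh per year.
AI_TWH_2027 = 134
WORLD_TWH_PER_YEAR = 27_000

share_pct = AI_TWH_2027 / WORLD_TWH_PER_YEAR * 100
print(f"Projected AI share of world electricity: about {share_pct:.1f}%")
```

That works out to roughly 0.5%, consistent with the estimate above.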

No Comment from OpenAI

So far, OpenAI hasn’t responded to these eye-opening reports. The increasing energy demand of AI is a serious environmental concern.

If AI continues to grow at this pace, finding ways to make it more energy-efficient will be key to keeping our planet green.

Wrapping Up

As AI becomes an integral part of our lives, its energy consumption and environmental impact cannot be ignored.

But with innovative cooling techniques, strategic location planning, and a focus on renewable energy, there’s hope for a sustainable future in AI technology.

The challenge is significant, but so is the potential for positive change.

As an AI enthusiast, what are your thoughts on this? Let me know in the comments.

Harvansh Chaudhary is an AI enthusiast and entrepreneur, founder of UnrealShot AI, and creator of user-focused AI tools platform Saze AI that simplify tasks and offer practical benefits.