
AI’s Huge Energy Bill: ChatGPT Uses Way More Power Than Your House


AI is everywhere these days, making things smarter and more automated.

But there’s a downside we don’t often hear about: AI, especially popular tools like ChatGPT from OpenAI, uses a ton of electricity. A report from The New Yorker highlights just how big this issue is.

ChatGPT, with its 200 million daily requests, eats up 500,000 kilowatt-hours every day.

That’s over 17,000 times what an average American home uses in the same period. Imagine the electricity bill!
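That "17,000 homes" figure checks out with simple arithmetic. Here's a quick back-of-envelope sketch, assuming an average U.S. home uses roughly 10,600 kWh per year (a commonly cited EIA-style figure; the exact number varies by source and year):

```python
# Back-of-envelope check on the "17,000 homes" comparison.
# Assumption: an average U.S. home uses about 10,600 kWh per year,
# which works out to roughly 29 kWh per day.
chatgpt_kwh_per_day = 500_000
home_kwh_per_day = 10_600 / 365  # ≈ 29 kWh per day

ratio = chatgpt_kwh_per_day / home_kwh_per_day
print(f"ChatGPT's daily draw ≈ {ratio:,.0f} average U.S. homes")
```

With that assumed household figure, the ratio lands just above 17,000, in line with the report.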

This is just the tip of the iceberg. If AI keeps getting woven into our daily digital lives, the amount of electricity needed is going to skyrocket.

For example, if Google decided to use AI like this in every search, it would need more electricity in a year than countries like Kenya or Guatemala do.

Alex de Vries, a data scientist, put it simply: AI servers can gulp down as much power as a whole block of houses in the UK.

When we talk about training AI like ChatGPT, the numbers get even more eye-opening. The energy used to train these models can match the annual electricity consumption of over 1,000 U.S. households. And every day, running these AIs equates to the energy use of about 33,000 U.S. homes.

The tech industry is starting to think green. Ideas like “following the sun” aim to use solar energy more efficiently for AI training.

Such strategies, along with AI’s potential to help in the fight against climate change, show a silver lining.

AI could help predict extreme weather or make industrial processes less carbon-intensive.

Building data centers in places with abundant renewable energy, like Norway and Iceland, offers another path forward.

These locations can provide green power at lower costs, helping to mitigate the environmental impact of AI’s energy needs.

Predicting how much power the AI industry will use is tough. Every AI model is different, and tech companies don’t like sharing their electricity bills.

But using Nvidia’s data, Alex de Vries estimated that by 2027, AI could be using up to 134 terawatt-hours a year. That’s about half a percent of the entire world’s electricity use.
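The "half a percent" claim is easy to sanity-check. Assuming global annual electricity consumption on the order of 25,000 TWh (a rough order-of-magnitude figure; actual totals vary by year and source):

```python
# Rough sanity check on the "about half a percent" claim.
# Assumption: world electricity consumption of ~25,000 TWh per year
# (order-of-magnitude estimate, not an official statistic).
ai_twh_2027_estimate = 134
world_twh = 25_000

share_percent = ai_twh_2027_estimate / world_twh * 100
print(f"Estimated AI share of global electricity ≈ {share_percent:.2f}%")
```

Under that assumption, 134 TWh comes out to roughly 0.5% of global consumption, matching de Vries's framing.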

To put that in perspective, it’s way more than what big companies like Samsung need for all their operations.

So far, OpenAI hasn’t responded to these eye-opening reports. The increasing energy demand of AI is a serious environmental concern.

If AI continues to grow at this pace, finding ways to make it more energy-efficient will be key to keeping our planet green.

As AI becomes an integral part of our lives, its energy consumption and environmental impact cannot be ignored.

But with innovative cooling techniques, strategic location planning, and a focus on renewable energy, there’s hope for a sustainable future in AI technology.

The challenge is significant, but so is the potential for positive change.

If you're an AI enthusiast, what are your thoughts on this? Let me know in the comment box.


Harvansh Chaudhary is the Founder and Author of this site. His passion for artificial intelligence drives him to delve into new AI tools, offering detailed articles and reviews for both enthusiasts and professionals.
