Report: Creating a 5-second AI video is like running a microwave for an hour

We know that AI tools like ChatGPT consume significant energy, and a new report from the MIT Technology Review shows exactly how much.

May 22, 2025 - 02:00

You've probably heard the statistic that every search on ChatGPT uses the equivalent of a bottle of water. And while that's technically true, it misses some of the nuance.

The MIT Technology Review dropped a massive report that reveals how the artificial intelligence industry uses energy — and exactly how much energy it costs to use a service like ChatGPT.

The report determined that the energy cost of large-language models like ChatGPT ranges anywhere from 114 joules per response to 6,706 joules per response — that's the difference between running a microwave for one-tenth of a second and running it for eight seconds. The lower-energy models, according to the report, use less energy because they use fewer parameters, which also means their answers tend to be less accurate.

It makes sense, then, that AI-produced video takes a whole lot more energy. According to the MIT Technology Review's investigation, to create a five-second video, a newer AI model uses "about 3.4 million joules, more than 700 times the energy required to generate a high-quality image". That's the equivalent of running a microwave for over an hour.

The researchers tallied up the amount of energy it would cost if someone, hypothetically, asked an AI chatbot 15 questions, requested 10 images, and generated three five-second videos. The answer? Roughly 2.9 kilowatt-hours of electricity, which is the equivalent of running a microwave for over 3.5 hours.
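The microwave comparisons above are just unit conversions: divide the energy in joules by the microwave's power draw in watts to get seconds of runtime. As a rough sanity check, here's a short sketch in Python; the ~800-watt microwave wattage is our assumption (the report doesn't specify one), chosen because it makes the article's figures line up.

```python
# Sanity-checking the article's microwave comparisons.
# Assumes a typical ~800 W microwave -- this wattage is an assumption,
# not a figure from the report.

MICROWAVE_WATTS = 800  # assumed power draw of a typical microwave

def microwave_seconds(joules: float) -> float:
    """Seconds a microwave would run to consume the given energy."""
    return joules / MICROWAVE_WATTS

# Text responses: 114 J to 6,706 J per response
print(microwave_seconds(114))    # ~0.14 s, about a tenth of a second
print(microwave_seconds(6_706))  # ~8.4 s, about eight seconds

# A five-second AI video: ~3.4 million joules
print(microwave_seconds(3.4e6) / 3600)  # ~1.2 hours

# Hypothetical session: 2.9 kWh, and 1 kWh = 3.6 million joules
print(microwave_seconds(2.9 * 3.6e6) / 3600)  # ~3.6 hours
```

Under that assumption, the numbers reproduce the article's comparisons: a fraction of a second for a cheap text response, roughly an hour for one five-second video, and over 3.5 hours for the hypothetical session.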

The investigation also examined the rising energy costs of the data centers that power the AI industry.

The report found that prior to the advent of AI, the electricity usage of data centers was largely flat thanks to increased efficiency. However, due to energy-intensive AI technology, the energy consumed by data centers in the United States has doubled since 2017. And according to government data, half the electricity used by data centers will go toward powering AI tools by 2028.

This report arrives at a time in which people are using generative AI for absolutely everything. Google announced at its annual I/O event that it's leaning into AI with fervor. Google Search, Gmail, Docs, and Meet are all seeing AI integrations. People are using AI to lead job interviews, create deepfakes of OnlyFans models, and cheat in college. And all of that, according to this in-depth new report, comes at a pretty high cost.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.