Everyone is worried about AI’s growing impact on the environment. That’s warranted, but the reality is that watching Netflix for an hour creates 500 times more CO₂ than sending two text prompts to Gemini or ChatGPT.
The International Energy Agency estimates that data centers, AI and crypto together consumed about 460 terawatt-hours of electricity in 2022, and projects that figure could exceed 1,000 TWh by 2026. The U.S. Department of Energy, meanwhile, says data centers consumed about 4.4% of all U.S. electricity in 2023 and expects that share to reach 6.7% to 12% by 2028. So AI and data centers are sucking up more and more of our electricity, to the point that big tech companies are exploring nuclear reactors to feed our digital bellies.
But blaming AI is way too easy, says a new report by TRG Datacenters. Many of the other digital-centric activities we love are potentially much worse.
Like just watching YouTube or sitting in a Zoom conference call. An hour bingeing Netflix consumes about 0.12 kilowatt-hours, releasing 42 grams of CO₂ into the atmosphere. An hour on Zoom is more efficient, taking just 0.05 kWh and releasing 17 grams of CO₂.
Here’s the relative environmental impact of common digital activities:
- YouTube or Netflix, 1 hour (HD): ~0.12 kWh → 42 g CO₂. Tied for the dirtiest single activity in the study.
- Text-to-video generation, 6–10 seconds: ~0.05 kWh → 17.5 g CO₂. Roughly the same as an hour-long Zoom call.
- Zoom, 1 hour: ~0.0486 kWh → 17 g CO₂.
- Short email, no attachment: ~0.0133 kWh → 4.7 g CO₂. One email is tiny; billions per day are not.
- AI image generation, 1 image: ~0.003 kWh → 1 g CO₂.
- Voice assistant query (Alexa/Siri/etc.): ~0.0005 kWh → 0.175 g CO₂.
- Google search or AI chatbot prompt: ~0.0003 kWh → 0.105 g CO₂.
- Two Gemini prompts: ~0.00024 kWh → 0.084 g CO₂ total (~0.042 g per prompt).
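To see how the article’s opening 500x comparison falls out of these figures, here is a quick back-of-the-envelope calculation in Python. The per-activity numbers are simply the study’s estimates from the list above; the “AI-heavy workday” tally at the end is my own illustrative assumption, not something from the report.

```python
# Per-activity CO2 estimates (grams) taken from the TRG Datacenters list above.
# Treat them as rough study estimates, not measurements.
ACTIVITY_G_CO2 = {
    "netflix_hour_hd": 42.0,
    "text_to_video_clip": 17.5,
    "zoom_hour": 17.0,
    "short_email": 4.7,
    "ai_image": 1.0,
    "voice_assistant_query": 0.175,
    "search_or_chatbot_prompt": 0.105,
    "gemini_prompt": 0.042,
}

# The headline comparison: one hour of HD streaming vs. two Gemini prompts.
two_prompts = 2 * ACTIVITY_G_CO2["gemini_prompt"]           # 0.084 g
ratio = ACTIVITY_G_CO2["netflix_hour_hd"] / two_prompts     # 500.0
print(f"Netflix hour vs. two Gemini prompts: {ratio:.0f}x more CO2")

# A hypothetical AI-heavy workday: 100 chatbot prompts plus 10 generated images
# still lands at roughly half the footprint of a single hour of HD streaming.
workday = (100 * ACTIVITY_G_CO2["search_or_chatbot_prompt"]
           + 10 * ACTIVITY_G_CO2["ai_image"])
print(f"100 prompts + 10 images: {workday:.1f} g CO2, "
      f"vs. {ACTIVITY_G_CO2['netflix_hour_hd']:.0f} g for an hour of Netflix")
```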
Clearly, asking AI a few questions – even using it consistently through the working day – is not as bad as many other everyday digital activities. There is an AI exception, of course, and that is text-to-video generation, which is a very heavy lift and burns data center energy fast.
(There’s another big factor the study did not look at: training AI models, a massive, power-hungry task that can keep hundreds of thousands of high-end GPUs running for months on end.)
That’s not to say any digital use is inherently green, of course. The tech sector as a whole consumes a huge amount of energy and is a significant contributor to global emissions.
But there are some options.
“The tech sector added roughly 900M tons of CO2 to the atmosphere in the last year, comparable to the annual emissions of Germany,” a TRG Datacenters spokesperson said in an emailed statement. “By the end of 2025, that number is expected to exceed 1.2 billion tons. The issue isn’t whether we use technology (that’s inevitable) but how we power it. Right now, only about 30% of datacenter energy comes from renewables. If we all gather efforts, and get that to 80% or 90%, we would cut the carbon footprint of every digital activity by more than half, without anyone changing their behavior.”
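The spokesperson’s “more than half” claim holds up under a simple sanity check, assuming that a data center’s footprint scales roughly linearly with the share of its electricity that comes from non-renewable sources and that renewables count as near-zero carbon (a simplification that ignores lifecycle emissions):

```python
# Rough sanity check of the "cut the footprint by more than half" claim.
# Simplifying assumption: emissions scale linearly with the non-renewable
# share of data center electricity; renewables are treated as zero-carbon.
def relative_footprint(renewable_share: float) -> float:
    """Footprint relative to a 100% fossil-powered baseline."""
    return 1.0 - renewable_share

today = relative_footprint(0.30)       # ~30% renewables today -> 0.70
at_80 = relative_footprint(0.80)       # 80% renewables -> 0.20
at_90 = relative_footprint(0.90)       # 90% renewables -> 0.10

print(f"At 80% renewables: {1 - at_80 / today:.0%} lower than today")   # ~71%
print(f"At 90% renewables: {1 - at_90 / today:.0%} lower than today")   # ~86%
```

Under that simplification, every activity in the list above would carry roughly 70% to 86% less CO₂, with no change in anyone’s behavior, which is exactly the spokesperson’s point.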
That’s the key: moving to solar, wind, wave and hydro power, plus any other renewable forms of energy. (Iceland famously uses geothermal energy, for example.)
It’s important to start soon: a 2025 Goldman Sachs report says all forms of AI will drive a 165% increase in data center electricity usage by 2030. The more of this power that is renewable, the better.
Apple is one giant tech company that is leading the way. It currently powers all of its corporate operations (offices, retail stores and data centers) with 100% renewable electricity, and has an “Apple 2030” goal of making its entire business, including manufacturing, supply chain and customer use, carbon neutral by the end of the decade.
And hardware advances like Google’s new seventh-generation Ironwood AI chips are both more powerful and more energy-efficient, reducing the power needed per query while satisfying our ever-increasing appetites for generated answers and images.
Which means we just might be able to have our digital cake and eat it too.
