AI Energy Consumption

This cluster debates the energy usage and environmental impact of AI models and LLMs, comparing it to activities like Netflix streaming, driving, and human cognition, and questioning whether that cost is excessive or negligible relative to the benefits.

➡️ Stable 1.6x AI & Machine Learning
Comments: 3,646
Years Active: 20
Top Authors: 5
Topic ID: #2927

Activity Over Time

2007: 2 · 2008: 12 · 2009: 19 · 2010: 18 · 2011: 18 · 2012: 30 · 2013: 25 · 2014: 28 · 2015: 36 · 2016: 73
2017: 75 · 2018: 96 · 2019: 195 · 2020: 254 · 2021: 293 · 2022: 267 · 2023: 413 · 2024: 669 · 2025: 1,043 · 2026: 82

Keywords

e.g, IT, LLM, AWS, on.com, bsky.app, AI, simonwillison.net, www.iea, CO2, energy, co2, ai, token, energy use, electricity, llms, power, watt, google

Sample Comments

bcye Dec 3, 2025 View on HN

Reminder that LLMs only(?) consume energy on the order of a few seconds of Netflix [1].

[1]: https://bsky.app/profile/simonwillison.net/post/3m6qdf5rffs2...
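
A rough back-of-envelope check of that comparison. The per-query and per-hour-of-streaming figures below are illustrative assumptions, not numbers taken from the linked post:

# Back-of-envelope: how many seconds of streaming equal one LLM query?
# Both figures are illustrative assumptions, not measurements.
ENERGY_PER_QUERY_WH = 0.3        # assumed energy for one chat completion (Wh)
STREAMING_KWH_PER_HOUR = 0.08    # assumed energy per hour of video streaming (kWh)

streaming_wh_per_second = STREAMING_KWH_PER_HOUR * 1000 / 3600
equivalent_seconds = ENERGY_PER_QUERY_WH / streaming_wh_per_second
print(f"one query ~= {equivalent_seconds:.0f} s of streaming")  # ~14 s with these inputs

With these inputs a query lands in the tens-of-seconds-of-streaming range, the same ballpark as the comment's claim.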

zepolen Aug 31, 2022 View on HN

Be aware of the energy you are wasting; a modern AI uses much less power than a human.

dferince Apr 23, 2017 View on HN

Computers use a lot of energy (that 1,000-watt power supply). Consider choosing more environmentally friendly options. Use a lower-level language like C instead of Python or JavaScript for a program that will be run by many people and machines; an order of magnitude or two of energy savings is possible. Use and build command-line interfaces instead of conversational interfaces. Google recently said, as to why they built TPUs, "If we considered a scenario where people use Google voice sear

fragmede Jan 4, 2026 View on HN

Running ollama to compute inference uses energy that wouldn't have been used if you weren't running ollama. There's no free lunch here.

tgsovlerkhgsel Apr 15, 2024 View on HN

The quality concerns are absolutely justified. The complaints about energy use sound like unfounded, extremely far-fetched arguments just used by people who don't like LLMs for other reasons. The inference energy cost is likely on the same order of magnitude as the computer + screen used to read the answer (higher wattage, but much shorter time to generate the response than to formulate the request and read it). The training energy cost is significant only if we ignore that it is used
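
A sketch of that order-of-magnitude argument. The power draws and durations below are placeholder assumptions, not measurements:

# Generation draws more power but for far less time than writing the prompt
# and reading the answer. All numbers are assumptions.
GPU_SHARE_POWER_W = 400   # assumed slice of an inference server serving one request
GENERATION_TIME_S = 10    # assumed time to generate the response
DEVICE_POWER_W = 50       # assumed laptop + screen draw
READ_WRITE_TIME_S = 120   # assumed time to formulate the request and read the answer

inference_wh = GPU_SHARE_POWER_W * GENERATION_TIME_S / 3600
device_wh = DEVICE_POWER_W * READ_WRITE_TIME_S / 3600
print(f"inference ~{inference_wh:.1f} Wh vs reading device ~{device_wh:.1f} Wh")

With these assumptions the two sides come out at roughly 1 Wh each, i.e. the same order of magnitude.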

stavros Jul 5, 2024 View on HN

Yes. Cars use more energy than LLMs; we don't mind those. What does "a lot of energy" even mean? It's meaningless. Do they use too much energy for their usefulness? If they did, people wouldn't be using them.
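
For scale, a hedged comparison of driving versus querying. The gasoline energy content, fuel economy, and per-query energy are all illustrative assumptions:

# Rough scale of "cars use more energy than LLMs". Figures are assumptions.
GASOLINE_KWH_PER_GALLON = 33.7   # approximate energy content of a US gallon of gasoline
FUEL_ECONOMY_MPG = 30            # assumed fuel economy
QUERY_WH = 0.3                   # assumed energy per LLM query

kwh_per_mile = GASOLINE_KWH_PER_GALLON / FUEL_ECONOMY_MPG
queries_per_mile = kwh_per_mile * 1000 / QUERY_WH
print(f"one mile of driving ~= {queries_per_mile:,.0f} queries")  # ~3,700 with these inputs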

aprdm Oct 9, 2025 View on HN

What about energy consumption as a proxy for it?

insane_dreamer Dec 27, 2025 View on HN

it does not use "some" resources
it uses a fuck ton of resources [0]
and instead of reducing energy production and emissions we will now be increasing them, which, given current climate prediction models, is in fact "killing the planet"

[0] https://www.iea.org/reports/energy-and-ai/energy-supply-for-...

xmcqdpt2 Mar 8, 2023 View on HN

I doubt it, frankly. Computation consumes a lot of energy, true, but it is dwarfed by how much energy we use in transportation and food production. Energy use per capita in most of the Global North is about 75,000 kWh per year (https://ourworldindata.org/grapher/per-capita-energy-use). That's like the average person running 27 NVIDIA A100s at max capacity at all times!
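
The 27-A100 figure checks out arithmetically if one assumes roughly 320 W of continuous draw per card, a value between the A100 PCIe (250 W) and SXM (400 W) ratings:

# Checking the "27 A100s" arithmetic. Per-card draw is an assumed value
# between the A100 PCIe (250 W) and SXM (400 W) ratings.
PER_CAPITA_KWH_PER_YEAR = 75_000
HOURS_PER_YEAR = 24 * 365
A100_POWER_W = 320

average_draw_w = PER_CAPITA_KWH_PER_YEAR * 1000 / HOURS_PER_YEAR
equivalent_cards = average_draw_w / A100_POWER_W
print(f"average draw ~{average_draw_w:.0f} W ~= {equivalent_cards:.0f} A100s at full tilt")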

yen223 Aug 10, 2025 View on HN

Probably not as much as you think: https://www.sustainabilitybynumbers.com/p/ai-energy-demand

You are better off worrying about your car use and your home heating/cooling efficiency, both of which are significantly worse for energy use.