AI Energy Consumption
The cluster debates the energy usage and environmental impact of AI models and LLMs, comparing it to activities like Netflix streaming, driving, and human cognition, and questioning whether that usage is excessive or negligible relative to the benefits.
Sample Comments
Reminder that LLMs only(?) consume energy on the order of a few seconds of Netflix[1].
[1]: https://bsky.app/profile/simonwillison.net/post/3m6qdf5rffs2...
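The "few seconds of Netflix" claim can be sanity-checked with rough arithmetic. The per-prompt and per-streaming-hour figures below are illustrative assumptions, not measurements taken from the linked post:

```python
# Rough sanity check of the "a few seconds of Netflix" comparison.
# Both figures are illustrative assumptions, not measured values.
PROMPT_WH = 0.3            # assumed energy for one LLM prompt, in watt-hours
STREAM_WH_PER_HOUR = 80.0  # assumed energy for one hour of video streaming

seconds_of_netflix = PROMPT_WH / STREAM_WH_PER_HOUR * 3600
print(f"One prompt ≈ {seconds_of_netflix:.0f} seconds of streaming")
```

With these assumptions a prompt lands in the ten-to-twenty-second range, i.e. the same order of magnitude the comment asserts; doubling or halving either input keeps it well under a minute of streaming.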
Be aware of the energy you are wasting: a modern AI uses much less power than a human.
Computers use a lot of energy (that 1000-watt power supply). Consider choosing more environmentally friendly options. Use a lower-level language like C instead of Python or JavaScript for a program that will be run by many people and machines; an order of magnitude or two of energy savings is possible. Use and build command-line interfaces instead of conversational interfaces. Explaining why they built TPUs, Google recently said: "If we considered a scenario where people use Google voice sear
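The "order of magnitude or two" claim about language choice can be illustrated with a crude proxy: energy scales roughly with CPU time, and moving a hot loop from the Python interpreter into C-implemented code shows the kind of gap the commenter means. This is a sketch, not a proper energy measurement:

```python
import time

N = 5_000_000

# Pure-Python loop: every iteration runs in the interpreter.
start = time.perf_counter()
total = 0
for i in range(N):
    total += i
py_time = time.perf_counter() - start

# Same computation delegated to C-implemented builtins.
start = time.perf_counter()
total_c = sum(range(N))
c_time = time.perf_counter() - start

assert total == total_c  # identical result, very different cost
print(f"interpreter loop: {py_time:.3f}s, C-backed sum: {c_time:.3f}s, "
      f"speedup ≈ {py_time / c_time:.1f}x")
```

On a typical machine the C-backed version is several times faster; a fully compiled C program doing the same arithmetic widens the gap further, which is where the one-to-two-orders-of-magnitude figure comes from.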
Running ollama to compute inference uses energy that wouldn't have been used if you weren't running ollama. There's no free lunch here.
The quality concerns are absolutely justified. The complaints about energy use sound like unfounded, extremely far-fetched arguments just used by people who don't like LLMs for other reasons. The inference energy cost is likely on the same order of magnitude as the computer + screen used to read the answer (higher wattage, but much shorter time to generate the response than to formulate the request and read it). The training energy cost is significant only if we ignore that it is used
Yes. Cars use more energy than LLMs, and we don't mind those. What does "a lot of energy" even mean? It's meaningless. Do they use too much energy for their usefulness? If they did, people wouldn't be using them.
What about energy consumption as a proxy for it?
It does not use "some" resources; it uses a fuck ton of resources[0], and instead of reducing energy production and emissions we will now be increasing them, which, given current climate prediction models, is in fact "killing the planet".
[0] https://www.iea.org/reports/energy-and-ai/energy-supply-for-...
I doubt it, frankly. Computation consumes a lot of energy, true, but it is dwarfed by how much energy we use in transportation and food production. Energy use per capita in most of the Global North is about 75,000 kWh per year: https://ourworldindata.org/grapher/per-capita-energy-use
That's like the average person running 27 NVIDIA A100s at max capacity at all times!
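The "27 A100s" figure above checks out arithmetically. The only assumption below is the per-GPU wattage: the A100's TDP is 250 W (PCIe) or 400 W (SXM), and a mid value of about 315 W reproduces the comment's number:

```python
# Sanity-check: 75,000 kWh/year expressed as continuously running A100 GPUs.
HOURS_PER_YEAR = 8760
per_capita_kwh = 75_000  # per-capita annual energy use, from the comment

avg_power_w = per_capita_kwh * 1000 / HOURS_PER_YEAR  # average draw in watts

# A100 TDP is 250 W (PCIe) or 400 W (SXM); 315 W is an assumed mid value.
a100_w = 315
gpus = avg_power_w / a100_w
print(f"average draw ≈ {avg_power_w:.0f} W ≈ {gpus:.0f} A100s at full load")
```

That works out to roughly 8.6 kW of continuous draw, or about 27 GPUs at the assumed wattage, matching the comment.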
Probably not as much as you think: https://www.sustainabilitybynumbers.com/p/ai-energy-demand
You are better off worrying about your car use and your home heating/cooling efficiency, both of which are significantly worse for energy use.