Brain vs Computer Efficiency

This cluster discusses comparisons between the human brain's computational power, energy efficiency (e.g., 20W vs. megawatts for supercomputers), neuron/synapse complexity, and architectural differences from digital computers like GPUs and LLMs.

➡️ Stable 0.6x · AI & Machine Learning

Comments: 2,990
Years Active: 20
Top Authors: 5
Topic ID: #9633

Activity Over Time

2007: 7
2008: 24
2009: 42
2010: 58
2011: 73
2012: 59
2013: 59
2014: 83
2015: 131
2016: 200
2017: 223
2018: 181
2019: 202
2020: 180
2021: 205
2022: 203
2023: 335
2024: 393
2025: 320
2026: 12

Keywords

AI, AGI, LLM, RL, brain, human brain, neuron, neurons, brains, power, neural, computation, efficient

Sample Comments

pharmakom Apr 15, 2023 View on HN

Human brains are not built from digital circuits - perhaps they have far more compute than we think.

deadbabe May 18, 2024

If human brains are proof you don’t need the energy of a star to do 10^15 operations/second, doesn’t that just mean our understanding of how to build efficient computers is very primitive?
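Taking the figures in this thread at face value, the comparison reduces to energy per operation. A back-of-envelope sketch, where the brain figures (20 W, 10^15 ops/s) come from the comments here, and the accelerator figures (~700 W board power, ~10^15 FP16 FLOP/s, loosely H100-class) are rough assumptions for illustration, not measured values:

```python
# Back-of-envelope energy-per-operation comparison.
# Brain figures are taken from the comments in this cluster; the GPU
# figures are assumed round numbers for a modern datacenter
# accelerator, not measurements.

brain_watts = 20.0
brain_ops_per_s = 1e15          # the 10^15 ops/s figure quoted above

gpu_watts = 700.0               # assumed board power
gpu_ops_per_s = 1e15            # assumed peak FP16 throughput

brain_j_per_op = brain_watts / brain_ops_per_s   # 2e-14 J/op
gpu_j_per_op = gpu_watts / gpu_ops_per_s         # 7e-13 J/op

print(f"brain: {brain_j_per_op:.1e} J/op")
print(f"GPU:   {gpu_j_per_op:.1e} J/op")
print(f"ratio: {gpu_j_per_op / brain_j_per_op:.0f}x")
```

Even under these peak-throughput assumptions, which flatter the GPU, the per-operation gap comes out at only a few tens of times; the real uncertainty is what counts as an "operation" in a brain, which is why the estimates in these comments span many orders of magnitude.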

ChatGTP May 31, 2023

Wouldn't modelling the human brain mean we'd be using less power? We're using brute force to try to get similar results to what the brain does.

Davidzheng Feb 17, 2025

I'm not sure there's a single metric in which you can argue a single human brain has more compute than 1e6 H100s

logicchains Sep 8, 2025

There's no efficient algorithm for simulating a human brain, and you certainly haven't invented one, so you've got absolutely no excuse to act smug about it. LLMs are already within an order of magnitude of the energy efficiency of the human brain; it's probably not possible to make them much more efficient algorithmically.

IgorPartola Nov 14, 2025

It is ultimately a hardware problem. To simplify it greatly, an LLM neuron is a single-input, single-output function. A human brain neuron takes in thousands of inputs and produces thousands of outputs, to the point that some inputs start being processed before they even get inside the cell, by structures on the outside of it. An LLM neuron is an approximation of this. We cannot manufacture a human-level neuron to be small, fast, and energy-efficient enough with our manufacturing capabilities to…
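One clarification worth attaching to the comment above: a standard artificial neuron is many-input, one-output (a weighted sum passed through a nonlinearity), so the asymmetry with biological neurons lies in how much processing happens per input and in the thousands of distinct outputs, not in the fan-in. A minimal sketch, with arbitrary illustrative values:

```python
def artificial_neuron(inputs, weights, bias):
    """One LLM-style unit: a weighted sum over many inputs, squashed
    to a single scalar output by a nonlinearity (ReLU here)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return max(0.0, z)

# Fan-in of 4096 (typical of a transformer hidden layer), but each
# input contributes only one multiply-add -- unlike the dendritic
# pre-processing described in the comment above.
xs = [0.001] * 4096
ws = [0.5] * 4096
print(artificial_neuron(xs, ws, bias=-1.0))
```

The per-input work here is a single multiply; the comment's point is that in a biological neuron each synapse is itself a small, stateful processor, which is what silicon currently can't replicate densely or cheaply.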

ronald_raygun Feb 18, 2024

Sure, computers have lots of transistors, but brains have 10s of billions of neurons and only use 12W of power.

mimentum Apr 1, 2020

Well, considering many humans solved complex problems for thousands of years with just the human brain, at about 20W of power consumption, one could argue that the human brain, or nature, is more efficient at processing than a computer of equal power.

bobsil1 Sep 6, 2017

Human brain uses 10^5 less power than von Neumann architecture, according to Caltech paper from ‘90s.

wearywanderer May 27, 2021

Even human brains, with all their capabilities, get the job done for 20 watts (or something in that order of magnitude.) Who even knows how much power we'd need for equivalent results from the best specialized silicon we have? I think you might burn a hundred gigawatts and still not get comparable results. Something is clearly wrong with our approach. It should be possible to do much more with much less, but there is some piece of the puzzle we're missing.