Brain vs Computer Efficiency
This cluster covers comparisons of the human brain's computational power and energy efficiency (e.g., ~20 W versus megawatts for supercomputers) against digital computers, along with neuron/synapse complexity and architectural differences from GPUs and LLMs.
Sample Comments
Human brains are not built from digital circuits - perhaps they have far more compute than we think.
If human brains are proof you don’t need the energy of a star to do 10^15 operations/second, doesn’t that just mean our understanding of how to build efficient computers is very primitive?
Wouldn't modelling the human brain mean we'd be using less power? We're using brute force to try to get similar results to what the brain does.
I'm not sure there's a single metric in which you can argue a single human brain has more compute than 1e6 H100s
There's no efficient algorithm for simulating a human brain, and you certainly haven't invented one, so you've got absolutely no excuse to act smug about it. LLMs are already within an order of magnitude of the energy efficiency of the human brain; it's probably not possible to make them much more efficient algorithmically.
It is ultimately a hardware problem. To simplify greatly, an LLM neuron is a single-input, single-output function. A human brain neuron takes in thousands of inputs and produces thousands of outputs, to the point that some inputs start being processed before they even get inside the cell, by structures on the outside of it. An LLM neuron is an approximation of this. We cannot manufacture a human-level neuron small, fast, and energy-efficient enough with our current manufacturing capabilities.
Sure, computers have lots of transistors, but brains have tens of billions of neurons and only use 12 W of power.
Well, consider that many humans solved complex problems for thousands of years with just the human brain for power, at about 20 W of power consumption. One could argue that the human brain, or nature, is more efficient at processing than a computer of equal power.
The human brain uses 10^5 times less power than a von Neumann architecture, according to a Caltech paper from the '90s.
Even human brains, with all their capabilities, get the job done for 20 watts (or something in that order of magnitude). Who even knows how much power we'd need for equivalent results from the best specialized silicon we have? I think you might burn a hundred gigawatts and still not get comparable results. Something is clearly wrong with our approach. It should be possible to do much more with much less, but there is some piece of the puzzle we're missing.
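A quick way to sanity-check the energy figures quoted above is to put brain and GPU on a common ops-per-joule scale. The sketch below is purely illustrative: the brain figures (~20 W, ~10^15 operations/second) come straight from the comments, while the GPU figures (roughly 700 W and roughly 10^15 dense FP16 FLOP/s for an NVIDIA H100) are assumed ballpark specs, and an "operation" is not a directly comparable unit between neurons and transistors.

```python
# Illustrative ops-per-joule comparison using the figures cited in the
# comments above; GPU numbers are assumed approximations, not measurements.

BRAIN_POWER_W = 20        # ~20 W brain power, as quoted in the comments
BRAIN_OPS_PER_S = 1e15    # "10^15 operations/second" claim from the comments

GPU_POWER_W = 700         # assumed H100 board power (approximate TDP)
GPU_OPS_PER_S = 1e15      # assumed H100 dense FP16 throughput (approximate)

brain_ops_per_joule = BRAIN_OPS_PER_S / BRAIN_POWER_W
gpu_ops_per_joule = GPU_OPS_PER_S / GPU_POWER_W

print(f"Brain: {brain_ops_per_joule:.1e} ops/J")                 # ~5e13
print(f"GPU:   {gpu_ops_per_joule:.1e} ops/J")                   # ~1.4e12
print(f"Ratio: {brain_ops_per_joule / gpu_ops_per_joule:.0f}x")  # ~35x
```

Under these assumptions the brain comes out a few tens of times more efficient per joule, far short of the 10^5 factor cited from the Caltech paper; the gap depends entirely on what one counts as a single "operation" in a neuron versus a digital circuit, which is exactly the point of contention in the comments above.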