Analog vs Digital Computing
The cluster centers on debates about the relative strengths, limitations, and philosophical differences between analog and digital computing, including precision, noise, scalability, and why digital dominates despite analog's closer ties to physical reality.
Sample Comments
If analog computers are superior, why is everything digital?
Neat, but... this "analog" thing's built with a lot of digital circuitry
Digital is just a special case within analog.
Please explain where analog computation has a benefit over digital that outweighs its numerous disadvantages.
Everything is analog when you look at a small enough time scale. "Digital" is an abstraction on top of analog, not a substitute.
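This comment is the crux of the cluster, and it can be illustrated directly. A minimal sketch (assuming NumPy; the voltage levels, noise figure, and threshold are made-up parameters): the wire carries a continuous, noisy voltage, and "digital" is just the decision to threshold it back into clean 0/1 levels.

```python
import numpy as np

rng = np.random.default_rng(0)

# An ideal digital bitstream, represented as analog voltage levels (0 V / 1 V).
bits = rng.integers(0, 2, size=16)
ideal = bits.astype(float)

# The physical signal is analog: the clean levels plus continuous noise.
noisy = ideal + rng.normal(0.0, 0.15, size=ideal.shape)

# The "digital" abstraction: threshold the continuous voltage back to 0/1.
recovered = (noisy > 0.5).astype(int)

print("sent:      ", bits)
print("recovered: ", recovered)
print("bit errors:", int(np.sum(bits != recovered)))
```

As long as the noise stays well inside the threshold margin, the abstraction holds and the recovered bits match the sent bits exactly.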
This article is bullshit. It fails to mention the many limitations of analog hardware.

> Analog signals are continuous waveforms that vary smoothly over time, capturing every nuance of the original sound.

> The continuous nature of analog signals means they can theoretically capture an infinite amount of detail. When you play an analog record, for example, the sound you hear is a continuous representation of the original performance.

Analog signals get corrupted by noise everywhere.
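The noise objection in this comment can be made concrete. A small simulation sketch (assuming NumPy; the stage count and noise level are arbitrary): both paths pick up the same noise at every copy stage, but the digital path regenerates clean levels each time, so its errors do not accumulate the way the analog path's do.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = np.sign(rng.standard_normal(1000))  # +/-1 "bits" stored as analog levels
noise_per_stage = 0.2

analog = signal.copy()
digital = signal.copy()
for stage in range(50):
    # Each copy/transmission stage adds independent noise to both paths.
    analog = analog + rng.normal(0, noise_per_stage, analog.shape)
    digital = digital + rng.normal(0, noise_per_stage, digital.shape)
    # The digital path snaps back to clean +/-1 levels; the analog path cannot.
    digital = np.sign(digital)

analog_errors = np.mean(np.sign(analog) != signal)
digital_errors = np.mean(digital != signal)
print(f"analog path error rate after 50 stages:  {analog_errors:.1%}")
print(f"digital path error rate after 50 stages: {digital_errors:.1%}")
```

After 50 stages the analog path has accumulated roughly sqrt(50) times the per-stage noise and flips a large fraction of its values, while the digital path remains essentially error-free.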
How do you figure? An analog impression directly mirrors reality; digital is just an approximation. So digital seems inherently inferior.
That seems so awfully analogue... :-/
Reality is infinitely analog, and therefore digital will only ever be an approximation.
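Several comments frame digital as "just an approximation," which is true but misses that the approximation error is controllable. A quick sketch (assuming NumPy; the sine frequency, sample rate, and bit depths are arbitrary) showing quantization SNR improving by roughly 6 dB per added bit:

```python
import numpy as np

# A "continuous" signal sampled finely: one second of a 440 Hz sine.
t = np.linspace(0, 1, 48_000, endpoint=False)
x = np.sin(2 * np.pi * 440 * t)

for bits in (4, 8, 16):
    levels = 2 ** bits
    # Uniform quantization to `bits` of resolution over [-1, 1].
    q = np.round((x + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1
    err = x - q
    snr_db = 10 * np.log10(np.mean(x**2) / np.mean(err**2))
    print(f"{bits:2d} bits -> SNR ~ {snr_db:.1f} dB")
```

At 16 bits the quantization error is around 98 dB below the signal, far beneath the noise floor of any practical analog medium.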
Unfortunately, that's entirely analog. My goal was to do digital computing, with all of its reliability and predictability.