Precision vs Accuracy
The cluster discusses the distinction between precision and accuracy in scientific measurements, with commenters criticizing overly precise claims that ignore large error margins, significant figures, and methodological limitations.
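The distinction the cluster keeps circling can be made concrete in a few lines: precision is the spread of repeated measurements, accuracy is their offset from the true value. Below is a minimal sketch; the true value, bias, and noise levels are illustrative assumptions, not figures from the discussion.

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 51.55  # hypothetical true length, in mm

def measure(bias, noise, n=1000):
    """Simulate n readings with a systematic offset (accuracy error)
    and random scatter (precision error)."""
    return [TRUE_VALUE + bias + random.gauss(0, noise) for _ in range(n)]

# Precise but inaccurate: tiny spread, large systematic offset.
precise = measure(bias=2.0, noise=0.01)
# Accurate but imprecise: readings scatter widely around the true value.
accurate = measure(bias=0.0, noise=1.0)

for name, xs in [("precise/inaccurate", precise), ("accurate/imprecise", accurate)]:
    print(f"{name}: mean={statistics.mean(xs):.3f}  stdev={statistics.stdev(xs):.3f}")
```

The first series has a small standard deviation but a mean far from the true value; the second has a large standard deviation but a mean close to it, which is the point several commenters below are making to each other.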
Sample Comments
... isn't our error margin much larger than this?
What's the point in measuring it with high precision if it's not accurate?
Measurement errors of course...
It sounds like you're reading that as a margin of error. In some places they measured a factor of 10; in other places, a factor of 1000.
You are confusing precision with accuracy
Assuming we are talking about software, error between measurements is a direct function of the device you use to measure, which is itself close to perfect. Even in the example you give, the measurement should be done n times, each time reporting the exact result found, like 51.0, 51.9, 51.95, etc., even if the decimals are beyond the smallest interval of your ruler: take enough of them and you can get closer to the actual length, which may be 51.55345 and which you would never have been able to measure directly.
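This averaging argument is easy to demonstrate numerically. A minimal sketch, assuming a ruler tick of 0.5 mm and a random read-off error of 0.3 mm (both made-up values): each reading is quantized to the tick, yet the mean of many readings converges toward the commenter's 51.55345. Note the caveat that this only works because the random error is comparable to the tick size; pure quantization with no noise would not average out.

```python
import random
import statistics

random.seed(1)
TRUE_LENGTH = 51.55345   # true length from the comment's example
TICK = 0.5               # assumed smallest ruler interval (mm)
NOISE = 0.3              # assumed random read-off error; must be roughly
                         # comparable to TICK for the dithering effect to work

def reading():
    """One measurement: true length plus random error, rounded to the tick."""
    noisy = TRUE_LENGTH + random.gauss(0, NOISE)
    return round(noisy / TICK) * TICK

# The mean's error shrinks roughly as 1/sqrt(n), well below the tick size.
for n in (10, 100, 10_000):
    mean = statistics.mean(reading() for _ in range(n))
    print(f"n={n:>6}: mean={mean:.5f}  error={abs(mean - TRUE_LENGTH):.5f}")
```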
Would higher precision of measurements help here?
Ah ok, I guess I just overestimated the level of precision they were working with.
That's precision, not accuracy.
It's an embarrassing problem to have a system with "accuracy too high to measure"!