
Re: resolution vs. precision



I would distinguish between the "readout resolution" and the "measurement
resolution" (accuracy?) of an instrument/measurement. This is particularly
relevant with digital readouts, where the final voltage developed within the
instrument may be accurately measured and reported to more digits than are
"significant" to the overall measurement, because of noise within the
instrument/process, etc. There is a naive temptation to assume that all
digits displayed on a digital meter readout are "significant". E.g., the
ohmmeter readouts of a digital multimeter may in this sense be "over
designed". The specs of a decent multimeter will state both "Resolution"
and "Accuracy". Unscrupulous manufacturers will not address this issue;
they report many worthless digits and brag about high resolution.
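
A rough sketch of that point in Python (the numbers are made up for
illustration): a meter whose display resolves milliohms, but whose internal
noise is at the 0.1 ohm level, shows readings whose last two digits are
worthless.

    import random

    TRUE_R = 99.873        # "true" resistance in ohms (assumed value)
    NOISE_SD = 0.1         # instrument/process noise, in ohms (assumed)
    READOUT_STEP = 0.001   # readout resolution: display resolves milliohms

    # Repeated "measurements": every digit gets displayed, but the noise
    # makes the last two displayed digits meaningless.
    for _ in range(5):
        reading = random.gauss(TRUE_R, NOISE_SD)
        displayed = round(reading / READOUT_STEP) * READOUT_STEP
        print(f"{displayed:.3f} ohms")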

In the analog world this is analogous to assuming no warping etc. in a
meter stick and estimating the reading to a few tenths of the smallest
division. In well designed analog instruments the human error in "reading
the dial" is the dominant error; in digital readouts this reading error is
zero, and instrument errors dominate.

Even in the calculational world, this is somewhat analogous to taking all
digits reported by a digital calculator as "significant", regardless of the
precision of the inputs to the calculation.
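
A similar sketch, again in Python and again with made-up numbers: a radius
known to only three significant figures yields an area that the "calculator"
reports to full machine precision, only three digits of which are justified.

    import math

    r = 2.13                   # radius known to three significant figures
    area = math.pi * r ** 2    # full machine precision is reported anyway

    print(area)                # 14.2530917... -- most digits not significant
    print(f"{area:.3g}")       # 14.3 is about all the input precision supports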

Bob

Bob Sciamanda (W3NLV)
Physics, Edinboro Univ of PA (em)
trebor@velocity.net
http://www.velocity.net/~trebor