
Re: resolution vs. precision



In 1966 I got my first digital voltmeter. It had limited resolution,
with a least count of one microvolt. When shorted out and zeroed, the
Nixies stayed fixed on all zeros with no jitter. I conjectured that
measurement of the voltage of any source that was much quieter than
the meter's intrinsic noise could be improved by adding a noise
source to the input and then treating the readings (with their newly
added jitter) statistically, averaging a few hundred readings to
obtain a reading to one more place than the display showed. I never
reduced this "Noise Assisted Digital Voltmeter" to practice, but I
did do the math as an exercise. I now know that I could have made a
higher-precision measurement that way than by using the meter in the
more obvious manner.
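What Leigh describes is what we would now call dithering. As a rough
illustration only (the voltages, noise level, and reading count below
are hypothetical, not from the original meter), here is a short Python
sketch: it quantizes a source to a one-microvolt least count, once with
no added noise and once with roughly one LSB of Gaussian dither, and
shows that averaging a few hundred dithered readings recovers a value
between the display's steps.

    import random

    LSB = 1e-6              # meter least count: one microvolt
    TRUE_V = 12.3456789e-6  # hypothetical source voltage, between steps
    NOISE_RMS = 1e-6        # assumed dither amplitude, about one LSB RMS
    N_READINGS = 400        # "a few hundred readings"

    def read_meter(v, noise_rms=0.0):
        """One reading: add dither noise, then quantize to the least count."""
        noisy = v + random.gauss(0.0, noise_rms)
        return round(noisy / LSB) * LSB

    # Without dither the display never moves, so averaging gains nothing.
    plain = read_meter(TRUE_V)

    # With dither, averaging many quantized readings recovers sub-LSB detail.
    dithered = sum(read_meter(TRUE_V, NOISE_RMS)
                   for _ in range(N_READINGS)) / N_READINGS

    print(f"true value      : {TRUE_V * 1e6:.4f} uV")
    print(f"plain reading   : {plain * 1e6:.4f} uV")
    print(f"dithered average: {dithered * 1e6:.4f} uV")

With these assumed numbers the plain reading sits on 12 uV forever,
while the dithered average lands within a few hundredths of a microvolt
of the true value, i.e. roughly one extra displayed digit.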

An almost unrelated curiosity: why do some sources persist in
calling almost any high-input-impedance DVM a VTVM?

Leigh