what good is "percentage error"?

It's time to stir the phys-l pot again.

Today I was looking through a commercial high-school-level physics
lab manual, and came across the instruction to perform an experiment
once and then determine the "percentage error" by calculating

(accepted - measured)/accepted * 100%.
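
For concreteness, that recipe amounts to this one-line calculation.
Here is a minimal Python sketch; the numbers are made up purely for
illustration:

    # The manual's "percentage error" from a single measurement.
    accepted = 9.81   # accepted value, e.g. g in m/s^2 (hypothetical)
    measured = 9.67   # one student's single measurement (hypothetical)
    percentage_error = (accepted - measured) / accepted * 100
    print(f"percentage error = {percentage_error:.1f}%")   # about 1.4%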

Why is this quantity even being taught? How can it be used? It seems
to me that it is useless at best and misleading at worst, and should
never be used.

It does not measure the precision of a measurement in any way, since
it is based on a single measurement. A small error could be due to a
high-quality experiment, or it could be due to luck. Only multiple
measurements can establish the precision of the experiment. (I include
the internal statistics from a curve fit as "multiple measurements.")
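
By contrast, repeated measurements yield a mean and a standard error
that actually characterize precision. A minimal Python sketch with
hypothetical data:

    import statistics

    trials = [9.67, 9.88, 9.74, 9.91, 9.79]   # repeated runs (hypothetical)
    mean = statistics.mean(trials)
    s = statistics.stdev(trials)                # sample standard deviation
    sem = s / len(trials) ** 0.5                # standard error of the mean
    print(f"mean = {mean:.2f}, s = {s:.2f}, SEM = {sem:.2f}")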

It does not measure systematic error either, again because it is
based on a single measurement. The deviation from an accepted value
cannot isolate the systematic component, since random error is
present as well.
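
A quick simulation illustrates why. With an assumed bias of 0.05 and
an assumed random scatter of 0.10 (both values hypothetical), any one
deviation from the accepted value is dominated by luck rather than by
the systematic error:

    import random

    accepted = 9.81
    bias = -0.05      # assumed systematic error, unknown to the student
    noise = 0.10      # assumed random scatter (one standard deviation)

    random.seed(1)
    for _ in range(3):   # three single-shot "percentage errors"
        measured = accepted + bias + random.gauss(0, noise)
        print(f"{(accepted - measured) / accepted * 100:+.1f}%")
    # The scatter (0.10) swamps the bias (0.05), so each run gives a
    # different answer even though the systematic error never changes.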

It does, however, imply to students that it is a measure of the
quality of the experiment, or of their skill. Since the "percentage
error" measures neither of these, it is a misleading quantity.

Even after discussing random errors, standard deviations, and the
like, I've had to work to keep many students in freshman-level
university physics from including this calculation in their lab
reports as a measure of their experimental success. So I know the
quantity is being taught and emphasized out there.

Why? What good is it? Am I missing something?

JEG

==================
John E. Gastineau (304) 296-1966
900 B Ridgeway Ave. gastineau@badgerden.com
Morgantown, WV 26505
www.badgerden.com/~gastineau