
Re: [Phys-l] Significant figures -- again



On 03/09/2012 07:56 PM, sclarkphd@mindspring.com wrote:
> Many of my students write every digit from their calculator, which
> most probably indicates that they don't understand uncertainty in
> measuring. I don't think this is harmless; students need to indicate
> in their responses that there is a limit to the precision of their
> answers. I don't take off points for this, but I do comment on it.


Possibly constructive suggestion: Treat this as three separate
problems:
a) not understanding uncertainty in measurement
b) writing down loads of insignificant digits
c) not indicating the uncertainty in the measurements

1) In particular, there are several ways in which (c) is different
from (b). Most importantly, we should never "comment" or do
anything else that would lead students to associate the number
of digits with the overall uncertainty. That's because the
number of digits might (but might not) tell you the amount of
roundoff error ... and the roundoff error might (but might not
and should not) be the dominant contribution to the overall error.

As another way of seeing the distinction between (b) and (c),
suppose the desired answer is 1.234 ± 1%.
-- If the student writes 1.234 ± 1% that's fine.
-- If the student writes 1.2343 ± 1% that's fine.
-- If the student writes 1.2343066051 ± 1% that's maybe a
little bit eccentric, but since the uncertainty was clearly
indicated I don't see any real grounds for complaint.
-- If the student writes simply 1.234 that's not OK. The
problem is not the number of digits that are present; the
problem is the "± 1%" indication that is absent.
-- If the student writes simply 1.2343 that's not OK. It's
not worse than 1.234, but it's not better, because the
important "± 1%" indication is still missing.
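If it helps to make the point concrete, here is a minimal sketch in
Python (the function name report() and the sample numbers are mine,
purely for illustration). It prints the same result with different
numbers of digits; in every case the stated "± 1%" is what carries
the uncertainty information, and the digit count carries none.

    # Report a value together with its explicitly stated relative
    # uncertainty.  The number of printed digits is a formatting
    # choice; the stated uncertainty is what conveys the error bar.
    def report(value, rel_unc_percent, digits):
        return f"{value:.{digits}f} ± {rel_unc_percent}%"

    x = 1.2343066051        # raw result off the calculator
    for digits in (3, 4, 10):
        print(report(x, 1, digits))

    # Output:
    #   1.234 ± 1%
    #   1.2343 ± 1%
    #   1.2343066051 ± 1%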

To say the same thing the other way: The student who is writing
down all the digits from the calculator is probably not associating
the number of digits with the uncertainty, and this is a good thing!
Don't mess this up!

There are many, many reasons why people should be allowed to write
down numbers without being forced to imply anything about the
uncertainty, let alone the significance. Very commonly I write
down raw data with no indication of uncertainty, because it will
be many weeks or months before I actually know the uncertainty.

Along the same lines, consider numbers such as 2.54 cm per inch
or 299792.458 km per light-second. Counting the decimal places
tells you /nothing/ about the uncertainty. Those numbers are
exact. No roundoff, no uncertainty of any kind.

If the students don't see those numbers as expressing any implied
uncertainty, that's a good thing! Don't mess this up!


2) As a separate issue, (a) is different from (b) and/or (c).
That is, there are issues surrounding the _idea_ of uncertainty
that are separate from how you should express the uncertainty.

One way to address this is to ask the student to account for the
various sources of uncertainty. This is a teachable moment, on
the topic of how to combine uncertainties, and on the topic of
how to plan an experiment to make good use of resources.

Imagine a situation where the uncertainty in the raw measurement
is 1% and the uncertainty contributed by roundoff error is 0.00001%.
Then the combined uncertainty is approximately 1.00000000005%,
since we expect the contributions to add in quadrature.
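As a quick numerical check, here is a tiny sketch in Python
(math.hypot computes sqrt(a^2 + b^2), which is exactly the
quadrature sum in question; the variable names are mine):

    import math

    raw      = 1.0     # raw-measurement uncertainty, in percent
    roundoff = 1e-5    # roundoff contribution, in percent

    combined = math.hypot(raw, roundoff)
    print(combined)    # approximately 1.00000000005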

Scenario #1: Suppose we wanted to reduce the overall uncertainty.
Writing one more digit reduces the roundoff error by a factor of
ten, so the uncertainty budget goes from
1% & 0.00001% = 1.00000000005%
to
1% & 0.000001% = 1.0000000000005%
which is a rather small improvement, about five parts in 10^13
of the nominal value.

Note that I am using "&" as a mathematical operator meaning
"addition in quadrature". That is, a & b is by definition
equal to sqrt(a^2 + b^2). Homework: Prove that "&" is
associative and commutative, and that multiplication distributes
over "&".

To the student I would say, if it took you only 1 second to write
down the extra digit, your return on investment is lousy. It's
not a good use of your time, especially when compared to the
alternative:

Scenario #2: Suppose you spent a couple of minutes making the raw
measurement more accurately, so that the error was only 0.9% rather
than 1%. This costs you a couple of orders of magnitude more time
and effort compared to scenario #1, but it gives you about 9 orders
of magnitude more payoff. It's a much better use of your time.
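To put rough numbers on that comparison, here is a sketch in Python.
The 1 second and couple-of-minutes effort figures are just the ones
assumed above; everything else follows from adding in quadrature.

    import math

    # Scenario #1: write one more digit (roundoff 1e-5% --> 1e-6%)
    payoff_1 = math.hypot(1.0, 1e-5) - math.hypot(1.0, 1e-6)  # ~5e-11 percent

    # Scenario #2: improve the raw measurement from 1% to 0.9%
    payoff_2 = math.hypot(1.0, 1e-5) - math.hypot(0.9, 1e-5)  # ~0.1 percent

    cost_1, cost_2 = 1.0, 120.0    # seconds of effort, roughly

    print(payoff_2 / payoff_1)   # ~2e9: about 9 orders of magnitude more payoff
    print(cost_2 / cost_1)       # ~1e2: a couple of orders of magnitude more cost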

Let's be clear, dear student: I don't have a problem with reducing
the roundoff error by writing extra digits. Really I don't. The
time spent writing down those digits is trivial compared to the time
we have already spent discussing the topic.

The point is not that the extra digits hurt, because they don't.
The point is that they just don't help. If you want to improve
the overall uncertainty, you need to look elsewhere. More generally,
you need to understand /why/ those digits don't help, and why you
need to pay more attention to other sources of uncertainty.