
[Phys-L] Re: infinite sig. figs.



Simply repeating the seemingly nonsensical statement:

The contention that the result of a calculation, with
numbers obtained from measurements, is to be interpreted as a
significant number is practically certain to deceive...

does nothing to aid my understanding. I gave, in my posting, a
current example involving the "result of a calculation, with numbers
obtained from measurements." The community is debating whether an
experimental result that lies outside of the uncertainty of the computed
result possibly signals evidence for "new physics". See my paper in
Physics Letters B523:299-303, 2001 [hep-ph/0106145] for a flavor of the
discussion. Note that I am talking about uncertainty - NOT sig digs.
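The distinction matters in practice: the question is not how many digits to write, but how far the measured value sits from the prediction in units of the combined uncertainty. A minimal sketch, using purely illustrative numbers (not values from the cited paper):

```python
import math

# Illustrative numbers only: a measured result and a theoretical
# prediction, each with its own standard uncertainty.
measured, sigma_meas = 0.0023, 0.0004
predicted, sigma_pred = 0.0010, 0.0003

# Combine independent uncertainties in quadrature, then express the
# discrepancy in units of the combined standard deviation.
sigma_comb = math.sqrt(sigma_meas**2 + sigma_pred**2)
n_sigma = abs(measured - predicted) / sigma_comb
print(f"discrepancy = {n_sigma:.1f} sigma")  # -> discrepancy = 2.6 sigma
```

However many decimal places are printed, it is the 2.6-sigma figure, not the digit count, that carries the evidential weight.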

Regards,
Jack

On Thu, 22 Sep 2005, Strickert, Rick wrote:

I totally don't understand Delury (as quoted).

In his paper, "Computations with approximate numbers", _The Mathematics
Teacher_, Vol. LI, No. 7, Nov., 1958, pp.521-30, Daniel Delury states:

"In any event, the theory of errors provides us with the only usable
tool we have for dealing with errors of measurement. This tells us,
among other things, that the treatment of errors cannot be undertaken as
an exercise in arithmetic. The basic requirement is that the program of
measurement be so arranged as to permit the estimation of a standard
deviation, in terms of which the precision of the measuring process can
be stated." (p.525)
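Delury's prescription can be sketched directly: arrange the measurement program so a standard deviation can be estimated, then state the precision explicitly. A minimal example with hypothetical repeated readings (the numbers are invented for illustration):

```python
import statistics

# Hypothetical repeated measurements of the same quantity.
readings = [9.81, 9.79, 9.82, 9.80, 9.78, 9.83]

mean = statistics.mean(readings)
s = statistics.stdev(readings)           # sample standard deviation
sem = s / len(readings) ** 0.5           # standard error of the mean

# State the precision explicitly rather than encoding it in digit count.
print(f"result = {mean:.3f} +/- {sem:.3f}")
```

The "+/-" figure, not the number of digits written, is what reports the precision of the measuring process.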

"Above all, we must scotch the notion that the precision of an average
or of a single measurement can be judged from the way in which it is
written." (p. 528)

"And after all, what difference does it make if one person runs his
calculations out to, say, two more decimal places than another? As long
as they have not stopped too soon, the basis for a preference between
them is largely aesthetic. Of course, before we can adopt this
indulgent view, we must get rid of the notion, to which we never had any
right, that we are dealing with significant numbers. It is this notion
that has led to the view that too many decimals in the answer imply
deception. As far as deception is concerned, the shoe is entirely on
the other foot. The contention that the result of a calculation, with
numbers obtained from measurements, is to be interpreted as a
significant number is practically certain to deceive...

"To sum up, then, let us keep the significant number where it belongs, as
a convenient convention for writing answers in pure arithmetic. It has
no other use." (p.530)

Rick Strickert
Austin, TX


--
"Trust me. I have a lot of experience at this."
General Custer's unremembered message to his men,
just before leading them into the Little Big Horn Valley
_______________________________________________
Phys-L mailing list
Phys-L@electron.physics.buffalo.edu
https://www.physics.buffalo.edu/mailman/listinfo/phys-l