
Re: [Phys-L] points don't have error bars (distributions do)
On 10/26/2012 06:40 AM, Folkerts, Timothy J wrote:

> I was thinking of a slightly different issue, I suspect. I have no
> problems with your additional comments about errors associated with
> a specific measurement made under a specific set of circumstances. I
> would even say it is a hallmark of an "expert" to naturally think
> about uncertainty and to be able to estimate the magnitude of that
> uncertainty in common situations (without needing to collect 1000
> data points to estimate the true distribution).

If/when the expert can estimate the uncertainty of the distribution
without seeing many (or any) data points drawn therefrom, it just
underscores once again the point that the uncertainty is a property
of the distribution, not of any particular point drawn from it.

> However, the graphs you link to have changed. The first time I
> looked at the figures you linked to, there was a curve being fit to a
> set of data points, with error bands on the curve fitted to the set
> of points. (From my interpretation of your post) you were comparing
> the spread of the *curve* to the spread of the *points*.

That was never the intent. Actually, I don't see how that could
ever make sense, for the following reason:

Suppose we have a mathematical function f such that y=f(x).
We can read off x from the x-axis of the graph, so x must
be considered known. Therefore f(x) is known. Therefore
y is known. You can /either/ think of y as having error bars
/or/ think of y as being f(x) ... but not both.

As a constructive suggestion that may clarify this idea, let's
use capital Y to represent a distribution centered on the point
y=f(x). Subject to mild restrictions, we can write Y concisely
as Y = y ± σ. This should make it clear that f(x) tells us y
but does not tell us σ.
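
To make that concrete, here is a minimal sketch in Python; the
function f, the value of σ, and the Gaussian shape of Y are all
invented for illustration:

    import numpy as np

    def f(x):                     # the curve itself: completely certain
        return 2.0*x + 1.0        # arbitrary example function

    x = 3.0                       # x is read off the axis, so it is known
    y = f(x)                      # hence y = f(x) is known ... no spread here

    sigma = 0.5                   # the uncertainty; f alone cannot tell us this
    rng = np.random.default_rng(0)
    Y = rng.normal(loc=y, scale=sigma, size=100000)  # the distribution Y = y ± sigma

    print(y)                      # exactly 7.0, no uncertainty
    print(Y.mean(), Y.std())      # approx 7.0 and 0.5; the spread belongs to Y

Knowing f pins down the center of the distribution, but σ has to
come from somewhere else entirely.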

To say the same thing yet another way: yes, there is some "spread
in the curve", but this spread contributes nothing to the uncertainty.
The spread in f(x) is completely certain.

I have redrawn one of the figures and added more figures and words
to make it super-explicit that the x-dependence is irrelevant to
the point I am making: The idea of uncertainty applies to
distributions, not to any particular point drawn from the distribution.
http://www.av8n.com/physics/probability-intro.htm#sec-bars


PS In industrial settings, this is part of a process called "Gage
R&R" or "Gauge R&R" (repeatability and reproducibility). There they are
concerned about different people with different instruments being
able to measure different parts in a consistent, accurate manner.
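
(For illustration only: a toy numerical sketch of the idea, using a
made-up model in which each measurement is part value + operator bias
+ repeatability noise, with invented numbers throughout:)

    import numpy as np
    rng = np.random.default_rng(1)

    parts     = rng.normal(10.0, 1.0, size=5)   # true part-to-part variation
    operators = rng.normal(0.0, 0.2, size=3)    # reproducibility: operator-to-operator bias
    sigma_rep = 0.1                             # repeatability: same operator, same part

    # each operator measures each part 10 times
    data = np.array([[p + o + rng.normal(0.0, sigma_rep, 10)
                      for p in parts] for o in operators])

    # crude decomposition: how much spread does each source contribute?
    print("repeatability   ~", data.std(axis=2).mean())       # roughly 0.1
    print("reproducibility ~", data.mean(axis=(1, 2)).std())  # roughly 0.2
    print("part-to-part    ~", data.mean(axis=(0, 2)).std())  # roughly 1.0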

Let's be clear: If we write Y = y ± σ, where y is a function of x,
it may be that σ is also some function of x, such that
y = f(x) [not the uncertainty]
σ = g(x) [the uncertainty]

The fact remains that f(x) and g(x) are different concepts. They might
be related in some way (e.g. Poisson processes) or completely unrelated
(e.g. IID Gaussian processes). AFAICT the "spread" of the curve f(x) is
not a good way to think about uncertainty.
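
To make the Poisson-vs-Gaussian contrast concrete, here is a minimal
sketch, with f chosen arbitrarily: for a Poisson counting process the
uncertainty is tied to the curve via g(x) = sqrt(f(x)), whereas for
IID Gaussian noise g(x) is a constant that has nothing to do with f:

    import numpy as np

    def f(x):                              # the curve (arbitrary example)
        return 100.0 * np.exp(-x)

    x = np.linspace(0.0, 3.0, 7)

    sigma_poisson  = np.sqrt(f(x))         # Poisson counts: g(x) = sqrt(f(x))
    sigma_gaussian = np.full_like(x, 2.5)  # IID Gaussian: g(x) = const, unrelated to f

    for xi, sp, sg in zip(x, sigma_poisson, sigma_gaussian):
        print(f"x={xi:4.2f}  f(x)={f(xi):7.2f}  sigma_P={sp:5.2f}  sigma_G={sg:4.2f}")

Note that even in the Poisson case the relation comes from knowing the
process is Poisson, not from the curve f by itself: g is always a
separate piece of information.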