
[Phys-L] timestamped events; was: Bayesian Inference in Half-Life measurement



Way back on 11/6/18 12:19 PM, Paul Nord wrote:

The experiment starts with a single spinning disk. The student then drops
a second disk on the first (effectively doubling the moment of inertia).

Here's the rub. The uncertainty on the mean should properly be
calculated as sigma/sqrt(n) for a set of independent random samples. But
I don't think that is valid in this case because the samples are
correlated. Vernier's software may even be doing some smoothing and
approximation for angular positions that fall between the position
encoder's discrete digitization steps.
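The point about correlated samples is easy to demonstrate with a quick
simulation (my illustration, using an AR(1) series, not anyone's actual
lab data). The sigma/sqrt(n) formula assumes independence; with positive
correlation it understates the true scatter of the mean severalfold.

```python
# Sketch: for positively correlated samples, the naive sigma/sqrt(n)
# formula understates the true scatter of the sample mean.
import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n=1000, phi=0.9):
    """An AR(1) series x[i] = phi*x[i-1] + noise (correlated samples)."""
    noise = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = noise[0]
    for i in range(1, n):
        x[i] = phi * x[i - 1] + noise[i]
    return x

x = ar1_series()
naive = x.std(ddof=1) / np.sqrt(len(x))   # the textbook formula

# Empirical scatter of the mean over many independent realizations:
means = np.array([ar1_series().mean() for _ in range(500)])
true_se = means.std(ddof=1)

print(naive, true_se)   # the naive estimate comes out several times too small
```

For AR(1) the variance of the mean is inflated by roughly a factor of
(1+phi)/(1-phi), i.e. about 19 here, so the naive error bar is off by a
factor of roughly four.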

We had a long discussion of this; the gist is that the data was
getting trashed by the Vernier data acquisition system. It insists
on representing things as a frequency, which in this case is the
rotational rate.

On 9/22/21 8:14 AM, Paul Nord wrote:

After removing the copper from the neutron bath we measure the
activity of the sample with a simple Geiger counter. Within the
undergraduate lab period we can take data for most of the short
half-life decays. Students then come back each of the next few days to
take another data set, in order to pin down the longer half-life.

Traditionally we have them fit a double exponential function to this
data. But this is fraught with problems and subtle choices in the
analysis that will change the result. And it's not how these half-lives
are really measured. Obviously, in a precision measurement you'd want
continuous data collection.
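To make the "subtle choices" concrete, here is a minimal sketch of the
traditional binned analysis (synthetic data, not anyone's actual lab
data; the half-lives are roughly those of Cu-66 and Cu-64, the usual
products of neutron activation of copper). The result shifts with the
choice of bin width, fit range, and starting guesses.

```python
# Sketch of the traditional binned double-exponential fit (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def two_component(t, a1, tau1, a2, tau2):
    # counts per bin: sum of two exponential decays
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

rng = np.random.default_rng(1)
tau_short = 5.1 / np.log(2)     # mean life in minutes, roughly Cu-66
tau_long = 762.0 / np.log(2)    # mean life in minutes, roughly Cu-64
t_events = np.concatenate([
    -tau_short * np.log(rng.random(3000)),   # simulated decay times
    -tau_long * np.log(rng.random(3000)),
])

# Bin into 1-minute windows over a 2-hour run and fit counts vs. time.
edges = np.arange(0.0, 121.0, 1.0)
counts, _ = np.histogram(t_events, bins=edges)
centers = 0.5 * (edges[:-1] + edges[1:])
popt, _ = curve_fit(two_component, centers, counts,
                    p0=(400.0, 5.0, 5.0, 500.0),
                    bounds=(1e-3, np.inf))
print(popt)   # fitted amplitudes and mean lives
```

Try changing the bin width or the fit range and watch the fitted
half-lives move; that is the fraughtness in action.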

On 9/22/21 8:47 PM, Paul Nord wrote:

You'd really enjoy David MacKay's lecture 10 on Information theory.

========================

Let me address all three of those things at once.

1) MacKay in his lecture said "We don't like bins. We don't like windows".
He said that about ten times, mentioning ten different ways that bins
get you into trouble.

All the data he used in his examples came to us without bins; each and
every point was *timestamped*.

2) As for the spindown of the rotor, I don't want the rotation rate. That
is, I don't want the number of shaft-encoder events in some fixed window.
What I want is the *timestamp* of the shaft-encoder events. Every single
event, timestamped to the nearest microsecond or better (preferably the
nearest nanosecond or better).
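With the timestamps in hand, the rotation rate (if you ever want it)
falls out at full resolution instead of being averaged over a fixed
window. A sketch, assuming a hypothetical encoder with 360 pulses per
revolution:

```python
# Sketch: instantaneous angular velocity from per-event timestamps.
import numpy as np

def omega_from_timestamps(t, ppr=360):
    """Angular velocity (rad/s) for each interval between encoder events.

    t   -- event timestamps in seconds
    ppr -- encoder pulses per revolution (hypothetical value here)
    """
    t = np.asarray(t, dtype=float)
    dtheta = 2 * np.pi / ppr       # angle advanced per event
    return dtheta / np.diff(t)     # one omega per inter-event interval

# Example: a rotor spinning down, so the intervals get longer
t = [0.0, 0.010, 0.021, 0.033, 0.046]   # seconds, hypothetical
print(omega_from_timestamps(t, ppr=360))
```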

I even spoke to the Vernier engineers about this. They said yeah, yeah,
we know what you want and why you want it. That would be nice, but due
to some assumptions that got baked into the system architecture 20 years
ago, we can't do that. (I'm not sure I believe them, but I don't get paid
enough to argue the point or design their equipment for them.)

3) As for the radioactive decay: Once again, I don't want the number of
decays in any 10-minute window, or 1-minute window, or any window at all.
What I want is the *timestamp* of each and every click from the Geiger
counter.

=========

Are y'all beginning to detect a pattern here?

Fun facts:
-- You can get a PJRC Teensy 4.1 development board for less than
30 bucks:
https://www.ebay.com/itm/184467444434
-- You can get the relevant software for free:
https://github.com/PaulStoffregen/FreqMeasureMulti
https://www.pjrc.com/teensy/td_libs_FreqMeasure.html

Don't be fooled by the name of the software bundle: It says "frequency"
but in the fine print it says it puts a *timestamp* on every input event,
and it will happily give you this information.

The timestamps have nanosecond resolution. I suspect they're reliable
to "only" a fraction of a microsecond ... but compared to the 1 minute
windows being used now, I guess I could be persuaded to settle for a
fraction of a microsecond.

You could spend hundreds or thousands of dollars to get something that
would not work as well. I have not researched it, and don't claim the
solution outlined above is optimal, but it's unlikely that the optimal
setup is significantly cheaper or significantly better.

There is some cognitive workload required to get up to speed with the
Teensy, but I guarantee you there are folks over in the robotics
department who can answer your questions.

Note that the Geiger counter itself has a dead time on the order of 100
or 200 microseconds, and I guarantee that the Teensy can handle each
interrupt faster than that. So even though it has a very small brain,
it will not damage the data or limit the data rate.

Presumably you noticed that MacKay didn't mention Poisson statistics in
his lecture. That's because with timestamped data you don't need it. You
never have more than one point in any bin, because there are no bins. So
this approach is fraught with vastly fewer troublesome decisions.
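For a single species the bin-free analysis is almost embarrassingly
simple. A sketch with simulated timestamps (my illustration): if the run
is long compared to the mean life, so that essentially all the decays
get recorded, the maximum-likelihood estimate of the mean life is just
the average of the event times.

```python
# Sketch: unbinned maximum-likelihood estimate of a mean life from
# timestamps alone (assumes the run is long compared to the mean life).
import numpy as np

rng = np.random.default_rng(2)
tau_true = 7.0                                 # mean life, arbitrary units
t = -tau_true * np.log(rng.random(10_000))     # simulated decay timestamps

tau_hat = t.mean()              # the unbinned MLE of the mean life
half_life = tau_hat * np.log(2)
print(tau_hat, half_life)
```

No bin width to choose, no fit range to choose, no starting guesses.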

And you can leave the experiment running for 120 hours straight. Record
the events. Every single event, timestamped to the nearest microsecond or
better. No biases. No tradeoffs. No worries.

And you get to apply the same timestamp technique to the shaft encoder.
The device has multiple channels, so you can encode multiple bits of
the Gray code. (You need two, in order to determine direction of rotation.
The rest are icing on the cake.)
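For what it's worth, the two-channel decoding logic is simple; a sketch
(my illustration, using the standard quadrature state order):

```python
# Sketch: direction of rotation from two quadrature channels (a 2-bit
# Gray code). Forward rotation cycles 00 -> 01 -> 11 -> 10 -> 00.
GRAY_FORWARD = {(0b00, 0b01), (0b01, 0b11), (0b11, 0b10), (0b10, 0b00)}

def step_direction(prev, curr):
    """+1 forward, -1 reverse, 0 no change; raises on a missed transition."""
    if prev == curr:
        return 0
    if (prev, curr) in GRAY_FORWARD:
        return +1
    if (curr, prev) in GRAY_FORWARD:
        return -1
    raise ValueError("missed a transition (two bits changed at once)")

# Example: three forward steps, then two reverse steps
states = [0b00, 0b01, 0b11, 0b10, 0b11, 0b01]
steps = [step_direction(a, b) for a, b in zip(states, states[1:])]
print(steps)   # [1, 1, 1, -1, -1]
```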

If for some reason you want to see what the data looks like in bins, you
can always convert timestamped data to binned data (just not vice versa).
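In Python, for instance, the conversion is a one-liner (hypothetical
timestamps shown):

```python
# Sketch: timestamped data -> binned data. The reverse is impossible,
# since binning throws away the arrival times.
import numpy as np

timestamps = np.array([0.3, 1.1, 1.7, 2.2, 4.9, 5.0])   # seconds, hypothetical
counts, edges = np.histogram(timestamps, bins=np.arange(0, 7, 1.0))
print(counts)   # events per 1-second window: [1 2 1 0 1 1]
```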

Your students will thank you for this. In the real world, an employee
who knows about timestamping is a lot more valuable than one who thinks
everything needs to be binned.

Also, generally speaking, there are a *lot* of problems around the lab
that can be solved by sticking a microcomputer into the apparatus. This
was a revolutionary idea 50 years ago, but it's pretty much standard
good practice nowadays.