
Re: counts in an interval



For the benefit of "lurkers", I am posting to the list rather than writing to LK directly with a cc to JD.

Using the data LK supplied to the list in the first post of this thread, I calculated the rate to be 13.335 cps; the method is inserted below. (Part of a message to LK.)
-----------------------
I've finally analyzed your data, as John did earlier, and I obtain a similar result. However, I don't know
how he obtained 2.77 counts per delta t of 0.2 sec. From your data I calculated a rate of 13.335 cps.
Here's how I did it: the total count is 7009 (0 x 123 + 1 x 467 + 2 x 674 + ....) and the total time is
2628 one-fifth-second intervals = 525.6 s. Counts per second is then 7009 / 525.6 = 13.3352...
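
As a sketch, the same arithmetic in Python (the 7009 counts and 2628
intervals are simply the totals quoted above, not re-derived from the
full file):

# Rate from the binned data quoted above.
total_counts = 7009          # sum over m of m * (intervals with m counts)
n_intervals = 2628           # number of 0.2-s intervals
dt = 0.2                     # seconds per interval

total_time = n_intervals * dt        # 525.6 s
rate = total_counts / total_time     # ~13.335 counts per second
rt = rate * dt                       # ~2.667 counts expected per interval
print(rate, rt)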

rt is then 2.667; JD gets (or uses?) 2.77. Using the unrounded rt in the formula, I calculated the terms for m = 0 through 6; their sum is 0.9779. Not a good sign?

Is this the same data as in your (joint) paper? No time periods with 7 counts? P(7,t) = 0.01318..., which is ~1/3 of P(6,t). P(8,t) is nearly zero, ~4.9E-3.
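
(A small Python check of those Poisson terms; the exact digits depend on
how far rt is carried:)

# Poisson terms g(m,t) = (rt)^m exp(-rt) / m! for rt ~ 2.667.
from math import exp, factorial

rt = 2.667
g = [rt**m * exp(-rt) / factorial(m) for m in range(9)]

print(sum(g[:7]))    # sum of the m = 0..6 terms, a bit short of 1
print(g[7], g[8])    # P(7,t) and P(8,t), small but not negligible
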
--------------------------------
cont'd. If one includes P(7,t) and P(8,t), the sum is ~0.996; now internally consistent? Adding in the
corresponding extra time and counts: from P(7,t), 0.013... x 2628 x 0.2 = 6.928... s, and from P(8,t),
0.00439... x 2628 x 0.2 = 2.3... s, giving a new total of 534.6 s. The new total count (e.g. 0.013 x 2628
x 7 = 243 <rounded up>) is 243 + 92 + 7009 = 7344, so the rate = 7344 counts / 534.6 s = 13.74 <rounded>
and rt = 2.75; much closer, but still not JD's. And the new predicted number of m = 0 intervals is 171
instead of the old calculated 181; the experimental number is 123. I suppose by iteration one could
eventually get a fit; an NLLSq fit would be much quicker -- did JD do that?
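
The sort of fit I mean would look something like this in Python/SciPy,
using only the six frequencies LK posted (m = 0 through 5) and the rate
as the single free parameter; just a sketch, not JD's calculation:

# Non-linear least-squares fit of a Poisson shape to the observed
# frequencies (m = 0..5 only; 2628 intervals in total).
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import factorial

m = np.arange(6)
observed = np.array([123, 467, 674, 604, 503, 172], dtype=float)
n_intervals = 2628.0

def model(m, rt):
    # expected number of 0.2-s intervals containing m counts
    return n_intervals * rt**m * np.exp(-rt) / factorial(m)

(rt_fit,), cov = curve_fit(model, m, observed, p0=[2.7])
print(rt_fit, rt_fit / 0.2)     # best-fit rt and the implied counts/sec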

I think the causes of the disagreement are:

A. Insufficient data (the P(7,t) and higher intervals, etc.).
B. Not only may the counter's non-zero resolution be dropping counts, but the computer scaler may be as well.

It's not immediately obvious to me which way this skews the result, or whether it would be significant. Remember that the resolution of LK's counter is probably ~half a millisecond.
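
If the loss behaves like an ordinary non-paralyzable dead time, the
standard correction gives a feel for the size; the half-millisecond
figure is only the guess at the counter's resolution mentioned above:

# Non-paralyzable dead-time correction: r_true = r_obs / (1 - r_obs * tau).
r_obs = 13.34        # observed counts per second
tau = 0.5e-3         # assumed dead time / resolution, in seconds

r_true = r_obs / (1.0 - r_obs * tau)
print(r_true, 100.0 * (r_true - r_obs) / r_obs)   # corrected rate, % shift (<1%)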

bc

P.s. UCSC's intermediate physics "practical" includes this data collection and analysis (required) <not the time-interval analysis, just the frequency one; however, the text during the '80s was
Melissinos, which included a detailed description of an interval analysis using a strip-chart recorder and background from a G-M counter>. The lab rooms have (had) at least five MCA/MCS's; their
use was proscribed! The student must laboriously collect the hundred or more data points "by hand", using scalers (actually frequency meters with a ten- or one-second gate). If they were to use the
MCS's, they would then download the data to the CATS server, write an AWK program to reduce the data, and a C program to complete the analysis, including an NLLSq fit and a chi-square test. The
"philosophy" of the intermediate lab has been to eschew "black boxes", obviously!

Ludwik Kowalski wrote:

John Denker wrote:

Observation: If we "round off" 2.77 to 3, then the
middle rt value is 5 times bigger than the small one, and
5 times smaller than the big one. This leads to a hypothesis,
namely repeated mistakes in converting between "counts per
bin" and "counts per second". However, my crystal ball
is a bit murky at the moment, so other possible explanations
can't be ruled out.

Not murky at all. I was not able to check everything, but
one thing is clear: the mean counting rate was 11.8 and not
3.08. I wish I could honestly say that "it was a typo". In any
case let us compare experimental data and predictions now.

According to the Poisson formula [g(m,t) = (rt)^m exp(-rt) / m!]
with r*t = 11.8*0.2 = 2.36, one finds:

m=0 --> 253 outcomes (instead of experimental 123)
m=1 --> 596 outcomes (instead of experimental 467)
m=2 --> 704 outcomes (instead of experimental 674)
m=3 --> 554 outcomes (instead of experimental 604)
m=4 --> 327 outcomes (instead of experimental 503)
m=5 --> 154 outcomes (instead of experimental 172)

This is reasonable but far from being satisfactory. Let me
check this on another saved experimental file (Timing 3).
It had 7004 events. The first 20 events are shown below;
these are arrival times in seconds. The last digits are
likely to be affected by rounding or truncation. As you
can see, there were 5 counts in the first 0.3 s interval,
3 in the next, 4 in the next, etc.

The first waiting time was only 3 ms, the next was 70 ms,
the next was 25 ms, etc. The distribution of waiting times
is exponential, and the distribution of counts per dt=0.3 s has
the Poissonian shape. The mean counting rate was 12.59
cnts/sec. Do the experimental data agree EXACTLY with
the Poisson formula? I will be happy to send this text file
to anybody who asks for it. Use any software you want
and share observations.

The only person who responded to my question about
whether this activity is useful was John D. Do you agree
with his statement that this is mathematics, not physics?
I disagree; counting experiments are the way to explore
the random nature of radioactive transformations. How
else can it be explored?

Would you use the described activity with students if
the program to collect data on the arrival times (and to
display the distributions while data are collected) were
built into the Vernier software? As I wrote before,
I am trying to persuade Vernier to add such a routine to
their package. So please answer this question on
Phys-L; Vernier people are physics teachers and some
of them are on our list. Elaborate on your opinion and
suggest what else you would do with such software,
if it were made available.

The Timing3 file (see the sample below) was checked for
the constancy of counting rates. I did this to be sure that
nobody changed the geometry (moved something) when I
was in the next room. The mean cnts/sec at the end of
the run was practically the same as at the beginning.
Ludwik Kowalski
0.048
0.051
0.121
0.146
0.283
0.320
0.494
0.506
0.670
0.711
0.733
0.797
0.916
1.009
1.035
1.039
1.180
1.276
1.440
1.483
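
For anyone who does get the Timing3 file from LK, here is a short
Python sketch of one reduction, assuming (as in the sample above) one
arrival time in seconds per line; "timing3.txt" is just a placeholder
file name:

# Read arrival times, then look at waiting times, counts per 0.3-s
# interval, and whether the rate drifted between the halves of the run.
import numpy as np

t = np.loadtxt("timing3.txt")           # arrival times, assumed sorted

waits = np.diff(t)                      # waiting times; should be ~exponential
dt = 0.3
bins = np.arange(0.0, t[-1] + dt, dt)
counts, _ = np.histogram(t, bins=bins)  # counts per interval; ~Poissonian

mid = len(t) // 2
print(len(t) / t[-1])                                     # mean counts per second
print(mid / t[mid], (len(t) - mid) / (t[-1] - t[mid]))    # first- vs second-half rate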