
Re: purposefully modifying a distribution to get...?



Mmm ... since no one else is participating (I'm surprised, as this is LK et al.'s line), I'll make another stab.

Remember I'm (relatively) maths illiterate.

Stefan Jeglinski wrote:

Is the data removed in a non-random manner?

Yes. Since the problem occurs at finite array boundaries, any pulses
removed will be clumped around those boundaries in time. IOW, if an
array is 500 units long, and a pulse is on the order of 15 units
long, one would find, at time 500, 1000, 1500 (etc) enforced
deadspots of up to 30 units wide, depending on exactly how 1 (or 2)
pulses overlapped the boundary.

Now there are two cases of removed pulses (data): at boundaries (which I don't understand) and at pulse pile-up. My impression is that the pulse height (I assume this is the info of interest) is independent of the boundaries; therefore, no problem. If pulse height is likewise independent of pulse pile-up, then also no problem.
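To make that concrete, here is a minimal Monte Carlo sketch (Python; the 500-unit array and 15-unit pulse are the numbers quoted above, while the rate and the two "element" heights are made up for illustration). A cut keyed only to arrival time leaves the pulse-height fractions untouched:

    import numpy as np

    rng = np.random.default_rng(0)

    # Numbers from the thread: 500-unit arrays, ~15-unit pulses.
    array_len, pulse_len = 500, 15
    rate = 0.02  # pulses per time unit -- made up for the demo

    # Poisson arrival times; each pulse is independently "iron" or "copper".
    arrivals = np.cumsum(rng.exponential(1.0 / rate, 100_000))
    heights = rng.choice([1.0, 2.0], size=arrivals.size)  # 1.0 = iron, 2.0 = copper

    # Drop any pulse that straddles an array boundary (a multiple of array_len).
    straddles = (arrivals // array_len) != ((arrivals + pulse_len) // array_len)
    kept = heights[~straddles]

    print(f"pulses dropped at boundaries: {straddles.sum()} of {arrivals.size}")
    print(f"'copper' fraction before cut: {(heights == 2.0).mean():.4f}")
    print(f"'copper' fraction after cut:  {(kept == 2.0).mean():.4f}")
    # Both fractions come out ~0.5: the cut is blind to height,
    # so the shape of the PHA spectrum is preserved.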



Of course, the pulse processor itself can enforce dead spots in the
data stream much larger than 15 units because of pulse pileup,
especially at high count rates, but those dead spots can occur
anywhere.

e.g. a finite resolving time causes Poisson time data to be skewed
(I wrote about this a few days ago, including the error functions)

Are you referring to the thread "counts in an interval"?

Yes; there the data of interest is the distribution of intervals. Since only short intervals are "lost" to the finite resolving time, the measured interval distribution is not "true." Intervals are thrown out based on the value of the interval itself, obviously not independent! I JUST REALIZED that throwing out short intervals in the G-M case is worse than I thought. Only the second pulse is lost, and the next one is treated as the end of the interval. Therefore, not only is the distribution skewed by the removal of short intervals, but spuriously longer ones are ADDED! A "double skewing."
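A minimal sketch of that double skewing (Python; I assume a non-paralyzable dead time, and the rate and resolving time are made-up numbers):

    import numpy as np

    rng = np.random.default_rng(0)

    rate = 200.0  # true count rate, counts/s -- made up
    tau = 1e-3    # resolving (dead) time, s -- made up
    n = 100_000

    # True Poisson process: exponential inter-arrival times.
    arrivals = np.cumsum(rng.exponential(1.0 / rate, n))

    # Non-paralyzable dead time: a pulse arriving within tau of the last
    # *recorded* pulse is swallowed entirely; the next pulse outside the
    # window is recorded and (wrongly) ends the measured interval.
    recorded, last = [], -np.inf
    for t in arrivals:
        if t - last >= tau:
            recorded.append(t)
            last = t

    gaps_true = np.diff(arrivals)
    gaps_meas = np.diff(np.array(recorded))

    print(f"pulses lost: {n - len(recorded)} of {n}")
    print(f"true mean interval:     {gaps_true.mean() * 1e3:.3f} ms")
    print(f"measured mean interval: {gaps_meas.mean() * 1e3:.3f} ms")
    print(f"shortest measured gap:  {gaps_meas.min() * 1e3:.3f} ms (never below tau)")

The measured mean comes out longer than the true mean for both reasons at once: the short gaps are gone, and every swallowed pulse merges two (or more) true intervals into one spuriously long one.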



If I understand your apparatus (pulse pile up is eliminated by removing
coincidences): if the pulse height and the arrival times of the
pulses are independent, I don't think it will matter. Otherwise
your PHA will be incorrect.

They are independent in the sense that in any particular interval,
one cannot predict if the pulse will be an "iron" one or a "copper"
one. One can only try to predict (via a Poisson process) that there
will be "a" pulse. Is there a more rigorous definition of
"independent" here?

I begin to see what the apparatus "does," I think. The pulse height is a function of the element that "created" it? The emission of a pulse by any particular atom of an element is random, with a rate determined by the element's concentration. "Sounds" independent to me.
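For what it's worth, the rigorous statement behind "independent" here (my formalization, not from the thread) is that the joint probability factorizes; for a Poisson process of rate $\lambda$:

$$P(\text{height} = h \ \text{and a pulse arrives in } [t, t+dt)) \;=\; P(\text{height} = h)\,\lambda\,dt$$

i.e., knowing when a pulse arrived tells you nothing about whether it is an "iron" pulse or a "copper" one, which is just what you described.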



To return to the dice analogy: if I roll the dice and then flip a
coin to decide whether to keep that datum, this is -roughly-
analogous to what the pulse processor does (-any- particular pulse
may or may not be kept depending on pileup). Whereas I am insisting
on flipping the coin only once every 10 throws (but -always- every
10th throw), to determine whether to throw the 10th, 20th (etc) datum
away.

This also "sounds" inde. Dependency would occur, for example, if you threw out data on heads (only) and if one die was even. As you've suggested, one must collect more data to maintain the std.
deviation.
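A quick sketch of the difference (Python; sample sizes made up). Independent thinning only costs statistics, while a cut tied to the dice value distorts the result:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000

    dice = rng.integers(1, 7, size=(n, 2))  # two dice per throw
    totals = dice.sum(axis=1)
    heads = rng.integers(0, 2, size=n).astype(bool)  # independent coin flips

    # Independent thinning: discard a throw whenever the coin shows heads.
    kept_ind = totals[~heads]

    # Dependent removal: discard only when the coin shows heads AND the
    # first die is even -- removal now depends on the data value itself.
    kept_dep = totals[~(heads & (dice[:, 0] % 2 == 0))]

    print(f"all throws, mean total:      {totals.mean():.3f}")    # ~7.000
    print(f"independent thinning, mean:  {kept_ind.mean():.3f}")  # ~7.000, just fewer samples
    print(f"dependent removal, mean:     {kept_dep.mean():.3f}")  # ~6.833, biased low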

bc

P.S. You've found an even more expensive method of collecting data than LK's G-M tube for teaching stats! I don't remember the original description (I'm lazy); is the system "like" XRF? In which case it's atomic probability instead of nuclear, but the maths is of similar form, I think. A Poisson distribution is necessary (but not sufficient) to confirm this.

I've just ordered from Vernier a newer version of LK's system. More later. (HS argot: Late.)





Stefan Jeglinski