
Re: purposefully modifying a distribution to get...?



> >Is the data removed in a non random manner?
>
> Yes. Since the problem occurs at finite array boundaries, any pulses
> removed will be clumped around those boundaries in time. IOW, if an
> array is 500 units long, and a pulse is on the order of 15 units
> long, one would find, at time 500, 1000, 1500 (etc) enforced
> deadspots of up to 30 units wide, depending on exactly how 1 (or 2)
> pulses overlapped the boundary.

Now there are two cases of removed pulses (data): at boundaries
(which I don't understand) and at pulse pileup.

Yes, this point has gotten lost a bit. In a real instrument, the only
cause of "arbitrary" pulse rejection is pulse pileup. I am creating a
simulation for a software project that will operate on the pulses. At
this point the software itself is also creating an imaginary pulse
stream. While the software will simulate pulse pileup events, it is
hampered by finite array size, meaning that the software must look at
snapshots, or windows if you will, of data. But since the software is
creating pulses via a Poisson subroutine, it sometimes places a pulse
at the very end or very beginning of a window. This artifact is
purely one of finite memory available. Each pulse is 15-20 memory
locations wide. A pulse pileup may be 20 or even 200 memory locations
wide. If any of these memory locations are cut off by either the
beginning or end of a window (no more memory until the current
window can be analyzed), the whole pulse, piled up or not, must be
discarded.
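
In rough terms (this is not the actual routine, and the rate below is
a made-up number), the generation-plus-rejection step amounts to
something like this:

import numpy as np

rng = np.random.default_rng(0)

WINDOW = 500       # memory locations per window (number from above)
PULSE_WIDTH = 15   # locations occupied by a single pulse
RATE = 0.01        # mean pulses per location -- illustrative only

def simulate_window():
    """One window of Poisson-generated pulse start times, split into
    kept and discarded according to the boundary rule."""
    starts, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / RATE)  # exponential gaps -> Poisson stream
        if t >= WINDOW:
            break
        starts.append(t)
    # Anything overlapping the end of the window (and, in the real stream,
    # anything spilling in from the previous window) is discarded whole.
    kept      = [s for s in starts if s + PULSE_WIDTH <= WINDOW]
    discarded = [s for s in starts if s + PULSE_WIDTH > WINDOW]
    return kept, discarded

kept, discarded = simulate_window()
print(len(kept), "kept;", len(discarded), "lost at the window boundary")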

My impression is that the pulse height (I assume this is the info
of interest) is independent of the boundaries; therefore, no
problem. Now if pulse pileup and pulse height are also
independent, then also no problem.

All independent, at least in my understanding of the word "independent."


> Of course, the pulse processor itself can enforce dead spots in the
> data stream of much larger than 15 units because of pulse pileup,
> especially at high count rates, but those deadspots can occur
> anywhere.
>
> >e.g. a finite resolving time causes poisson time data to be skewed
> >(I wrote this a few days ago, including the error functions)
>
> Are you referring to the thread "counts in an interval"?

Yes, there the data of interest is the distribution of intervals.
Since only short intervals are "lost" due to finite resolving time,
the measured interval distribution is not "true." Intervals are
thrown out based on the value of the interval, so obviously not
independent! I JUST REALIZED that throwing out short intervals in
the G-M case is worse than I thought. Only the second pulse is
lost, and the next one is treated as the end of the interval.
Therefore, not only is the distribution skewed by removal of short
intervals, but longer ones are incorrectly ADDED! A "double
skewing."

I must review this and determine if it applies to my situation. I'm
not sure that it does.


> > If I understand your apparatus (pulse pile up is eliminated by removing
> >coincidences): if the pulse height and the arrival times of the
> >pulses are independent, I don't think it will matter. Otherwise
> >your PHA will be incorrect.
>
> They are independent in the sense that in any particular interval,
> one cannot predict if the pulse will be an "iron" one or a "copper"
> one. One can only try to predict (via a Poisson process) that there
> will be "a" pulse. Is there a more rigorous definition of
> "independent" here?

I begin to see what the apparatus "does," I think. The pulse height
is a function of the element that "created" it? The particular
element's atom "emitting" the pulse is random, with a rate determined
by the element's concentration. "Sounds" independent to me.

Exactly.


> To return to the dice analogy: if I roll the dice and then flip a
> coin to decide whether the keep that datum, this is -roughly-
> analogous to what the pulse processor does (-any- particular pulse
> may or may not be kept depending on pileup). Whereas I am insisting
> on flipping the coin only once every 10 throws (but -always- every
> 10th throw), to determine whether to throw the 10th, 20th (etc) datum
> away.

This also "sounds" inde. Dependency would occur, for example, if
you threw out data on heads (only) and if one die was even. As
you've suggested, one must collect more data to maintain the std.
deviation.

Yes, this is a good comparison I think.
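
A toy Monte Carlo of the comparison (arbitrary sample size, nothing
from the real processor) bears this out: both rejection schemes leave
the die's frequencies alone and only shrink the sample size:

import numpy as np

rng = np.random.default_rng(2)
rolls = rng.integers(1, 7, 100_000)   # the "true" stream of die throws

# Scheme A: an independent coin flip decides whether each datum is kept
# (rough stand-in for pileup rejection hitting any particular pulse).
kept_a = rolls[rng.random(rolls.size) < 0.5]

# Scheme B: every 10th datum is always thrown away (the windowing scheme).
kept_b = rolls[np.arange(rolls.size) % 10 != 9]

for name, data in (("all       ", rolls),
                   ("coin-flip ", kept_a),
                   ("every-10th", kept_b)):
    freqs = np.bincount(data, minlength=7)[1:] / data.size
    print(name, np.round(freqs, 3))   # all three rows are ~1/6 across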


P.S. You've found an even more expensive method of collecting data
than LK's G-M tube for teaching stats!

-Much- more expensive!

I don't remember the original description (I'm lazy); is the
system "like" XRF? If so, it's atomic probability instead of
nuclear, but the maths is of similar form, I think. The Poisson
distribution is necessary (but not sufficient) to confirm this.

Yes, very similar. FWIW, the project is the creation of a pulse
processor with virtually zero deadtime. IOW, do not reject the pulse
pileup events, but rather analyze them in real time to separate them
out, recovering the individual pulses that created the pileup event
to begin with. It turns out there are very few "pathological" pileups
that cannot be disentangled.
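
None of this is the actual algorithm, but as a rough illustration of
why most pileups are recoverable: if the pulse shape is known and the
two arrival times can be located, the overlapping amplitudes fall out
of a simple least-squares fit (a Gaussian stands in for the real pulse
shape here):

import numpy as np

rng = np.random.default_rng(3)

def pulse(t, t0, width=5.0):
    """Unit-amplitude pulse shape; a Gaussian purely for illustration."""
    return np.exp(-0.5 * ((t - t0) / width) ** 2)

t = np.arange(0.0, 60.0, 1.0)
true_amps, true_times = (3.0, 1.5), (20.0, 28.0)   # second pulse piles up

signal = (true_amps[0] * pulse(t, true_times[0])
          + true_amps[1] * pulse(t, true_times[1])
          + 0.05 * rng.normal(size=t.size))        # a little noise

# Given the two arrival times (known here "for free"; in practice found
# by some peak search), the amplitudes follow from linear least squares.
design = np.column_stack([pulse(t, true_times[0]), pulse(t, true_times[1])])
amps, *_ = np.linalg.lstsq(design, signal, rcond=None)
print("recovered amplitudes:", np.round(amps, 2))  # ~ [3.0, 1.5]

The "pathological" cases would then be the ones where the two arrivals
sit too close together for any peak search to tell them apart.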


Stefan Jeglinski