
[Phys-L] "dealing" with v. noisy data.



For a number of years I've been analysing data collected from a very high-Q pendulum; the data are very noisy because the pendulum swings "in air".

The data collected are the amplitude inferred from the speed of a flag interrupting an IR pencil at (near) BDC (bottom dead centre).

The speed does NOT always decrease monotonically! I have variously averaged or smoothed (a running average) the interrupt time, the inferred amplitude (period / interrupt time), the square of the amplitude, the differences of the squares, or the resulting Q [2*pi * A^2 / diff(A^2)].
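For concreteness, here is a sketch of that Q estimate (the function and variable names are mine, not from the original analysis; I assume one data point per pass and square before smoothing):

```python
import numpy as np

def q_from_interrupt_times(dt, period, window=11):
    """Estimate Q per step from photogate interrupt times.

    dt: interrupt (flag transit) times per pass, seconds.
    period: pendulum period, seconds.
    window: running-average length applied to A^2.
    """
    A = period / dt                     # inferred amplitude: A ~ speed ~ 1/dt
    A2 = A ** 2                         # square FIRST, then smooth
    kernel = np.ones(window) / window
    A2s = np.convolve(A2, kernel, mode="valid")   # running average of A^2
    dA2 = -np.diff(A2s)                 # loss per step (positive for a decay)
    return 2 * np.pi * A2s[:-1] / dA2   # Q ~ 2*pi * A^2 / diff(A^2)

# Smoke test on a synthetic exponential decay with Q = 5000:
n = np.arange(2000)
A_true = np.exp(-np.pi * n / 5000.0)    # amplitude after n cycles
dt = 1.0 / A_true                       # transit time ~ 1/speed, period = 1 s
print(q_from_interrupt_times(dt, 1.0)[:3])
```

On noiseless synthetic data this recovers Q to well under a percent; on real data the `diff` in the denominator is exactly where the near-equal-adjacent-values problem below bites.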

I finally found it obvious that averaging before squaring is a no-no, as doing so produces a cross term.
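The cross term is easy to see with a pair of points (made-up numbers): averaging then squaring gives (a^2 + 2ab + b^2)/4, while squaring then averaging gives (a^2 + b^2)/2, and the two differ by (a - b)^2 / 4, i.e. the local variance of the noise. Smoothing before squaring therefore biases A^2 low:

```python
a, b = 3.0, 5.0
avg_then_sq = ((a + b) / 2) ** 2    # (a^2 + 2ab + b^2)/4 = 16.0
sq_then_avg = (a ** 2 + b ** 2) / 2 # (a^2 + b^2)/2       = 17.0
print(avg_then_sq, sq_then_avg)
print(sq_then_avg - avg_then_sq)    # (a - b)^2 / 4 = 1.0
```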

The principal problem is when adjacent data values are nearly the same. Should I average until it no longer produces an outlier, or delete one? I suppose it's OK to always delete one of an adjacent pair when they are the same (within the resolution of the PG). Really, this happens every run. (Many runs have > 5k points, with Q's > 5k.)
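Since near-equal adjacent values make diff(A^2) nearly zero and blow up the Q estimate, one alternative to deleting is to merge each run of within-resolution values into its mean, so no real data point is discarded. A minimal sketch (the function name and tolerance handling are my own assumptions):

```python
def merge_near_equal(a2, tol):
    """Collapse runs of adjacent A^2 values within `tol` into their mean.

    `tol` would be set from the gate's timing resolution; every raw
    point still contributes, so nothing is actually thrown away.
    """
    out = []
    run = [a2[0]]
    for x in a2[1:]:
        if abs(x - run[-1]) < tol:
            run.append(x)               # still within resolution: extend run
        else:
            out.append(sum(run) / len(run))
            run = [x]
    out.append(sum(run) / len(run))     # flush the final run
    return out

print(merge_near_equal([9.0, 9.0, 8.9998, 8.5, 8.1], tol=1e-3))
```

The merged series is strictly decreasing wherever the underlying decay is, so the subsequent diff(A^2) never sees a spurious near-zero denominator from duplicate readings.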

I suppose my problem is that I'm loath to delete data that are real; i.e., if anything, adjacent pairs that are nearly the same are just as valid as cases where the amplitude increases during a free decay.

bc