A long-T radioactive sample was "counted" for one minute. The outcome
was A = 145 --> st.dev. = 12. Then the background was measured for one
minute, yielding B = 41 --> st.dev. = 6.4.
1) The estimated net count A-B is 104. What error bar (st.dev.) should be
used for this result? I know it would be 18.4 if the result were
obtained from a multiplication, or from a division, of A and B.
What is the expected standard deviation of a difference, or sum, in
terms of the known sig_A and sig_B?
OK, I'll have a go at it. Use 'd' for full derivative, 'D' for partial.
S = A - B
        DS          DS
dS  =  ---- * dA + ---- * dB
        DA          DB

    =  dA - dB      [since DS/DA = 1 and DS/DB = -1; the original
                     post had dA + dB, a typing error]
So, assuming the fluctuations in A and B are independent, the
deviations add in quadrature (the sign of dB drops out when squared):

   sig_S = sqrt(sig_A^2 + sig_B^2) = sqrt(12^2 + 6.4^2) ~= 13.6
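As a quick check, here is a minimal Python sketch of the quadrature rule
applied to the numbers in the thread (the function name diff_stddev is my
own, not from the original post):

```python
import math

def diff_stddev(sig_a, sig_b):
    """Standard deviation of a sum or difference of two
    independent quantities: errors add in quadrature."""
    return math.sqrt(sig_a**2 + sig_b**2)

# Numbers from the thread: gross count A = 145 (sig_A = 12),
# background B = 41 (sig_B = 6.4), net count A - B = 104.
sig_net = diff_stddev(12.0, 6.4)
print(round(sig_net, 1))  # -> 13.6
```

Note that the quoted 18.4 is just 12 + 6.4, i.e. the errors added
linearly; for independent counts the quadrature sum (~13.6) is the
right error bar on the net result.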