
Re: [Phys-L] probability ... was: Bayes Theorem and Medical Tests

On 10/24/18 1:09 PM, Don via Phys-l wrote:

It was nice to see that a little training in probability theory could come
in handy.


Actually, that's quite an understatement. Most of post-1900
physics depends on probability. (Relativity is an exception.)
-- Quantum mechanics, obviously.
-- Statistical mechanics, including diffusion, mobility,
conductivity, viscosity, et cetera.
-- Entropy is defined in terms of probability, and this is
how we understand thermodynamic efficiency, reversibility,
spontaneity, stability, et cetera.
-- Design of experiment, e.g. the Twelve Coins puzzle.
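To make the entropy/probability connection above concrete, here is a minimal sketch (my own illustration, not from the original post) of Shannon entropy computed directly from a discrete probability distribution. It also hints at why the Twelve Coins puzzle is a design-of-experiment problem: each weighing has three outcomes, so three weighings can distinguish at most 3^3 = 27 cases, enough to cover the 24 possibilities of 12 coins times heavy-or-light.

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits

# Twelve Coins: 24 equally likely hypotheses need log2(24) ~ 4.58 bits;
# each 3-outcome weighing supplies at most log2(3) ~ 1.58 bits,
# so at least ceil(4.58 / 1.58) = 3 weighings are required.
print(math.log2(24) / math.log2(3))  # just under 3
```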

And it's not just physics:
-- We live in an "information age", and information is
intimately related to entropy.
-- Data compression (for transmission and storage).
-- Encryption (for authentication as well as privacy).
-- Physicians aren't the only ones doing statistical inference.
A great deal of AI depends on this.
-- Not to mention elections! Two years ago, the raw polling
data was correct, and Nate Silver analyzed it correctly, and
I said so at the time (in this forum and elsewhere), even though
a great many people analyzed it wrong, including people who should
have known better. I'm still disgusted. There's a lot more
I could say about this, but not now.

In my case, it was a required course in probability and
statistics that I took to get a master's in engineering administration
(post-PhD) that introduced me to concepts like Bayes' Theorem.

That's about typical.

I'm curious what the undergraduate and graduate schools are doing currently
about instructing physics students in probability and statistics.

FWIW, "probability and statistics" is part of the Common Core
standards, from 6th grade onwards.

This is more proof (if any more were needed) that standards
don't mean anything. Nobody teaches to the standards. They
teach to the test. And most of the tests stink.

I'm not sure I would recommend for everybody to take a full
course in probability and statistics. Most of the stuff you
find in statistics books is worthless. Much of it is grossly
out of date. Some of it is just plain wrong. Much of it is
hard to learn because it is buried in weird terminology.

I was given an education that is literally beyond what most
people can imagine. Most of it was pearls before swine, but
I do remember some of it. I saw Bayes' Theorem twice in my
freshman year, once in math class and once in physics lab.
Also I had a couple of friends taking a chemical engineering
course that involved tons of fancy curve fitting, and I helped
them with that, and there's no way of understanding that
without thinking about the probabilities.
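As an illustration of that last point (mine, not the original poster's, using made-up data): weighted least-squares curve fitting is maximum-likelihood estimation under Gaussian errors, so the weights only make sense once you think probabilistically about the measurement uncertainties.

```python
import numpy as np

# Hypothetical data: measurements y_i with per-point uncertainties sigma_i.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
sigma = np.array([0.1, 0.2, 0.1, 0.3, 0.2])

# Minimizing chi^2 = sum(((y - a - b*x) / sigma)**2) is maximum-likelihood
# fitting for Gaussian errors; the probabilistic model justifies the weights.
A = np.vstack([np.ones_like(x), x]).T   # design matrix for y = a + b*x
w = 1.0 / sigma**2                      # weight = 1 / variance
cov = np.linalg.inv(A.T @ (w[:, None] * A))  # parameter covariance matrix
a, b = cov @ A.T @ (w * y)              # best-fit intercept and slope

print(f"intercept = {a:.3f}, slope = {b:.3f}")
```

The covariance matrix `cov` is itself a probabilistic object: its diagonal gives the variances of the fitted parameters, which is what error bars on fit results mean.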

Constructive suggestion: Read chapters 13 and 14 in volume II
of Apostol's _Calculus_ (_Multi-Variable Calculus and Linear
Algebra, with Applications to Differential Equations and
Probability_).

This is not calculus for dummies. This is calculus (including
probability) for smart people, such as the folks on this list.
If you're smart and want to get smarter in a hurry, read this.
Apostol uses the modern (post-1933) definition of probability.
This stands in contrast to most other books, which are 85+
years out of date. Also: He assigns the derivation of Bayes'
theorem as a homework exercise. I kid thee not. It's a
trivial exercise, if you understand the objects that appear
in the formula. That's followed by exercises that make use
of the theorem.
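In the spirit of those exercises, and of the medical-test discussion in this thread's subject line, here is a minimal sketch (with illustrative numbers I made up) of Bayes' theorem applied to a diagnostic test:

```python
def bayes_posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem:
    P(D|+) = P(+|D) P(D) / P(+), with P(+) expanded by total probability."""
    p_pos_given_d = sensitivity              # P(+ | disease)
    p_pos_given_not_d = 1.0 - specificity    # P(+ | no disease), false positives
    p_pos = prior * p_pos_given_d + (1 - prior) * p_pos_given_not_d
    return prior * p_pos_given_d / p_pos

# Illustrative numbers: 1% prevalence, 99% sensitivity, 95% specificity.
# Even with a "99% accurate" test, a positive result means only
# about a 1-in-6 chance of disease, because false positives dominate.
print(bayes_posterior(0.01, 0.99, 0.95))  # about 0.167
```

The surprise here is exactly the point of understanding "the objects that appear in the formula": the prior matters as much as the test's accuracy.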

My point is, you don't need a full year or even a semester
course in statistics; if you spend a couple weeks on it
you will know more than 90% of the physicists out there,
and more than 99+% of the general population. The stuff
you don't learn in those weeks you can pick up later, on
an as-needed basis.

It takes a bit of effort to reconcile the fancy modern
definition of probability with a physicist's intuition,
but the investment is generously repaid. It revolutionizes
how you think about uncertainty, entropy, and various
other things.
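For reference, the "modern (post-1933) definition" mentioned above is, I take it, Kolmogorov's axiomatization: probability as a measure on a space of events. A minimal statement:

```latex
% A probability space (\Omega, \mathcal{F}, P), where P : \mathcal{F} \to [0,1]
% satisfies Kolmogorov's three axioms:
P(A) \ge 0 \quad \text{for all } A \in \mathcal{F}, \qquad
P(\Omega) = 1, \qquad
P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i)
\quad \text{for pairwise disjoint } A_i.
```

Everything else (conditional probability, Bayes' theorem, expectation values) is built on top of these three axioms, which is what makes the derivation of Bayes' theorem a short exercise.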