
[Phys-L] Apollo 13 -- 45th anniversary -- intuition, reasoning, training, etc.



In case you missed it, a nice 2-page article:
Lee Hutchinson
"45 years after Apollo 13: Ars looks at what went wrong and why"
http://arstechnica.com/science/2015/04/apollo-13-the-mistakes-the-explosion-and-six-hours-of-live-saving-decisions/

That links to an even nicer longer article:
Stephen Cass
"Apollo 13, We Have a Solution"
http://spectrum.ieee.org/aerospace/space-flight/apollo-13-we-have-a-solution

These articles are AFAICT quite factual, consistent with
some checks I've made, including listening to some of the
original mission audio recordings.

They paint a picture of how science is actually done.
This includes long chains of multi-step reasoning,
roughly like four-dimensional chess only harder ...
not like the typical multiple-guess exam question that
gets answered in one step or not at all. Remember
what Gene Kranz said to all his controllers:
"Okay, let's everybody keep cool... Let's solve the
problem, but let's not make it any worse by guessing."


There is a proverb that says "Education is the process
of cultivating your intuition." Therefore I reckon
that ideas generally should not be labeled as intuitive
nor as counterintuitive. What's counterintuitive for
one person may be nicely intuitive for a better-trained
person.

It's kinda strange to see Modeling Instruction touted as
a "new" thing in the education business. The NASA guys
had modeling and simulation down to a fine art 50 years
ago. The astronauts got a ticker tape parade, and rightly
so ... but none of the missions would have been possible
without the simulator guys who trained the flight crews
and controllers.
-- Apollo 11: "1202 alarm ... we're go on that, Flight"
-- Apollo 12: "SCE to aux"
-- Apollo 13: pulling the LM up by its bootstraps,
without power from the CM ... and other
feats too numerous to mention.

Stuff like that does not happen by accident; it comes
from being really smart and really well trained.

I reckon intuition is "mostly" a good thing, if applied
to a fair random sample of situations. HOWEVER, one
should be extremely wary of applying intuition in non-
random situations, especially in adversarial situations
where the other guy can maneuver you into making an
intuitive but spectacularly wrong decision. A big
part of real expertise and real professionalism is
knowing when /not/ to shoot from the hip.

An excellent resource for driving home this point is
Daniel Kahneman
_Thinking, Fast and Slow_
http://us.macmillan.com/thinkingfastandslow/danielkahneman

Kahneman is a giant in the field, and he has been stomping
around long enough that not everything in the book will
seem new, but it's still important.

Also, I found it helpful to ignore the title and ignore
most of what he says about System 1 and System 2 ... that
is, to ignore most of the /explanations/ for the data.
Kahneman occasionally concedes that the explanations are
oversimplified. Still the book is an immense compendium
of useful data about illusions, delusions, and fallacies,
i.e. situations where people /predictably/ make terrible
decisions.

The book does not offer much constructive pedagogical
advice about how to train people to do better, but even
so, highlighting the nature and magnitude of the problem
is an important step in the right direction.