
[Phys-L] the logic behind some illogic

Teaching physics is
-- partly about physics, which is very logical, but also
-- partly about teaching, which is tricky, because humans
are not particularly logical creatures.

My point for today is that some of the illogical behavior
-- including some tremendously infuriating and destructive
behavior -- can be understood as a rather slight mutation
of traits that we all possess, to one degree or another.

For starters, the /teamwork/ instinct is high on the list
of traits that can be used for good or ill. Sometimes it
is good to go along with the team's plan, even if the plan
does not seem 100% logical to you. There is some higher-
order logic that says suboptimal team action is sometimes
more effective than disorganized action, but most people
don't analyze it that closely. They just know that it
"feels good" to be part of the team.
a) This is in our genes. People who could never go along
with a dubious plan died out long ago.
b) OTOH, some people express this trait to a greater
degree than others.

Teamwork is a tool, and like any tool it can be used wisely
or unwisely.

We can gain some insight if we look at a non-extreme and
non-political example. Let's start with something you see
in science class, namely significant figures. Even if you
don't teach this in your class, there is a good chance that
students have been infected with it in some previous course.

Sig-figs dogma requires rounding off to the point where
roundoff error comes to dominate other sources of error.
One major "advantage" is that everybody gets the /same/
answer. It's the wrong answer, due to excessive roundoff
error -- sometimes spectacularly wrong -- but as long as
everybody gets the /same/ wrong answer everybody feels
happy. (Everybody except for the customer who needed the
actual factual right answer.....)
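To make the roundoff point concrete, here is a minimal sketch
(in Python, with made-up numbers) of what happens when you round
every intermediate result to three significant figures, the way
sig-figs dogma would have a student do on a calculator:

```python
from math import floor, log10

def round_sig(x, n=3):
    """Round x to n significant figures (illustrative helper)."""
    if x == 0:
        return 0.0
    return round(x, -int(floor(log10(abs(x)))) + (n - 1))

# Chain ten multiplications by 1.1, rounding after every step.
exact = 1.0
rounded = 1.0
for _ in range(10):
    exact *= 1.1
    rounded = round_sig(rounded * 1.1)

# The exact result is 1.1**10, about 2.5937.  The step-by-step
# rounded result carries accumulated roundoff error well beyond
# what a single rounding at the end would introduce.
print(exact, rounded, abs(exact - rounded))
```

The mechanism is generic: each rounding injects an error of up to
half a unit in the last kept digit, and over a chain of operations
those injected errors accumulate, eventually dominating whatever
measurement error the data actually had.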

This supports the point I am making:
a) To one degree or another, /everybody/ values "fitting
in" with the crowd, even at the cost of getting lots of
incorrect answers. This is in our genes.
b) Different people exhibit this trait to different degrees.

As another famous example of the same idea:

Question: Does this sweater make me look fat?
Answer #1: No.
Answer #2: The fact that you're fat makes you look fat.
That sweater just makes you look purple.
-- BtVS season 4 ep 1.

The second answer is scientifically accurate, but not tactful.

Again: The "teamwork" instinct is a tool, and like any tool
it can be used wisely or unwisely. There are many limitations
and caveats.

One limitation is that sometimes, as in the sig-figs example,
unanimity is just completely wrong. A noisy signal is still a
signal, and you cannot throw away the noise without butchering
the signal.
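A quick simulation makes the signal-versus-noise point vivid (a
sketch; the quantity, noise level, and sample size are all made
up): averaging many noisy readings recovers precision far beyond
any single reading, and sig-figs-style rounding of the result
throws that hard-won precision away.

```python
import random

random.seed(42)  # reproducible pseudo-noise

true_value = 1.23456   # hypothetical quantity being measured
noise = 0.01           # per-reading noise: ~3 good digits per reading

readings = [true_value + random.gauss(0, noise) for _ in range(10_000)]
mean = sum(readings) / len(readings)

# The mean is good to roughly sigma/sqrt(N) ~ 1e-4, i.e. about
# five significant figures, even though each reading has only
# three.  Rounding the mean "to three sig figs" discards the
# information the averaging just extracted from the noise.
print(mean, round(mean, 2))
```

The design point: the extra digits in the mean are not "insignificant";
they are exactly where the averaged-out signal lives.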

Here's another limitation: If you are going to play follow-
the-leader, it is your strict duty to choose a good leader.
Otherwise the "teamwork" instinct can lead to large groups
of people doing unspeakably evil things.

Here's yet another limitation: Some people have a hard time
figuring out where the team boundaries should be. I am
reminded of the old Bedouin saying:
"I against my brother,
my brothers and I against my cousins,
my cousins and I against the world."

You see this in industry, where intramural back-stabbing is
not exactly unheard-of. A big part of management's job is
to get people to pull together and focus on the /external/
challenges. You see the same problem in academia, often
exacerbated by an astonishing lack of effective management.
Destructive partisanship and factionalism are nothing new;
Justinian had a problem with it (the Blues and the Greens,
and the Nika riots). George Washington's farewell address
warns at length against the spirit of party.

Again note the a/b contrast:
a) This is a non-partisan issue insofar as all parties,
to one degree or another, tend to engage in partisanship
and factionalism.
b) This becomes a partisan issue when different parties
exhibit this trait to wildly differing degrees. You
have one party that occasionally does what's best for
the country, while another party cannot think about
anything except destroying the other party, no matter
what the cost to the country as a whole.
b') We also see the same thing at a higher layer, where
one country takes action that seems (?) to further its
short-term ultra-nationalist ambitions, but puts the
whole world in jeopardy.

On 07/21/2014 07:02 PM, Bernard Cleyet mentioned the 2006
book by Chris Mooney:
_The Republican War on Science_

That's an amusing book, but it has become a bit dated. Chris
Mooney has changed his mind on a few things since 2006. He
has written another entire book since then, plus a number of
articles. I actually know the guy a little bit. He's plenty
smart. He's not just some random guy who writes opinions.
He actually does research, i.e. sociology experiments.

He would argue that the title of that book is inflammatory,
and therefore counterproductive. If you have a bunch of
people who have adopted a position for partisan reasons, if
you tell them they are wrong Wrong WRONG, they just take it
as proof that you are not on their team. It hardens their
resolve to oppose you. As a friend of mine once said:
You cannot make a turtle come out of its shell by poking.

BC also mentioned some alarming poll results. Chris Mooney
would argue that these are not as alarming as they seem, or
at least that the nature of the problem is different from
how it appears. First of all, note that there is no penalty
for lying to public-opinion pollsters. Why, then, would
anybody give the scientific answer? It makes perfect sense
to give the answer that you think advances your partisan or
sectarian goals.

In particular, recent studies show that if you ask about global
warming, or the age of the earth, or who was the worst president
since WWII, the people who give ridiculous answers are generally
/not/ stupid and /not/ even ignorant of the facts. You can
ascertain this by asking slightly different questions. The point
is, they just don't care about the facts. In such a situation,
browbeating them with the facts will not change their opinions,
not even a little bit. As the saying goes:
You can't reason someone out of a position
that they weren't reasoned into.

Mark Twain defined /faith/ as "believing something you know
ain't true." That sounds like coldly calculated lying, but
the people who engage in it don't see it that way. They say
"This is what I believe, and I'm telling you what I believe,
so it's not lying."

People are really adept at this. It's a cultural universal.
It's in our genes. It is only a rather slight exaggeration of
the eminently necessary "teamwork" trait. Science is all about
/not/ believing in stuff that ain't true, but this does not
come naturally. Most people are not very scientific.

Meanwhile, there is also quite a lot of coldly calculated
lying going on. I could cite some recent examples, including
some that have already been mentioned on this list, but I'd
rather not, in hopes of steering the discussion in a relatively
non-political direction.

We begin to see the outline of an answer to the question I asked
in my previous, ill-considered note. Why would anybody cobble
up a story that is obviously 180 degrees diametrically false
... and why would they expect to get away with it? I don't
fully understand it, but some partial answers include:
*) There is a lot of tribalism going on, tribalism of the
kind reflected in the aforementioned Bedouin saying. Faction
X can easily be persuaded to do what seems best for that
small faction, even if it brings disaster to all members
of the larger group, X included.
*) There is a lot of coldly calculated lying going on. People
lie whenever they think they can get away with it, if they
think it will give them a personal, factional, partisan,
and/or sectarian advantage.
*) The people who make up stories can get away with it
because people believe what they want to believe. The
"teamwork" instinct is powerful enough to trample the facts.
*) They can get away with it if/when we let them get away
with it.

I could write another long essay about the pros and cons of
lying. It's a powerful but clumsy and dangerous tool. But
let's not go there right now.

At this point, the question remains, what should we do
about this? The short answer is that I don't know. I've
made enough mistakes to prove that my instincts can't be
trusted. However, in hopes of starting a useful discussion,
let me offer a few conjectures and hypotheses:

First of all: We ought to recognize the limitations of the
scientific method. The age of enlightenment is over. The
ideas of Descartes, Locke, and Jefferson have mostly lost
out to the ideas of Machiavelli, Rand, and Rove. If we apply
the scientific method to the question of whether the scientific
method works, we have to conclude that it isn't very robust.
It is great for persuading your fellow scientists ... just
not for the other 99.99% of the population.

Also: Keep in mind the prevalence of tribalism and factionalism.
One way to make progress is to get people to shift their notion
of where to draw the team boundaries. Get them to consider
what's best for the larger team, not just the narrow faction.

I don't know exactly how to do this. The direct confrontational
approach generally does /not/ work. Again: You can't make
a turtle come out of its shell by poking. Sometimes you can
make progress if you can convince people that you are part of
their team, and then work from the inside.

Another suggestion: Don't give up. Just because certain
people cannot be persuaded by the evidence does not mean
they cannot be persuaded. In fact, people who exhibit herd
mentality are eminently persuadable. They are dogmatic, but
they can change their dogma at the drop of a hat. I could
cite spectacular examples from recent politics, but I'm
trying to steer clear of that, so let me cite a different

Obviously if you can directly persuade the /tribal leaders/
in such a situation, you're miles ahead.

Again: Don't give up. Register and vote. You might say
that all politicians are selfish, short-sighted, lying
bastards, and I might not disagree except to point out
that it is a matter of degree, and the degrees matter.
A lot. Some of them occasionally try to do the right
thing in policy terms.

Don't tolerate tribalism. Never vote for the guy who says
his #1 priority is to destroy the leadership of the other
party. Support the guy who gives priority to good policy
and good governance.

Demand responsible behavior from your own team. It is OK
/to some degree/ to go along to get along, but there comes
a point where you have to say hey, such-and-such is flagrantly
dishonest and bad policy, and I'm not going to be part of it.

Don't tolerate lying, cheating, or sabotage in class. In
(say) poker, bluffing is part of the game, but in science
it is not. (Expect to have a hard time explaining this to
the pre-law students.) Scientists generally don't lie to
each other, and when you do experiments, Mother Nature
does not lie to you.

OTOH, don't be naïve. Real life plays by different rules.
People lie to you all the time, if they think they can get
away with it. James Randi says that scientists are easier
to fool than children ... for the simple reason that scientists
grow too accustomed to not being lied to. Randi knows what
he is talking about, since he is an expert in the arts of
deception. In addition to being a famous skeptic, he is
an outstanding magician.

Recognize that /fear/ is a big causative factor. It tends
to promote excessive tribalism and factionalism. Some people
go through life perpetually terrified. They're afraid of
black people, afraid of brown people, afraid of poor people,
afraid of gay people, afraid of criminals, afraid of the
police, afraid of chaos ... but most of all afraid of their
own weakness, i.e. afraid that /they/ might do something wrong.
We need these people to just calm down already. Alas, shouting
at them that they need to *CALM DOWN* doesn't help. I don't
claim to be an expert on this, but on occasion I have been
called upon to get control of a situation where people were
screaming and crying, literally incapacitated by fear, even
though the situation wasn't really all that bad. Once I
told a story about a /really/ scary, off-scale painful, life-
threatening situation that I had to deal with. I compared
this to their current situation: We were stuck out in the
wilderness and we weren't going to get home on time, but
nobody was hurt, nobody was going to get hurt, we weren't
out of food, we weren't out of water, and we weren't even
out of communication. So calm down already. Stop crying
and start doing your chores, so we can make the best of the
situation.

All in all, teaching people to not be afraid is tricky.
It's about the trickiest thing I can think of. Way
trickier than quantum fluid dynamics. Gary Larson's cartoon
about Prof. Gallagher's controversial method comes to mind.

This is all the more complicated when some jerks think
it is in their interest to stir up more fear and more tribalism.

Last but not least: Teach your students to be skeptical to
an appropriate degree. Teach them to recognize when they
are being lied to and manipulated. This is one of the big
reasons why science is useful! Science is more than knowing
that F=ma and knowing the mass of the electron. There is a
logic that holds it all together, and this is super-valuable
as a defense against manipulation. I am reminded of one of
the immortal scenes from /Stand and Deliver/. Claudia is
trying to get a signature on a school permission slip:

Claudia's Mother: Boys don't like if you're too smart.

Claudia: Mom, I'm doing this so I don't have
to depend on some dumb guy the rest
of my life.

Claudia's Mother: ........

At this point you might expect some sort of dialog, some
sort of comeback, but instead we get some good writing,
directing, and acting. We get no dialog at all. Claudia's
mom just stares for half a beat, then signs the slip.

The point here is that although there is a powerful good
feeling that comes from being part of the team, there is
also a good feeling that comes from knowing you can think
for yourself, you can't easily be manipulated, and you don't
have to go through life dependent on whatever the various
untrustworthy "authority" figures are telling you.

One approach to teaching this is to assign each student to
choose something and fact-check it. Some possible choices:
*) Homeopathic medicines.
*) Copper bracelets (for arthritis therapy).
*) Magnetic bracelets (for arthritis therapy).
*) The difference between "energy drinks" and "energy bars".
*) The physics behind the "Fizz Keeper" device.
*) Pomegranate/blueberry juice containing 0.5% pomegranate
and blueberry juice ... combined!
*) Shoes that burn calories and tone and strengthen muscles,
eliminating the need to go to the gym. It has to be true,
since Kim Kardashian said so.
*) The six-step cut-and-dried "scientific method"
*) Sig figs.
*) Any of the hundreds of false statements in your physics text.
*) Check out Retraction Watch, but keep it in perspective: The
number of retractions is ultra-tiny compared to the total number
of published papers.
*) et cetera ad infinitum.............

Examples drawn from current politics provide numerous juicy
targets. They are hard to resist, but I recommend resisting
them anyway ... especially for students who are just getting
started in the fact-checking business. Like the aforementioned
Gary Larson / "Prof. Gallagher" technique, such examples are
likely to be counterproductive.