
[Phys-L] arithmetic



Back in high-school geometry class, I was starting to
learn how to do proofs. The teacher remarked that proofs
are what mathematicians do for a living. He said a real
mathematician could prove one plus one equalled two.

I marvelled at that for a long time. I took the teacher
at his word, but I had no idea what the words meant. I
cut myself on both edges of a two-edged sword:
-- I couldn't see how it could be _necessary_ to prove
that one plus one equalled two, and
-- If it turned out to be necessary, I couldn't see how
it would be _possible_ to prove it.

To say the same thing more compactly: Where could you
possibly start that was so weak that it didn't already
have 1+1=2 built in, yet was so powerful that it could
(a) express and (b) prove 1+1=2?

I later learned that an answer (not the only possible answer)
can be found in the _Peano axioms_. I've found on the
web a remarkable variety of different ways of stating the
axioms; the best version I've seen so far is:
http://www.answers.com/topic/peano-axioms

A more-detailed discussion including some worked examples
at an easy-to-understand level can be found in Hofstadter
_Gödel, Escher, Bach_. The proof that 1+1=2 appears
without much fanfare on page 219. It's 7 lines long.
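As a modern aside of my own (not from GEB): in a proof assistant such as Lean, where the natural numbers are built from a Peano-style zero and successor, 1+1 unfolds by the defining equations of addition into the numeral 2, so the whole proof collapses to reflexivity:

```lean
-- 1 + 1 reduces, via the defining equations of addition on
-- successor-built naturals, to the numeral 2, so the proof
-- is just "it computes": reflexivity.
example : 1 + 1 = 2 := rfl
```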

I realize this book is not to everyone's taste, but
the people who like it reeeally like it. IMHO it
richly deserved its Pulitzer prize. Also: the book
is 25 years old, and it is curious what has changed
and what has not changed during that time ... notably
the status of Fermat's last theorem.


I mention Peano arithmetic as a constructive counter to
the claim that a mathematician
a) would define addition as "fundamental", and
b) would define multiplication as iterated addition.

I don't think either (a) or (b) is necessary or even
typical.

For one thing, surely counting with tally-marks is even
more fundamental than addition:
0 exists by fiat
1 := successor of 0 == S0
2 := successor of 1 == SS0
3 := successor of 2 == SSS0
etc.
so if you really care about what's fundamental, before
defining multiplication as iterated addition you would
define addition as iterated succession, i.e. as
iterated tally-counting.
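As a sketch (my own illustration, not from any textbook), here is what addition as iterated succession looks like, with numerals written as tally-style strings of S's applied to 0:

```python
# Numerals as tally-strings: "0", "S0", "SS0", ... i.e. iterated successors.
ZERO = "0"

def S(n):
    """Successor: prepend one more tally mark."""
    return "S" + n

def add_by_iteration(a, b):
    """Add by applying S to a, once for each S in b.

    Note the catch: "once for each S in b" means peeling tally
    marks off b one at a time -- which already presupposes a way
    of counting down, i.e. an arithmetic-like notion of iteration.
    """
    while b != ZERO:
        a = S(a)
        b = b[1:]  # remove one S from b
    return a

# 2 + 3 = 5, i.e. add_by_iteration("SS0", "SSS0") == "SSSSS0"
```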

But if you REALLY cared about what's fundamental, you
probably wouldn't do either of those things, because
it's not easy to define what is meant by "iteration"
without relying on an already-existing arithmetic system
(which would pretty much defeat the purpose, which
was to define arithmetic).

Instead, it is easy and sufficient to define addition
and multiplication by specifying how they interact
with the identity, with the successor operation, and
with each other:
(0 + a) = a for all a
S(a + b) = (a + Sb) for all a,b
(0 * a) = 0 for all a
(a * Sb) = ((a * b) + a) for all a,b
and that's all that need be said on the subject; the
commutative, associative, and distributive properties
can be derived from there.
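A sketch (my own illustration) of how those defining equations act as a recipe, using tally-string numerals "0", "S0", "SS0", ...; note that the recursion for multiplication must add a, not b, at each step (so that e.g. 1*1 comes out to 1), and that for termination the code also leans on the derivable facts a+0=a and a*0=0:

```python
ZERO = "0"

def S(n):
    """Successor: prepend one more tally mark."""
    return "S" + n

def add(a, b):
    if a == ZERO:
        return b                   # (0 + a) = a
    if b == ZERO:
        return a                   # a + 0 = a  (derivable; needed to terminate)
    return S(add(a, b[1:]))        # (a + Sb) = S(a + b)

def mul(a, b):
    if a == ZERO or b == ZERO:
        return ZERO                # (0 * a) = 0, and a * 0 = 0 (derivable)
    return add(mul(a, b[1:]), a)   # (a * Sb) = ((a * b) + a)

# mul("SS0", "SSS0") == "SSSSSS0", i.e. 2 * 3 = 6
```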

The take-home lesson I would like students to get from
this is not the exact statement of the axioms, but
rather the point that the following ideas are really
fundamental:
-- additive identity, additive inverse
-- multiplicative identity, multiplicative inverse
-- associative, distributive, commutative

As previously mentioned, in some districts these ideas
are supposed to be covered in 4th grade and reinforced
thereafter.

When they get to high school, the kids are supposed
to learn the axiomatic approach to geometry. IMHO
the geometry _per se_ is not nearly so important as
learning about axioms and proofs. If they can handle
the axiomatic approach to geometry, it wouldn't kill
them to take an axiomatic approach to arithmetic.

In contrast, the notion of multiplication as iterated
addition is vastly less satisfactory. In particular,
when you get to multiplying vectors (via the wedge
product or /gasp, choke/ the cross product) or
multiplying matrices, iterated addition won't do
the job; it will only give you the product of
vector times scalar, not vector times vector.
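A quick sanity check of that claim (my own illustration, using plain-Python 2x2 matrices): matrix multiplication has an identity and is associative and distributive, exactly as the property-based definition requires, yet it is not commutative -- and no amount of repeatedly adding a matrix to itself would ever produce it.

```python
def mat_add(A, B):
    """Entrywise sum of two square matrices."""
    n = len(A)
    return [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    """Row-by-column product of two square matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I = [[1, 0], [0, 1]]   # multiplicative identity
A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 2]]

# Identity:        mat_mul(A, I) == A
# Associative:     mat_mul(mat_mul(A, B), C) == mat_mul(A, mat_mul(B, C))
# Distributive:    mat_mul(A, mat_add(B, C))
#                      == mat_add(mat_mul(A, B), mat_mul(A, C))
# NOT commutative: mat_mul(A, B) != mat_mul(B, A)
```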

So now maybe you can see where I've been trying to
go with this: The idea that you can define multiplication
in terms of its identity, associative, distributive, etc.
properties is a keeper!

In contrast, the idea of multiplication as iterated
addition is a non-keeper.

While we're in the neighborhood: Defining division
in terms of the multiplicative inverse is a keeper.

In contrast, as previously discussed, the idea of
division as iterated subtraction, in the same sense
that multiplication is iterated addition, is a
non-keeper, and indeed a non-starter. IF (big
if!) you insist on thinking about multiplication
in terms of iterated addition, it involves the
statement
add this, _so-many_ times.
In contrast, division involves the question,
add this, _how-many_ times?
The change from imperative to interrogative is
the key idea; changing from "add" to "subtract"
completely misses the point.

In any case, it is best to stay out of the iterated
blah-blah quagmire and just define division in terms
of the multiplicative inverse.
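As a sketch of what "division via the multiplicative inverse" looks like in practice (my own illustration, using Python's standard fractions module):

```python
from fractions import Fraction

def inverse(b):
    """Multiplicative inverse: the x such that b * x == 1 (b nonzero)."""
    b = Fraction(b)
    return Fraction(b.denominator, b.numerator)

def divide(a, b):
    """Define a / b as a times the inverse of b --
    no iterated subtraction anywhere in sight."""
    return Fraction(a) * inverse(b)

# divide(3, 4) == Fraction(3, 4)
# inverse(Fraction(2, 3)) * Fraction(2, 3) == 1
```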

Bottom line: identity, inverse, associative, distributive,
commutative or anti-commutative ... these are simple
ideas with tremendous power.

============================

Pedagogical point: I'm not saying that intro-level
students need to have a sophisticated view of the
subject. I am saying that it is important for the
_teacher_ to have a sophisticated view. The reason
is, given N different ways of introducing the subject,
we want to introduce it in a way that is at least
_consistent_ with and _compatible_ with revisiting
the subject at a deeper level later.

I'm not saying the kids should learn everything at
once. I am saying that we want to minimize the
amount of _unlearning_ that has to occur.