
Re: [Phys-L] treating force as a vector ... consistently



On 09/07/2016 06:37 PM, Moses Fayngold wrote:

At this point, I agree with Richard Tarara (as well as with Bob
Sciamanda and Philip Keller) that "...this is not the only place that
physics fudges with the math without damage to the physics".

In all the examples I can think of, playing fast and loose with
the math /does/ cause some amount of "damage to the physics."

Sometimes cutting corners is necessary as a starting point, for
pedagogical reasons or otherwise, but the starting point should
not be the ending point. We should not make a virtue of dirty
tricks. It pays to spiral back and knock the dirt out of the
dirty tricks.

I can think of several examples where physics has, at some point,
used methods that did not conform to the mathematical standards
of the day.

A conspicuous example is the Dirac delta function. Mathematicians
complained it wasn't really a function. Dirac didn't care. It
took the math community many years to come up with a theory that
explained what a delta-distribution really is, and how to use it
properly.

Another example is calculus. Calculus is math, but Newton invented
it for a reason. He invented it so he could do physics. The odd
thing is that the principal results in the _Principia_ are
proved using 2000-year-old Euclidean geometry techniques. Some
bystanders suggested that Newton didn't trust his own methods, but
there is a simpler explanation: He knew that none of the /readers/
would trust the new methods. And indeed it took the math community
quite a while to construct a firm logical foundation for calculus.

Also, mathematicians fought for 250 years about the meaning of
"probability" before Kolmogorov came up with a set of axioms that
everybody was happy with.

Mathematicians fought for 2000 years about the meaning of "infinity".

Another example is vectors. Maxwell obviously had some intuitive
idea about vectors in the mid-1860s. Over the years there were
contributions from Hamilton, Grassmann, Clifford, and others, but
it wasn't until circa 1900 that Gibbs spelled out vector analysis
in more-or-less the form we recognize today.

In all these cases, shoring up the mathematical foundations made
the physics easier and better. I dare you to read Maxwell's
original paper, doing electromagnetism without vectors.

An example that is still alive today, or at least still twitching,
concerns thermodynamics. The field has existed for about 200 years,
but for the first 100 years the math needed to do it cleanly was
not available. On the other hand, the math /has/ been available
for just over 100 years. On the third hand, almost all the current
textbooks are 100+ years out of date. Schroeder has rightly called
it a crime against the laws of mathematics.

Let's be clear: If the thermo book doesn't make sense to you,
that's a good sign. It means you're paying attention. Keep up
the good work. Constructive suggestion: If you redo it using
modern (post-1899) math, thermo is treeeemendously easier and
more powerful. All the useful results are easy to rederive ...
and the results that can't be rederived were just plain wrong
all along.

Similarly, most physics books, especially at the introductory
level, have a very peculiar, archaic approach to probability.
The modern (post-1933) approach is vastly easier and more
powerful.

Also: Trying to do special relativity without Minkowski's
spacetime mathematics would "damage" the physics quite horribly.


The math is unambiguous and uncompromising: a vector has a
direction and a magnitude, but it does not have a location.
Talking about a vector "here" or a vector "over there" does not
make sense. It is a distinction without a difference.

I think this uncompromising definition contradicts others even within
Math itself.

No, it doesn't.

Thus, it contradicts the concept of a vector field based
on the notion of a vector as a function of position. In a vector
field, to each point of space is attributed a specific vector
"sticking out" of this point.

That's not how it works. Each location in a vector field has
its own proprietary vector space. Different point, different
space. Within each vector /space/ the vector has magnitude and
direction but not location. The space as a whole has a location
within the field, but that's the answer to a different question.

The idea of a separate vector /space/ at each point in the
vector /field/ is particularly important and obvious when
you consider 2D vectors on the surface of a sphere. Each
vector space is /tangent/ to the sphere. A tangent vector
such as the velocity vector does not live in or on the
surface of the sphere; it lives in the tangent space. See
the second diagram (tangent space and tangent vector) here:
https://en.wikipedia.org/wiki/Tangent_space
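The "different point, different space" idea can be made concrete with a few lines of plain Python (a minimal sketch of my own; the helper names are not from the thread). For the unit sphere, the tangent space at a point p is just the plane of all vectors orthogonal to p, and an ambient vector can be projected into it:

```python
# Each point p on the unit sphere carries its own tangent space:
# the plane of vectors orthogonal to p. Projecting an ambient
# vector into that plane gives a legitimate tangent vector at p.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_to_tangent(v, p):
    """Project ambient 3-vector v onto the tangent space at p (|p| = 1)."""
    c = dot(v, p)
    return [vi - c * pi for vi, pi in zip(v, p)]

p = [0.0, 0.0, 1.0]      # north pole
q = [1.0, 0.0, 0.0]      # a point on the equator
v = [1.0, 2.0, 3.0]      # arbitrary ambient vector

vt = project_to_tangent(v, p)
print(dot(vt, p))   # 0.0 -- vt lives in the tangent space at p
print(dot(vt, q))   # 1.0 -- vt is NOT tangent at q; different point, different space
```

The same projected vector passes the tangency test at p and fails it at q, which is the whole point: membership in a tangent space is relative to the base point.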

Things get weird when you need to compute div, grad, and curl,
since they require subtracting vectors from different vector
spaces. Doing this properly takes a bit of work, but it can
be done.
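A small numerical illustration (again plain Python, my own construction) of why that work is needed: take a vector tangent at one point and a vector tangent at another, and their naive componentwise difference is tangent at neither point.

```python
# Naively subtracting vectors that live in tangent spaces at
# different points of the sphere yields a vector that fails the
# tangency test at both points -- hence the extra machinery
# needed to define div, grad, and curl on a curved surface.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

p = [0.0, 0.0, 1.0]   # north pole
q = [1.0, 0.0, 0.0]   # point on the equator

u = [1.0, 0.0, 0.0]   # tangent at p, since dot(u, p) == 0
w = [0.0, 0.0, 1.0]   # tangent at q, since dot(w, q) == 0

d = [ui - wi for ui, wi in zip(u, w)]   # naive "difference" of the field
print(dot(d, p))   # -1.0 -- d is not tangent at p
print(dot(d, q))   #  1.0 -- d is not tangent at q either
```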

If a vector does not have a location, then consistency
of the concept of A^B becomes debatable.

I don't see any issues worth debating. The exterior derivative
∇∧B would require a bit of work, but that's the answer to a
different question. The plain old wedge product between two
plain old vectors A∧B is child's play. Indeed the fact that
A and B can be freely relocated makes it /easier/ to construct
the parallelogram representing A∧B.
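The free-relocation point can be checked directly (a sketch of my own, with hypothetical helper names). In 2D the wedge product A∧B is the signed area of the parallelogram spanned by A and B, and computing that area from corner points shows the base point drops out entirely:

```python
# The wedge product of two free 2D vectors is the signed area of
# the parallelogram they span. Computing the area from corner
# points shows that the parallelogram's location is irrelevant:
# only the edge vectors matter.

def wedge(a, b):
    """Signed area A wedge B for 2D vectors."""
    return a[0] * b[1] - a[1] * b[0]

def area_from_corners(p0, p1, p2):
    """Signed area of the parallelogram with one corner at p0 and
    edges running to p1 and p2; the base point p0 cancels out."""
    ax, ay = p1[0] - p0[0], p1[1] - p0[1]
    bx, by = p2[0] - p0[0], p2[1] - p0[1]
    return ax * by - ay * bx

A = [3.0, 1.0]
B = [1.0, 2.0]
print(wedge(A, B))                                          # 5.0
print(area_from_corners([0.0, 0.0], [3.0, 1.0], [1.0, 2.0]))   # 5.0, drawn at the origin
print(area_from_corners([7.0, -4.0], [10.0, -3.0], [8.0, -2.0]))  # 5.0, same edges, relocated
```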