
[Phys-L] Re: Definition of electric field



Savinainen Antti wrote:

1) Is it possible to define electric field with no
reference to forces? Is it so that one cannot discuss
fields without referring to forces? (I know that the
electric field can be calculated from the potential
in cases where potential can be defined. This does not
seem to involve forces, or perhaps it does through
the concept of work!).

I'm not sure I can fully answer that, so let me offer
a possibly-helpful partial answer. I would say that
the question is
*) tricky,
*) from a pedagogical point of view, somewhat
important, since students really like to have
everything explicitly defined,
*) from a purely technical point of view, not
nearly as important as it might seem, for the
following reason: At the end of the day, the
electromagnetic field is what it is; it does
what it does. A caveman can build a fire that
radiates electromagnetic fields, without needing
to "define" the field. Definitions have to do
with how _we_ think about the field, whereas the
field exists whether we think about it or not.

If you ask for a _description_ of the field, I can
answer easily and comfortably. If you demand a
_definition_, I feel I have to be careful, like
walking on eggshells.

If we must define the field, it would make a certain
amount of sense to say it is the thing that obeys the
Maxwell equations.
A) On the first outing, you can start by studying
the fields themselves, in the absence of charges.
You can go fairly far down this road without even
mentioning charges, let alone the force on the charges ...
obtaining useful and interesting solutions that describe
radiation.

B) On the second outing, you can discuss charges and
currents as source-terms for the aforementioned
fields. But you can treat these as boundary conditions,
stipulating fixed charges and fixed currents, so that
once again forces are not part of the description.

C) On the third outing, to get a really complete
description of the fields, we ought to mention the
Lorentz force law.

So, depending on your philosophy, you could take items
(A) and (B) to be the "definition" of the fields. Or
maybe you could insist that (C) be part of the definition
also. To paraphrase Bill Clinton, it depends on how you
define "define".
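To make point (A) concrete, here is a small numerical sketch
(entirely hypothetical numbers, not from the discussion above):
in vacuum, with no charges or currents anywhere, the Maxwell
equations reduce to the wave equation, and a plane wave with the
vacuum dispersion relation w = c*k solves it. No forces are
mentioned anywhere.

```python
import math

# Source-free case: the Maxwell equations in vacuum imply
#   d^2 E / dt^2 = c^2 * d^2 E / dz^2
# and E(z, t) = E0 * sin(k*z - w*t) with w = c*k is a radiation
# solution.  We check this with centered finite differences at one
# (arbitrarily chosen) sample point.

c = 299_792_458.0        # speed of light, m/s
k = 2 * math.pi          # arbitrary wavenumber, 1/m
w = c * k                # vacuum dispersion relation
E0 = 1.0                 # arbitrary amplitude

def E(z, t):
    return E0 * math.sin(k * z - w * t)

z0, t0 = 0.3, 0.0        # arbitrary sample point
hz, ht = 1.0e-4, 1.0e-13 # finite-difference steps

d2E_dt2 = (E(z0, t0 + ht) - 2 * E(z0, t0) + E(z0, t0 - ht)) / ht**2
d2E_dz2 = (E(z0 + hz, t0) - 2 * E(z0, t0) + E(z0 - hz, t0)) / hz**2

residual = d2E_dt2 - c**2 * d2E_dz2
print(abs(residual) / abs(d2E_dt2) < 1e-4)  # relative residual is tiny
```

The point of the sketch is that the field can be studied as a
thing in itself; charges and forces never appear in the check.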

To repeat: It makes a certain amount of sense to say
that the charge plays two roles:
B) The charge acts on the field, creating the field.
C) The field acts on the charge, creating a force.

I don't see any good physics reason why (C) should be
_the_ defining property of the field. Yes, I know a lot
of high-school-level books start with (C) ... but a lot
of more-advanced books start with (A) and (B).

So I guess it comes down to a question of pedagogy, which
in turn requires adapting the approach to suit the
audience. When people talk about "definitions" I usually
suspect they are striving for a precise and formal
approach ... probably not suitable for an intro-level
audience. So if you are asking for a _definition_
suitable for an intro-level class, I'm skeptical that
any such thing exists.

We define the electric field by making use of a
positive test charge which is small enough not to
distort the original charge distribution of the
field that is measured.

Perhaps, but not necessarily; see below.

Then we continue using
this defined electric field to determine the force on
any charge placed in the field. Now, there is no
guarantee that this new "test" charge would not
distort the original charge distribution and hence
change the defined electric field strength!
Nevertheless, many textbook exercises are presented
with no discussion of this possibility.

Again this is tricky. Whether or not the size of the
test charge is relevant depends on rather technical
details of how the scenario is set up.
*) If you specify the _field_ (perhaps by saying
that in the absence of the test charge it would
have been uniform) then the size of the test
charge doesn't matter, because the field of the TC
doesn't act on the TC itself (by symmetry, if nothing
else) and the field equations are linear. Linearity
implies superposition. That is, you superpose the
self-force term (which is zero) with the force due
to the rest of the field (which is what you were
interested in all along) and it just works out. I
recommend this approach.
*) At the other extreme, if you specify the scenario
in terms of chunks of metal that partially constrain
a bunch of movable charges, then the physics is
grossly nonlinear. You can't make any sense out
of it unless you take the limit of a very small
test charge ... and even that is somewhat unphysical,
because test charges do not exist in nature, for two
reasons: (a) charge is quantized, and (b) charge is
always carried by particles that are subject to
quantum-mechanical identical-particle effects.
(Asking what is the potential of an electron in a
metal is very different from asking what is the
potential of a muon at the same location, even
though they have the same charge.) I don't think
there is any way to make this approach be fully
logical.
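The linearity/superposition point in the first bullet can be
sketched numerically (hypothetical charges and positions, chosen
only for illustration): the field at the test-charge location is
the superposition of the fields of the *other* charges, and the
test charge's own singular self-field never enters the sum, so
the size of the test charge does not matter.

```python
import math

K = 8.9875517923e9  # Coulomb constant, N m^2 / C^2

def field_at(point, charges):
    """E field (2-D, for brevity) at `point` due to (q, position) sources."""
    Ex = Ey = 0.0
    for q, (x, y) in charges:
        dx, dy = point[0] - x, point[1] - y
        r3 = math.hypot(dx, dy) ** 3
        Ex += K * q * dx / r3
        Ey += K * q * dy / r3
    return Ex, Ey

sources = [(1e-9, (0.0, 0.0)), (-2e-9, (1.0, 0.0))]  # stipulated fixed sources
p = (0.5, 0.5)                                        # test-charge location

E_both = field_at(p, sources)                 # field of both sources at once
E_sum = tuple(a + b for a, b in
              zip(field_at(p, sources[:1]), field_at(p, sources[1:])))

# Linearity implies superposition: the two answers agree.
print(all(math.isclose(a, b) for a, b in zip(E_both, E_sum)))
```

Contrast this with the chunks-of-metal scenario in the second
bullet, where the sources rearrange in response to the test
charge and no such simple sum exists.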

2) So why is it so important to use a "small" test charge
in defining the field if it does not matter in textbook
applications later on? I understand that the electric
field strength is a property of the field, not a
property of a charge placed in the field. However,
how realistically is this achieved in textbook
presentations and in real measurements?

Real measurements indicate that the electric field is
linear to a mind-boggling level of accuracy ... many
many orders of magnitude. See the introduction to
Jackson for a summary.

So if you specify the _field_ (which is how I like to
think about it), the size of the "test" charge doesn't
matter.

Interestingly, as my colleague pointed out, Wikipedia
defines electric field as
(<http://en.wikipedia.org/wiki/Electric_field>):

“Suppose one of the charges is taken to be fixed,
and the other one to be a moveable "test charge". […]
The electric field is defined as the proportionality
constant between charge and force: F = qE”.

This definition does not require that the test charge should be small; it is unnecessary because the other
charge creating the field is assumed to be "fixed".

3) Do you think that Wikipedia's definition is better
than the standard definition?

Yes, it is better.

How can we make the other charge "fixed" in practice?

Actually, "fixed" isn't the key idea. "Known" is perhaps a
better word. And even then, it can be known in a somewhat
abstract sense: call it E and then solve for a self-consistent
value of E, consistent with the specified test charge and
the specified boundary conditions.

Conceptually, this is no more difficult than the optimization
problems that are assigned in the calculus-1 class. Maybe
even less difficult.

If you want to see what this looks like in practice, check
out
http://www.av8n.com/physics/laplace.html
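As a flavor of what such a self-consistent solve looks like, here
is a minimal sketch (not the code behind the link above): Jacobi
relaxation of Laplace's equation on a grid, with the boundary
values held fixed as the stipulated boundary conditions, after
which E follows as the (negative) slope of the potential.

```python
# Jacobi relaxation: repeatedly replace each interior potential value
# with the average of its four neighbors, holding the boundary fixed.
# The grid size, boundary values, and iteration count are arbitrary
# illustrative choices.

N = 21
V = [[0.0] * N for _ in range(N)]
for j in range(N):
    V[0][j] = 1.0          # top edge held at 1 volt; other edges at 0

for _ in range(2000):      # relax toward the self-consistent solution
    new = [row[:] for row in V]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            new[i][j] = 0.25 * (V[i-1][j] + V[i+1][j]
                                + V[i][j-1] + V[i][j+1])
    V = new

# Interior values end up strictly between the boundary extremes
# (discrete maximum principle); the field is the potential's slope.
i, j = N // 2, N // 2
Ey = -(V[i+1][j] - V[i-1][j]) / 2.0   # central-difference component of E
print(0.0 < V[i][j] < 1.0)
```

Conceptually this is just "call it V, then iterate until V is
consistent with the boundary conditions", which is the same move
as solving for a self-consistent E above.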
_______________________________________________
Phys-L mailing list
Phys-L@electron.physics.buffalo.edu
https://www.physics.buffalo.edu/mailman/listinfo/phys-l