1) Is it possible to define the electric field with no
reference to forces? Is it the case that one cannot
discuss fields without referring to forces? (I know
that the electric field can be calculated from the
potential in cases where a potential can be defined.
This does not seem to involve forces, or perhaps it
does through the concept of work!)
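To make the potential route concrete, here is a small numerical sketch (the charge, distance, and step size are assumed values, purely for illustration): recovering E from the potential of a point charge by a finite difference reproduces Coulomb's law, but the potential itself is work per unit charge, so force does seem to re-enter through the back door.

```python
# Sketch: for a point charge, E = -dV/dr recovers Coulomb's law.
# All numbers below are assumed values chosen for illustration.
k = 8.9875517923e9   # Coulomb constant, N*m^2/C^2
Q = 1e-9             # source charge, C

def V(r):
    """Potential of a point charge at distance r (work per unit charge)."""
    return k * Q / r

r = 0.5              # field point, m
h = 1e-6             # finite-difference step, m

E_from_V  = -(V(r + h) - V(r - h)) / (2 * h)  # E = -dV/dr, numerically
E_coulomb = k * Q / r**2                      # Coulomb's law directly

print(E_from_V, E_coulomb)  # the two agree to numerical precision
```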
We define the electric field by making use of a
positive test charge that is small enough not to
distort the original charge distribution whose
field is being measured.
Then we go on using
this defined electric field to determine the force
on any charge placed in the field. Now, there is no
guarantee that this new "test" charge will not
distort the original charge distribution and hence
change the defined electric field strength!
Nevertheless, many textbook exercises are presented
with no discussion of this possibility.
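For reference, the "small test charge" caveat above is usually made precise as a limit, so that the test charge's own disturbance of the source distribution vanishes in the definition:

```latex
\vec{E} = \lim_{q \to 0} \frac{\vec{F}}{q}
```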
2) So why is it so important to use a "small" test
charge in defining the field if it does not matter in
the textbook applications later on? I understand that
the electric field strength is a property of the field,
not a property of a charge placed in the field.
However, how realistically is this achieved in textbook
presentations and in real measurements?
Interestingly, as my colleague pointed out, Wikipedia
defines the electric field as follows
(<http://en.wikipedia.org/wiki/Electric_field>):
Suppose one of the charges is taken to be fixed,
and the other one to be a moveable "test charge". [ ]
The electric field is defined as the proportionality constant between charge and force: F = qE.
This definition does not require that the test charge should be small; it is unnecessary because the other
charge creating the field is assumed to be "fixed".
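A quick numerical illustration of that point (again with assumed values): if the source charge is held fixed, the ratio F/q comes out the same for widely different test charges, so E = F/q is well defined without requiring q to be small.

```python
# Sketch: with the source charge held fixed, F/q is independent of q.
# All numbers are assumed values for illustration only.
k = 8.9875517923e9   # Coulomb constant, N*m^2/C^2
Q = 1e-9             # fixed source charge, C
r = 0.5              # separation, m

for q in (1e-12, 1e-9, 1e-6):   # widely different test charges
    F = k * Q * q / r**2        # Coulomb force on q
    print(q, F / q)             # F/q is the same each time: E = kQ/r^2
```

Of course, this only works because the code *assumes* the source stays fixed, which is exactly the question: what physically holds it (and its distribution) in place?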
3) Do you think that Wikipedia's definition is better
than the standard definition?
How can we make the other charge "fixed" in practice?