
Re: Measuring the speed of light was - A. Einstein and scien



Regarding where Bill Beaty wrote:
... If the
speed of light is declared to be a constant, then it becomes impossible to
measure. Any attempts to measure it are *really* measuring the length
standard and the time standard.

I certainly agree with the first statement here and also agree with
*half* of the claims in the second one. But I believe such an
ostensible measurement of c is *not* really a measurement of the time
standard--just a measurement of the distance standard. The reason is
that the definition of the time unit, the second, is based on the period
of a particular atomic hyperfine transition in a particular isotope of
Cesium. This definition does *not* refer to the value of c. Thus
any time standard would have been calibrated ultimately to this
c-independent standard. OTOH, the definition of our unit of length *is*
defined according to a particular defined value for c. Thus, if we use a
time standard that is properly calibrated--ultimately to the definition
of the second--then an experiment that ostensibly measures c is really a
measurement of the length of (or equivalently a calibration of) a
secondary distance standard, such as a meterstick or a tape measure, in
meters.
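
To make this concrete, here is a tiny Python sketch (the path length and
the tape-measure error are made-up numbers) showing that, with c fixed by
definition and a trusted clock, the 'measured' value of c differs from the
defined value exactly in proportion to the tape measure's calibration
error--i.e. such an experiment really calibrates the tape:

  C_DEFINED = 299_792_458.0       # m/s, exact by definition of the meter

  true_path_m = 75.0              # true path length in (defined) meters
  true_time_s = true_path_m / C_DEFINED   # what a properly calibrated
                                          # Cesium-referenced clock records

  # Suppose the tape measure reads 0.2% long (a hypothetical error):
  tape_reading_m = true_path_m * 1.002

  c_apparent = tape_reading_m / true_time_s
  print(c_apparent / C_DEFINED)   # -> 1.002: the 0.2% discrepancy is just
                                  # the tape-measure calibration error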

Everything I said in the previous paragraph *presupposes* that the time
standard used in the experiment is properly calibrated to the
definition of the second via an appropriate Cesium atomic clock. If the
time standard has not been reliably calibrated then it *is* possible that
the experiment *does* measure the time-standard calibration rather than
the length of the secondary distance standard--if that secondary
distance standard has been previously calibrated more accurately to the
definition of the meter than the time standard has been calibrated to
the definition of the second. For instance, the 'speed-of-light'
experiment that the students at my college do is actually of this latter
type. In this crude undergraduate experiment a 1-MHz AM-modulated HeNe
laser beam is reflected off of a distant (~35-40 m away) mirror and the
time delay in the return pulse is observed on a 2-channel o-scope that
shows both the long-path reflected signal and the signal from a split-off
part of the beam that did not make the long trip. The
resulting value obtained for c is about 3% off of the accepted value.
Since the tape measure and rulers used in measuring the laser path length
are calibrated more accurately (to the actual defined meter standard) than
the time-base of the o-scope is (to the defined atomic time standard for
the second), we legitimately
conclude that the error in the measured value of c relative to the
defined value is due to, essentially, a miscalibration of the scope's
time-base (and maybe also to the imprecision of the student observations
of the waveform shift on the scope display). In this case the experiment
can be thought of as an exercise in calibrating the scope's time-base,
because the tape-measured distances are accurate to far better than the
~3% level that is typical of the time-base calibration error. The scope's
sweep rate is determined by an RC discharge
time constant in a multivibrator circuit rather than by a quartz crystal.
(BTW, we can ignore the effect of the index of refraction of air on our
result since that correction is utterly negligible compared to the other
sources of error in the experiment.)
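
For what it's worth, here is a rough Python sketch of the numbers involved
(the exact path length and the sign and size of the time-base error are
assumptions for illustration):

  C_DEFINED = 299_792_458.0                # m/s, exact by definition

  round_trip_m = 2 * 37.0                  # tape-measured out-and-back path
  true_delay_s = round_trip_m / C_DEFINED  # about 0.25 microseconds

  # Suppose the RC-based time-base is off so that the scope reports the
  # delay as 3% longer than it really is (a hypothetical miscalibration):
  reported_delay_s = true_delay_s * 1.03

  c_measured = round_trip_m / reported_delay_s
  error = (c_measured - C_DEFINED) / C_DEFINED
  print(f"c = {c_measured:.3e} m/s ({error:+.1%} vs the defined value)")

  # Trusting the tape-measured distance instead, the same discrepancy
  # calibrates the time-base scale factor:
  timebase_scale = C_DEFINED * reported_delay_s / round_trip_m
  print(f"time-base scale factor = {timebase_scale:.3f}")   # -> 1.030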

...
Imagine that the speed of light was declared to be exactly 1 in some
units of measurement...

This requires little imagination since c *is* declared as exactly one
light-year per year. This makes the light-year and the meter 'sister'
distance standards, since both are defined in the same way, each in terms
of a different chosen value for c.
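
A quick numerical check (I am assuming the IAU's Julian year of 365.25
days here) shows the two definitions really are the same kind of thing:
the light-year is just the defined c times one year, so c comes out as
exactly 1 in those units by construction:

  C_M_PER_S = 299_792_458.0          # m/s, exact
  JULIAN_YEAR_S = 365.25 * 86_400    # s (IAU convention, assumed)

  light_year_m = C_M_PER_S * JULIAN_YEAR_S
  print(f"1 light-year = {light_year_m:.6e} m")   # ~9.460730e+15 m

  c_in_ly_per_yr = C_M_PER_S / (light_year_m / JULIAN_YEAR_S)
  print(c_in_ly_per_yr)                           # -> 1.0 exactly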

As to the second part of your comment. What would happen if someday we
detected a change in the speed of light based on an accepted definition
of time and an accepted definition of length (of course that length
would have to be something other than the meter)? Which would be wrong:
the clock, the measure of length, or the accepted speed of light?

If the speed of light has been declared to be constant, then any detected
changes MUST be in the clock or the length-standard.

Assuming the clock is a properly operating Cesium atomic clock, it means
that the measured length of the secondary length-standard *in meters* has
changed. Such a measured change in the secondary length-standard in
meters could either be due to an actual change in the length of a meter
because of a change in the speed at which EM waves propagate in a vacuum,
or be due to a change in the length of the meter that happened because of
a change in the duration of the second because of a change in the way the
spins of Cesium atoms resonate, or be due to a change in the length of
the secondary length-standard because of some sort of an 'aging' effect
or another on that secondary standard, *or* be some combination of all of
these possibilities.

I don't know if this
is such a good idea. But then, what would happen if the speed of light
were to change slightly? Might this make matter (and meter bars) change
size?

This *is* somewhat of a possibility. However, recall that the ordinary
matter in a terrestrial bar tends to be made of non-relativistic
particles. The main length scale that sets the atomic sizes and
interatomic distances in the bar is the Bohr radius which depends on the
electron's mass & charge and on Planck's constant--but not on c. This is
not to say that higher order effects would not change the length of the
bar in some weaker way. For example, since the core electrons in the
atoms of any heavy metal elements in the meter-bar tend to be more
significantly relativistic than the outer valence electrons, it is
possible that a change in c would modify the core orbital structure
enough to affect the screening seen by the outer electrons in the atoms
enough to change the overall atomic sizes and interatomic interaction
forces enough to change the mean distance between the atoms of the
meter-bar. However, I expect that the proportional change in the actual
length of the meter-bar would be less than the proportional change in
the actual length of the defined meter caused by the change in the
actual speed of light itself.
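
For the record, the SI form of the Bohr radius is
a0 = 4*pi*eps0*hbar^2/(m_e*e^2), and a quick Python check with approximate
CODATA values (a sketch, not a precision calculation) reproduces the
familiar ~0.53 angstrom with no factor of c anywhere:

  import math

  hbar = 1.054_571_817e-34    # J*s
  m_e  = 9.109_383_7e-31      # kg
  e    = 1.602_176_634e-19    # C
  eps0 = 8.854_187_8128e-12   # F/m

  a0 = 4 * math.pi * eps0 * hbar**2 / (m_e * e**2)
  print(f"Bohr radius = {a0:.4e} m")   # ~5.292e-11 m; c never appears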