
Re: DATA on collapsing WTC



Ludwik Kowalski and I have discussed through private emails our approaches
to calculating the acceleration of the North Tower of the World Trade
Center (WTC) and determining whether the tower was in free-fall during its
collapse. Ludwik concludes that the tower was not in free-fall, finding an
acceleration of magnitude approximately 0.3g - 0.7g. I conclude that the
tower was in free-fall, with an acceleration of 0.95g - 1.05g. We are
unable to resolve this discrepancy, but we have agreed to present this
matter to the Phys-L members for discussion and possible resolution.

First let me summarize the facts and issues. (Ludwik, please correct any
misstatements of your positions or findings.) I posted a list of position
(x- and y-coordinates) versus time for the collapse of the North Tower of
the WTC to the Phys-L list. I also posted the data to my website
(www.stchas.edu/faculty/gcarlson/physics/wtc). I compiled this data by
analyzing a CNN video clip of the collapse using DataPoint, a video
analysis program I've written
(www.stchas.edu/faculty/gcarlson/physics/datapoint).

The position v. time data for the collapse spans approximately 4 seconds
and over 60 m of vertical displacement. At a video frame rate of 30
frames/s this amounts to approximately 120 datapoints.

Ludwik's approach and findings--

Ludwik groups the data during the collapse into small contiguous sets of
9-15 datapoints and calculates an average time and vertical position for
each set. This reduces the data from approximately 120 datapoints to 8-13
datapoints. From
this reduced dataset average velocities and average accelerations over each
time interval were calculated using the standard equations: vavg =
delta-x/delta-t and aavg = delta-v/delta-t. The 8-13 average acceleration
values were themselves averaged and that average was reported as the
acceleration of the collapsing tower.
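The grouped-averaging procedure described above might be sketched as
follows. Note this is my reconstruction, run on synthetic
constant-acceleration data with added noise, not on the actual DataPoint
measurements; the group size of 12 is one choice within the 9-15 range
mentioned.

```python
import numpy as np

# Synthetic stand-in for the video data: ~120 frames at 30 frames/s of
# free fall (a = 9.8 m/s^2) from rest, with a little measurement noise.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1/30)                          # times (s)
y = 0.5 * 9.8 * t**2 + rng.normal(0, 0.5, t.size)  # displacement (m)

def grouped_acceleration(t, y, group=12):
    """Average t and y over contiguous groups of `group` points, then
    finite-difference twice: vavg = delta-y/delta-t, aavg = delta-v/delta-t."""
    n = t.size // group
    tg = t[:n*group].reshape(n, group).mean(axis=1)
    yg = y[:n*group].reshape(n, group).mean(axis=1)
    v = np.diff(yg) / np.diff(tg)          # average velocities
    tv = 0.5 * (tg[:-1] + tg[1:])          # midpoint times for velocities
    a = np.diff(v) / np.diff(tv)           # average accelerations
    return a

a = grouped_acceleration(t, y)
# Individual aavg values scatter widely even though the underlying
# acceleration is exactly constant; their mean lands near 9.8 m/s^2.
print(a, a.mean())
```

Shifting the grouping by one frame (e.g., slicing `t[1:]`, `y[1:]`)
changes the individual aavg values noticeably, which illustrates the
sensitivity discussed below.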

As one might suspect with this approach, Ludwik found a large variation in
the calculated acceleration, depending on the number of datapoints in a
set and the location of that set among all the data. E.g., for one subset
of the data, Ludwik calculated average accelerations of 3.28, 5.59, 2.49,
and 5.51 m/s^2. When he shifted the sets of data by one frame, he
calculated average accelerations of 10.7, 8.23, 2.7, and -2.9 m/s^2. The
average of the first four values equals 4.2 m/s^2; the average of the
second four values equals 4.7 m/s^2. These results are consistent with
his other attempts, and he concludes that the tower was not in free-fall.

Carlson's approach --

Taking the data during the collapse, I calculated the time since the
beginning of the collapse and the magnitude of the tower's
displacement. Using all of the approximately 120 datapoints, I calculated
a best fit power law of the form y=at^n, where y is the magnitude of the
displacement and t is the time since the beginning of the collapse. I
selected the power law as the fitting equation because we expect the
displacement as a function of time for a body with constant acceleration,
starting from rest at the origin, to be y=0.5*a*t^2. The coefficient thus
equals half the acceleration; small deviations of the exponent from 2
indicate the acceleration is nearly constant.
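One way to carry out such a power-law fit is a linear least-squares fit in
log-log space, since log y = log a + n log t. This sketch is my own
reconstruction on ideal synthetic free-fall data, not the fitting routine
actually used:

```python
import numpy as np

# Ideal free fall from rest: y = 0.5 * 9.8 * t^2. Skip t = 0, where the
# logarithm is undefined.
t = np.arange(1/30, 4, 1/30)
y = 0.5 * 9.8 * t**2

# Fit log y = n * log t + log a; np.polyfit returns [slope, intercept].
n, log_a = np.polyfit(np.log(t), np.log(y), 1)
a_coeff = np.exp(log_a)

# For this noise-free data the fit recovers the exponent 2 and the
# coefficient 4.9, giving an acceleration of 2 * 4.9 = 9.8 m/s^2.
print(n, a_coeff)
```

With real, noisy data the recovered exponent and coefficient deviate
somewhat, as in the fits reported below.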

For the entire collapse, I calculate a best fit curve of y=5.1t^1.8
(R^2=0.97). For the first second of the collapse, I calculate a best fit
curve of y = 3.1 t^0.9 (R^2=0.59); I conclude the tower was not in
free-fall for the first second. After t=1s, I calculate a best fit curve
of y=4.7t^1.9 (R^2=0.99); I conclude the tower was in free-fall with an
acceleration of a=2*4.7=9.4 m/s^2, which is approximately g=9.8 m/s^2. A
graph of these results can be viewed on my website
(www.stchas.edu/faculty/gcarlson/physics/wtc).

Why I think my approach is better --

Ludwik and I agree that calculating the average velocity and acceleration
between datapoints using the formulae vavg=delta-x/delta-t and
aavg=delta-v/delta-t will result in wildly varying values of
acceleration. Using Ludwik's approach to analyze all the data, I calculate
average accelerations ranging from -1831 m/s^2 to 1835 m/s^2 with an
average of 2.9 m/s^2. However, if I exclude the first calculated aavg from
the final average, a=13.2 m/s^2; if I exclude the first and second
calculated aavg, a=8.1 m/s^2. Ludwik tries to mitigate the magnitude
of the variations in average accelerations by grouping the data so that the
delta-t's are larger, but as his results show this is not completely
successful.

I argue the wildly varying values for the acceleration and the sensitivity
of the final average acceleration calculated from those values cast doubt
on the validity of the final result.

Thus, there are two advantages to fitting the data to a power curve:

1) We avoid calculating unrealistic, wildly varying velocities and
accelerations, because the calculation of the best fit power curve does not
depend on delta-t. Thus, we have more confidence in the calculated
acceleration.

2) A power law allows us to verify directly and quantitatively that the
exponent is 2 (i.e., that the acceleration is constant), instead of
assuming the exponent is 2, as is usually done in undergraduate physics
courses, or making indirect, qualitative judgements (e.g., R^2 for a best
fit velocity v. time line).

(If, instead of calculating the exponent, I assume the exponent on time is
2 and determine the best fit line of y v. t^2, I calculate a line of slope
4.1 m/s^2, which results in an acceleration of 8.2 m/s^2, which is
approximately 9.8 m/s^2.)

I look forward to comments from the group.

Thanks.

Glenn

------------
Glenn A. Carlson, P.E.
St. Charles Community College
St. Peters, MO USA
www.stchas.edu/faculty/gcarlson/physics
PGP Fingerprint E88D 2AB8 C5A8 D231 06B9 1597 3C72 5CC2 7D87 5519