
Re: What Does the FCI Tell Us?



Please excuse this cross-posting to discussion lists with archives at:

Phys-L <http://mailgate.nau.edu/archives/phys-l.html>,
PhysLrnR <http://listserv.boisestate.edu/archives/physlrnr.html>,
Physhare <http://lists.psu.edu/archives/physhare.html>,
AP Physics discussion list
<http://www.collegeboard.org/ap/listserv/tech.html>
(no easily searchable archives)

In his 5/5/01 Physhare post "Re: What Does the FCI Tell Us?" Donald
Simanek (DS) responded to my 5/1/01 and 5/2/01
Phys-L/PhysLrnR/Physhare/AP Physics posts of the same title, which
were, in turn, responses to his 4/25/01 Physhare post "Re: AP Physics
Students." DS's comments (DS), followed by my responses (H), are
given below:

11111111111111111111111111111111111111111
DS1. "Even if these studies . . .(ref. 1-3). . . do indicate
statistically that certain approaches do better on the selective
testing methods used, I am no fan of *any* of the evaluation
instruments (tests) amenable to statistical tallying--the more
conventional ones or the newer ones specifically designed to show
that the innovative approach the investigator is using is obviously
superior to the old methods."

H1. What "conventional tests" and what "newer tests" is DS referring
to? If he regards the Force Concept Inventory (FCI) as one of "the
newer tests" then he evidently believes that the FCI was
"specifically designed to show that the innovative approach the
investigator is using is obviously superior to the old methods." The
history of the FCI(4) and its precursor the Mechanics Diagnostic
test(5) does not support DS's dismissive assertion.

2222222222222222222222222222222222222222222
DS2. " . . . .the word MARGINALLY . . . .(in DS's 4/25/01 post). . .
meant that the fci instrument alone enlightens us only marginally
about how to effect improvement in physics instruction across the
board. The improvement in scores under a certain instructional system
may be better than marginal, but our enlightenment as to why this
happens still seems to me only marginal."

H2. I disagree. The last sentence of the abstract of ref. 1 reads:

"The conceptual and problem-solving test . . .(Mechanics Baseline
(ref. 6). . . results strongly suggest that the classroom use of
Interactive Engagement (IE) methods can increase mechanics-course
effectiveness well beyond that obtained in traditional practice."

In ref. 1, "IE methods," "IE courses," and "traditional courses" are
OPERATIONALLY defined as:

"a. Interactive Engagement (IE) methods as those designed at least in
part to promote conceptual understanding through interactive
engagement of students in heads-on (always) and hands-on (usually)
activities which yield immediate feedback through discussion with
peers and/or instructors, all as judged by their literature
descriptions;

b. IE courses as those reported by instructors to make substantial
use of IE methods;

c. Traditional (T) courses as those reported by instructors to make
little or no use of IE methods, relying primarily on passive-student
lectures, recipe labs, and algorithmic-problem exams."

As for effecting "improvement in physics instruction across the
board" see H4 below.

333333333333333333333333333333333333333333333333333

DS3. "My reservation is that even if certain instructional
approaches really are superior for the subject matter within the very
limited scope of the fci, this fact still doesn't tell us *why* these
approaches are better."

H3. Again, I disagree for the reason indicated in H2 above.


4444444444444444444444444444444444444444444444444444
DS4. "And it doesn't tell us how to extend the idea to other subject
matter and whether we can expect this success to extrapolate to other
subject matter,
say, for example, chemistry or biology, astronomy, nuclear physics,
relativity, or quantum mechanics. Especially it doesn't help me to
imagine how better to teach a subject where the concepts are not
directly accessible to the senses, and only indirectly related to
simple experiments one can perform. Have we any notion how to devise
an 'energy concept inventory' or a 'photon concept inventory' or a
'field concept inventory'? I don't think so."

H4. Again, I disagree for the reason indicated in H2 above. As
indicated in Lesson #1 of ref. 3:

"L1. The use of IE strategies can increase the effectiveness of
conceptually difficult courses well beyond that obtained with
traditional methods.
Education research in biology (Hake 1999b,c), chemistry (Herron &
Nurrenbern 1999), and engineering (Felder et al. 2000a,b), although
neither as extensive nor as systematic as that in physics (McDermott
& Redish 1999, Redish 1999), is consistent with the latter in
suggesting that in conceptually difficult areas, interactive
engagement methods are more effective than traditional
passive-student methods in enhancing students' understanding. I see
no reason to doubt that such is not also the case in other science
and even non-science areas."

As regards instruction in quantum mechanics, check out "A New Model
Course in Applied Quantum Physics," E.F. Redish, R.N. Steinberg, M.C.
Wittmann; online at
<http://www.physics.umd.edu/perg/qm/qmcourse/welcome.htm>.
It includes information on how students learn quantum mechanics.

The construction of an "Energy Concept Inventory" is now underway in
at least one physics education research group.

555555555555555555555555555555555555555555555
DS5. "And then there's the nagging question which faces any new approach. Even
if it works well with teachers who are sufficiently motivated to undertake
training in this new approach, and enthusiastic enough about it to use it
in their classes, and in school systems with institutional support of
the approach, can (and will) it be implemented successfully on a larger
scale with a larger population of teachers?"

H5. Data showing the relative success (as measured by normalized
gains on the FCI) of some IE methods (viz., University of Washington
"Tutorials" and Dickinson College "Workshop Physics") on "a larger
scale with a larger population of teachers" outside the originating
institutions are reported in refs. 7 & 8.
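For readers unfamiliar with the measure: the average normalized gain of
ref. 1 is the ratio of the actual class-average gain to the maximum
possible gain, <g> = (<post> - <pre>) / (100 - <pre>), with scores in
percent. A minimal sketch of the computation (the class averages used
below are hypothetical, chosen only for illustration):

```python
def normalized_gain(pre_mean, post_mean, max_score=100.0):
    """Average normalized gain <g> = (post - pre) / (max - pre),
    computed from class-average pre- and post-test scores (percent)."""
    if pre_mean >= max_score:
        raise ValueError("pre-test mean must be below the maximum score")
    return (post_mean - pre_mean) / (max_score - pre_mean)

# Hypothetical class averages (percent), for illustration only:
g_trad = normalized_gain(45.0, 58.0)  # a traditional-course-like gain
g_ie = normalized_gain(45.0, 75.0)    # an IE-course-like gain
```

Because <g> normalizes by the room left for improvement, it allows
comparison of courses with very different pre-test averages.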

Richard Hake, Emeritus Professor of Physics, Indiana University
24245 Hatteras Street, Woodland Hills, CA 91367
<rrhake@earthlink.net>
<http://www.physics.indiana.edu/~hake>

REFERENCES
1. R.R. Hake, "Interactive-engagement vs traditional methods: A
six-thousand-student survey of mechanics test data for introductory
physics courses," Am. J. Phys. 66, 64-74 (1998); on the Web at
<http://www.physics.indiana.edu/~sdi/>.

2. R.R. Hake, "Interactive-engagement methods in introductory
mechanics courses," on the Web at
<http://www.physics.indiana.edu/~sdi/> and
submitted on 6/19/98 to the "Physics Education Research Supplement to
AJP" (PERS).

3. R.R. Hake, "Lessons from the Physics Education Reform Effort,"
submitted on 3/28/01 to "Conservation Ecology"
<http://www.consecol.org/Journal/>, a "peer-reviewed journal of
integrative science and fundamental policy research." On the web as
ref. 10 at <http://www.physics.indiana.edu/~hake>
[ConEc-Hake-O32601a.pdf, 3/26/01, 172K] (179 references, 98
hot-linked URLs). Gives references to articles by physics
education research groups whose FCI normalized gain results for
interactive-engagement and traditional courses are consistent with
those of (refs. 1 & 2). These groups are at: Univ. of Maryland
(Redish et al. 1997, Saul 1998, Redish & Steinberg 1999, Redish
1999); Univ. of Montana (Francis et al. 1998); Rensselaer and Tufts
(Cummings et al. 1999); North Carolina State Univ. (Beichner et al.
1999); and Hogskolan Dalarna, Sweden (Bernhard 1999). The consistency
of pre/post test results calls into serious question the common dour
appraisals of pre/post test designs [see, e.g., Cook & Campbell
(1979), Cronbach & Furby (1970)].

4. (a) D. Hestenes, M. Wells, and G. Swackhamer, "Force Concept
Inventory," Phys. Teach. 30, 141-158 (1992). (b) I. Halloun, R.R.
Hake, E.P. Mosca, and D. Hestenes, Force Concept Inventory (Revised,
1995); password protected at
<http://modeling.la.asu.edu/modeling.html>.

5. (a) I. Halloun and D. Hestenes, "The initial knowledge state of
college physics students," Am. J. Phys. 53, 1043-1055 (1985); (b)
"Common sense concepts about motion," ibid. 53, 1056-1065 (1985).

6. D. Hestenes and M. Wells, "A Mechanics Baseline Test," Phys.
Teach. 30, 159-166 (1992); password protected at
<http://modeling.la.asu.edu/modeling.html>.

7. E.F. Redish & R.N. Steinberg, "Teaching physics: figuring out
what works." Phys. Today 52(1), 24-30 (1999); online at
<http://www.physics.umd.edu/rgroups/ripe/perg/cpt.html>.

8. J.M. Saul, "Beyond problem solving: evaluating introductory
physics courses through the hidden curriculum," Ph.D. thesis, Univ.
of Maryland, 1998.