
[Phys-L] Re: Pre/post testing



If you object to cross-posting as a way to tunnel through
interdisciplinary barriers, or have zero or less interest in pre/post
testing, please hit "DELETE." And if you respond to this long (12 kB)
post, please don't hit the reply button unless you prune the original
message normally contained in your reply down to a few lines;
otherwise you may inflict this entire post yet again on suffering
list subscribers.

I shall *not* apologize for again jumping on my overused soapbox, but
Raymond Rodrigues' (2005) provocative ASSESS post of 1 Feb 2005
titled "Re: Pre/post testing" leaves me no alternative. The blame
rests squarely with Rodrigues!

Rodrigues wrote [bracketed by lines "RRRRRRRRRR. . ."]:

RRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRR
I liken this conversation . . .[currently 15 posts on the ASSESS
thread "Pre/post testing," accessible on the Jan./Feb. archives
<http://lsv.uky.edu/archives/assess.html>]. . . to a conversation
about global warming. Can we determine that there has been a change
in the temperature of the earth as a result of a pre-measurement and
a post-measurement? Yes.

Does that tell us why and what we should do as a result? Partially,
and even then various hypotheses need to be followed through further
research. But a pre-/post-test design does not address all the
variables that might cause this warming. How many factors
contribute? How much is cyclic? What remediations or changes in our
behaviors would improve the situation?
RRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRR

Evidently, Rodrigues is either dismissive of or oblivious to my post
"Questions Active Learning" [Hake (2005)]. Therein I wrote (see that
post for references other than Hake (1998a,b; 2002a,b)):

HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH
. . . the research question (see e.g., Hake 1998a) [should be]
**"Can the use of non-traditional teaching methods increase the
effectiveness of . . . courses in promoting student understanding
well beyond that attained by traditional methods?"**

That the use of "interactive engagement" (IE) methods *can* increase
the effectiveness of introductory physics courses in promoting
student understanding well beyond that attained by "traditional" (T)
methods has been rather convincingly demonstrated by physics
education researchers . . . . They have shown [for reviews see, e.g.,
Hake (1998a,b; 2002a,b)] that, unknown to most of academia, IE
methods yield normalized pre/post test gains in conceptual
understanding about two-standard deviations above those produced by T
methods. A solution to Bloom's (1986) famous "two sigma
problem" appears to be at hand, at least for introductory physics
instruction.
HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

For the definition of "normalized gain", and operational definitions
of IE and T methods see Hake (2005).
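For readers who would rather not chase the reference, the average
normalized gain <g> of Hake (1998a) is the actual class-average
pre-to-post gain divided by the maximum possible gain. A minimal
sketch in Python (the class-average percentages below are
hypothetical, chosen only to illustrate the arithmetic):

```python
def normalized_gain(pre_pct, post_pct):
    """Average normalized gain <g> per Hake (1998a):
    (class-average post% - class-average pre%) / (100 - class-average pre%),
    i.e., the actual gain as a fraction of the maximum possible gain."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical traditional course: class average rises from 30% to 44%.
g_traditional = normalized_gain(30.0, 44.0)   # 14/70 = 0.20

# Hypothetical interactive-engagement course: 30% rises to 62%.
g_ie = normalized_gain(30.0, 62.0)            # 32/70 = 0.46 (approx.)
```

Note that <g> is insensitive to the pretest score: two courses with
different incoming populations can still be compared on the fraction
of the *available* gain that each achieves.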

Rodrigues then makes three other points:

1111111111111111111111111111111111
1. "Student learning does not occur solely in a class or a set of
classes. It occurs in residence halls, student activities, and
off-campus experiences. It results from previous experiences and
learning."

That any substantive increase in understanding of Newtonian mechanics occurs
"in residence halls, student activities, and off-campus experiences"
is contradicted by the very low average normalized gains <g> of about
0.2 for students subjected to traditional introductory physics
classes. I suspect that substantive learning in casual non-class
activities is also rare for other subjects that make higher-order
cognitive demands.

As for learning that results from "previous experiences and
learning," that is precisely what the pretest is meant to determine.


22222222222222222222222222222222222
2. "In addition, some disciplines do not lend themselves to easily
quantifiable measures."

The same claim has been made for astronomy, economics, biology,
chemistry, computer science, engineering, and physics; but pre/post
testing is now occurring in all those disciplines [Hake (2004a,b)].
IMHO, any discipline in which introductory courses are justified, at
least partially, on the grounds that they should increase student
understanding of important concepts should consider:

a. the development of valid conceptual tests, designed to be
consistently reliable, and (if possible) to be given to thousands of
students in hundreds of courses; and

b. use of those tests for the pre/post assessment of the results of,
and the need for, reform teaching methods and curricula.


33333333333333333333333333333333333
3. "Does that mean that pre/post testing is of no use? No, but it
does mean that it should not be the sole source of data, only one
piece of information, and sometimes a piece that needs to be
discarded."

No one that I know of has ever argued that pre/post testing should be
the sole source of information on course effectiveness. It's obvious
that there are many objectives of an introductory course that are NOT
directly measured by pre/post testing. In Hake (1998b, 2002b) I
listed 10 such objectives. It should also be realized that extensive
qualitative and quantitative research by disciplinary experts is
required to develop valid and consistently reliable multiple-choice
tests that can be given to thousands of students in hundreds of
courses. A model for such test development is provided by the
landmark work of Halloun & Hestenes (1985a,b), ignored throughout
most of academia, and even [until recently - Donovan & Pellegrino
(2003)] by the National Research Council.

As for discarding the results of pre/post testing, IF the test is
valid, consistently reliable, and field tested nationwide in hundreds
of courses, then why would any serious assessor discard the results?

Richard Hake, Emeritus Professor of Physics, Indiana University
24245 Hatteras Street, Woodland Hills, CA 91367
<rrhake@earthlink.net>
<http://www.physics.indiana.edu/~hake>
<http://www.physics.indiana.edu/~sdi>

"What we assess is what we value. We get what we assess, and if we
don't assess it, we won't get it."
Lauren Resnick (quoted by Grant Wiggins at
<http://www.maa.org/saum/articles/wiggins_appendix.html>)


REFERENCES
Donovan, M.S. & J. Pellegrino, eds. 2003. "Learning and Instruction:
A SERP Research Agenda," National Academies Press; online at
<http://books.nap.edu/catalog/10858.html>.

Hake, R.R. 1998a. "Interactive-engagement vs traditional methods: A
six-thousand-student survey of mechanics test data for introductory
physics courses," Am. J. Phys. 66: 64-74; online as ref. 24 at
<http://www.physics.indiana.edu/~hake>, or download directly by clicking on
<http://www.physics.indiana.edu/~sdi/ajpv3i.pdf> (84 kB). Recently,
normalized gain differences between Traditional and Interactive
engagement courses that are consistent with the work of Hake
(1998a,b) have been reported by many other physics education research
groups as referenced in Hake (2002a,b).

Hake, R.R. 1998b. "Interactive-engagement methods in introductory
mechanics courses," online as ref. 25 at
<http://www.physics.indiana.edu/~hake> or download directly by clicking on
<http://www.physics.indiana.edu/~sdi/IEM-2b.pdf> (108 kB). A crucial
companion paper to Hake (1998a): average pre/post test scores,
standard deviations, instructional methods, materials used,
institutions, and instructors for each of the survey courses of Hake
(1998a) are tabulated and referenced.

Hake, R.R. 2002a. "Lessons from the physics education reform effort,"
Ecology and Society 5(2): 28; online at
<http://www.ecologyandsociety.org/vol5/iss2/art28/>. Ecology and Society
(formerly Conservation Ecology) is a free online "peer-reviewed
journal of integrative science and fundamental policy research" with
about 11,000 subscribers in about 108 countries.

Hake, R.R. 2002b. "Assessment of Physics Teaching Methods,
Proceedings of the UNESCO-ASPEN Workshop on Active Learning in
Physics, Univ. of Peradeniya, Sri Lanka, 2-4 Dec. 2002; also online
as ref. 29 at
<http://www.physics.indiana.edu/~hake/>, or download directly by clicking on
<http://www.physics.indiana.edu/~hake/Hake-SriLanka-Assessb.pdf> (84 kB).

Hake, R.R. 2004a. "Re: Measuring Content Knowledge," online at
<http://lists.asu.edu/cgi-bin/wa?A2=ind0403&L=aera-d&T=0&O=D&P=5436>.
Post of 14 Mar 2004 16:29:47-0800 to ASSESS, Biopi-L, Chemed-L,
EvalTalk, Physhare, Phys-L, PhysLrnR, POD, and STLHE-L.

Hake, R.R. 2004b. "Re: Measuring Content Knowledge," online at
<http://listserv.nd.edu/cgi-bin/wa?A2=ind0403&L=pod&O=A&P=17167>.
Post of 15 Mar 2004 14:29:59-0800 to ASSESS, EvalTalk, Phys-L,
PhysLrnR, & POD.

Hake, R.R. 2005. "Questions Active Learning," online at
<http://listserv.nd.edu/cgi-bin/wa?A2=ind0501&L=pod&O=D&P=27821>.
Post of 29 Jan 2005 16:05:11-0800 to AERA-C, AERA-D, AERA-J, AERA-K,
AERA-L, ASSESS, EvalTalk, Math-Learn, Phys-L, & PhysLrnR.

Halloun, I. & D. Hestenes. 1985a. "The initial knowledge state of
college physics students." Am. J. Phys. 53:1043-1055; online at
<http://modeling.asu.edu/R&E/Research.html>. Contains the "Mechanics
Diagnostic" test, precursor to the widely used "Force Concept
Inventory."

Halloun, I. & D. Hestenes. 1985b. "Common sense concepts about
motion." Am. J. Phys. 53:1056-1065; online at
<http://modeling.asu.edu/R&E/Research.html>.

Rodrigues, R.J. 2005. "Re: Pre/post Testing," ASSESS post of 1 Feb 2005
10:24:52 -0500; online at
<http://lsv.uky.edu/cgi-bin/wa.exe?A2=ind0502&L=assess&T=0&F=&S=&P=259>.