
[Physltest] [Phys-L] Will the NCLB Tend to Propagate California's Direct Science Instruction Throughout the Entire Nation? - PART 1



PART 1

If you object to cross-posting as a way to tunnel through
interdisciplinary barriers, please hit "delete" now. And if you
respond to this long (26kB) post, please don't hit the reply button
unless you prune the copy of the original message normally included
in your reply down to a few lines; otherwise you may inflict this
entire post yet again on suffering list subscribers.

The No Child Left Behind (NCLB) Act requires testing in science
starting in 2007.

WILL THE NCLB TEND TO PROPAGATE CALIFORNIA'S DIRECT SCIENCE
INSTRUCTION [Hake (2004j)] THROUGHOUT THE ENTIRE NATION?

Here are five reasons "A-E" why it might, and two reasons "F-G" why it might not.

I. IT MIGHT:

A. It's easier to test for rote-memorized material implanted by
Direct Instruction than for conceptual understanding of science and
its methods induced by guided inquiry methods.

B. The U.S. Department of Education (USDE) appears bereft of advisors
from the physical sciences. For example, as far as I am aware, no
physical scientists are members of the USDE's:

1. Advisory board
<http://www.excelgov.org/displayContent.asp?Keyword=prppcAdvisory>
for the "Coalition for Evidence-Based Policy"
<http://www.excelgov.org/displayContent.asp?Keyword=prppcHomePage>.

2. Technical Advisory Committee
<http://www.w-w-c.org/whatwedo/factsheet.pdf> (68kB) for the What
Works Clearinghouse <http://www.w-w-c.org/>.

C. Douglas Carnine, a prominent advocate of Direct Instruction [see,
e.g., Carnine (2000)], is a member of the above Technical Advisory
Committee. Carnine played a leading role in undermining effective
math instruction in California [see, e.g., Schoenfeld (2003)] and, I
suspect, is now poised to attempt the same on a national scale for
the 3 R's and for science instruction.

D. The USDE's national science "summit"
<http://www.ed.gov/news/pressreleases/2004/05/05042004a.html> in 2004
showcased a report by cognitive scientists Klahr & Nigam (2004) that
is erroneously heralded by Direct Instruction (DI) zealots [e.g.,
Mathematically Correct's Science Corner
<http://mathematicallycorrect.com/science.htm>, Wayne Bishop (2004)]
as demonstrating that DI is a GENERALLY effective method of science
instruction [for the counter see Hake (2004j)]. For discussions of
Klahr & Nigam (2004) see, e.g., Adelson (2004); Cavanagh (2004a,b);
Begley (2004a,b); Tweed (2004a,b); and Hake (2004g,h,i).

E. Sharon Begley (2004b) quotes Grover Whitehurst (director of the
U.S. Education Department's Institute of Education Sciences
<http://www.ed.gov/about/offices/list/ies/index.html?exp=0>): "IN
SCIENCE EDUCATION, THERE IS ALMOST NOTHING OF PROVEN EFFICACY."


II. IT MIGHT NOT:

F. But contrary to psychologist Whitehurst's claim, in Hake (2004j) I
wrote: [bracketed by lines "HHHHHHHHHHH. . . ."; see that article for
the references other than Shavelson & Towne (2000), Burkhardt &
Schoenfeld (2003), Doss-Hammel (2004), and Lowery (2003)]

HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH
There is a substantial amount of scientific research evidence [for
discussions of what constitutes "scientific research evidence" in
education see Shavelson & Towne (2000) and Burkhardt & Schoenfeld
(2003)] that "hands-on guided-inquiry methods" [commonly called
"inquiry" or "interactive engagement" methods] are far more effective
than "direct instruction" for promoting student learning IN
CONCEPTUALLY DIFFICULT AREAS [for reviews see, e.g., Hake (2004j);
Doss-Hammel (2004); Lowery (2003); and the literature references in
AAAS (1993, 2004), NRC (1996; 1997a,b; 1999, 2000, 2001, 2003),
Bransford et al. (1999), and Donovan et al. (1999)].

[The California Curriculum Committee (CCC)] appears to inhabit a
"private universe" [Schneps & Sadler (1985)], seemingly oblivious of
the literature of cognitive science [see, e.g., Bransford et al.
(1999)] and three decades of science-education research showing the
superiority of hands- and minds-on pedagogy to direct instruction in
conceptually difficult areas [see, e.g., Karplus (1974, 1977, 1981);
Arons (1960, 1972, 1974, 1983, 1985, 1997, 1998); Shymansky et al.
(1983, 1989, 1990); Halloun & Hestenes (1985a,b); McDermott & Redish
(1999); Hake (1998a,b; 2002a,b); Lopez & Schultz (2001); FOSS (2001);
Pelligrino et al. (2001); Crouch & Mazur (2001); Fagen et al. (2002);
Fuller (2002); Redish (2003); and Belcher (2003)].

NOTE THAT NONE OF THE ABOVE RESEARCH CONCERNS UNGUIDED "DISCOVERY
LEARNING," an evident bugaboo of CCC's Stan Metzenberg and executive
director Thomas Adams (2004). . .[and more recently Klahr & Nigam
(2004)]. . . . Still other references showing the superior
effectiveness of hands-on guided inquiry methods over direct
instruction are Bredderman (1982, 1983, 1985), Kyle et al. (1988),
Jorgenson & Vanosdall (2002), GLEF (2001), and Anderson (2002).

In addition, the eleven K-12 science-education studies listed in
Table 1 of [the meta-meta analysis of] Lipsey & Wilson (1993) (where
the test group is characterized by reform methods) yield a total N =
888 students and average effect size <d> = 0.36 [Cohen (1988)]. Most
of these studies include grades 4 or 6 to 12 with the effect size
control group being traditional direct instruction and the
measurement unit being "achievement" or "learning" (presumably as
measured by tests). Cohen's rule of thumb - based on typical results
in social science research - is that d = 0.2, 0.5, and 0.8 imply,
respectively, "small," "medium," and "large" effects; but Cohen
cautions that the adjectives "are relative, not only to each other,
but to the area of behavioral science or even more particularly to
the specific content and research method being employed in any given
investigation."

My own survey [Hake (1998a,b)] . . . [that study concerns only
advanced secondary and undergraduate courses] . . . yielded a much
larger effect size of d = 2.43 [Hake (2002a,b)], and such large
differences in the effectiveness of interactive engagement vs. direct
instruction have been corroborated by many other physics education
researchers, as discussed in Hake (2002a,b). Eight reasons for the
unusually large d are given in Hake (2004k).
HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH
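
For readers unfamiliar with effect sizes, the d values quoted above
are standardized mean differences: the treatment-group mean minus the
control-group mean, divided by the pooled standard deviation of the
two groups [Cohen (1988)]. Here is a minimal Python sketch; the
numbers are purely illustrative and are NOT data from Lipsey & Wilson
(1993) or from my survey:

  # Cohen's d: standardized mean difference between a treatment
  # (reform / interactive-engagement) group and a control
  # (traditional direct-instruction) group.
  from math import sqrt

  def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
      # pooled standard deviation of the two groups
      pooled_sd = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                       / (n_t + n_c - 2))
      return (mean_t - mean_c) / pooled_sd

  # Illustrative (made-up) numbers only: treatment mean 75, SD 12,
  # N 60; control mean 68, SD 13, N 55.
  print(round(cohens_d(75, 12, 60, 68, 13, 55), 2))  # ~0.56, a "medium" effect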

The last paragraph of the above quote concerns pre/post testing
using reasonably well-matched control groups (the traditional
courses). Such research does not meet the USDE's "gold standard" of
randomized controlled trials, but [as argued in Hake (2004k, Section
IV)] would nevertheless probably pass muster at the USDE's "What
Works Clearinghouse" <http://www.w-w-c.org/> as "quasi-experimental
studies of especially strong design" [see
<http://www.w-w-c.org/reports/standards.html>]. Despite rampant
pre/post paranoia [Hake (2000, 2004l)], pre/post assessments of
student learning are being used more and more in fields such as
astronomy, economics, biology, chemistry, computer science, and
engineering [see Hake (2004m)].
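
As a concrete example of such a pre/post measure, the average
normalized gain <g> of Hake (1998a,b) is the class-average gain
divided by the maximum possible gain. A minimal Python sketch, with
made-up class scores rather than real survey data:

  # Average normalized gain <g> = (<%post> - <%pre>) / (100 - <%pre>),
  # i.e., the class-average gain divided by the maximum possible gain
  # [Hake (1998a)].
  def normalized_gain(pre_scores, post_scores):
      # pre_scores and post_scores: lists of percentage scores for a class
      avg_pre = sum(pre_scores) / len(pre_scores)
      avg_post = sum(post_scores) / len(post_scores)
      return (avg_post - avg_pre) / (100.0 - avg_pre)

  # Illustrative (made-up) scores only:
  pre = [30, 45, 40, 35, 50]    # class pretest average = 40%
  post = [60, 75, 70, 65, 80]   # class posttest average = 70%
  print(round(normalized_gain(pre, post), 2))  # <g> = 0.5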

G. In his Education Week report "NRC to Explore Teaching and Testing
in Science," Sean Cavanagh (2004b) discusses three studies in
progress at the National Research Council (NRC), aimed at exploring
how students learn science most effectively, and how it is best
taught and tested. I hope that the NRC's expert science-education
committees can switch the USDE's Direct Instruction juggernaut onto
the guided inquiry (NOT discovery) track advised in NRC (1995, 2000,
2003). Prospects for such redirection seem at least possible now that
the NRC's expert science-education committees have FINALLY come to
their senses [Donovan & Pellegrino (2003)] and started to appreciate
the most convincing evidence for the superiority of guided inquiry
methods in promoting students' conceptual understanding:
pre/post-testing research BY DISCIPLINARY EXPERTS using valid and
consistently reliable research-based tests [see, e.g., Halloun &
Hestenes (1985a,b)] and reasonably well-matched control groups.


Richard Hake, Emeritus Professor of Physics, Indiana University
24245 Hatteras Street, Woodland Hills, CA 91367
<rrhake@earthlink.net>
<http://www.physics.indiana.edu/~hake>
<http://www.physics.indiana.edu/~sdi>

REFERENCES are in PART 2
_______________________________________________
Phys-L mailing list
Phys-L@electron.physics.buffalo.edu
https://www.physics.buffalo.edu/mailman/listinfo/phys-l