
[Phys-L] Re: Research on Student Response Systems



I'd like to address the situation raised by Rick and Joe. The issue of
finding research funding is a very real difficulty. Back in the
mid-1980s, when I was developing the first video-based lab software
(called VideoGraph, for those of you with good memories), I had to hide
my research in an instructional materials development grant. I was
actually interested in taking advantage of the human eye's ability to
perceive motion (which mostly happens in the retina, not the brain) and
somehow using that for teaching physics. Since kinematics is all about
motion, that seemed like the topic to use. The reviewers of my
grant proposal complained that it "seemed too much like a research
project." It was, of course, but luckily I was eventually able to pass
it off as a project to create instructional materials which happened to
have a VERY strong evaluation component. I know most other PER-based
curriculum developers have had to jump through similar hoops. It's
getting better, but we aren't there yet. Luckily, folks like Duncan
McBride have been very sympathetic to the whole PER enterprise and are
demanding thorough assessment of learning when materials are developed
with NSF funds. This automatically results in "the testing being done
by the inventors/advocates of the method," mentioned by Rick.

If someone knows of a way to get an INDEPENDENT testing center going,
I'd be happy to listen. I've tried three or four different times to
get funding for a center that would develop a set of carefully crafted
and evaluated conceptual understanding assessment instruments. Those
who know me realize there had to be an acronym. It was called "MAP:
Multiple Assessments Project" and the goal was to "map out" what
students knew about various physics topics. No luck so far, although
an undergraduate and I are updating TUG-K(1) and I'm helping another
student work on a geometric optics instrument. Back in 1995 a group of
us asked(2) NSF's Physics Division to consider educational research
proposals. As far as I know, only one(3) proposal has been funded so
far.

By the way, two well-known "meta-studies" done by non-developers that
immediately come to mind are Dick Hake's 6000 student comparison(4) of
passive and active engagement methods and Jeff Saul's dissertation(5),
where he looked at secondary implementations of Dickinson College's
Workshop Physics, the Washington Tutorials, and Minnesota's Group
Solving and Problem Solving Labs.

Bob Beichner
NCSU Physics

1. Beichner, R. J. (1994). "Testing student interpretation of
kinematics graphs." American Journal of Physics 62(8): 750-762.
Available at <http://www.ncsu.edu/per/Articles/TUGKArticle.pdf>,
accessed 2/12/05.

2. Beichner, R., Hake, R., McDermott, L., Mestre, J., Redish, E., Reif,
F., and Risley, J. (1995). Support of Physics Education Research as a
Subfield of Physics: Proposal to the NSF Physics Division. Available at
<http://www.ncsu.edu/per/Articles/NSF%20White%20Paper.pdf>, accessed
2/12/05.

3. Meltzer, D. and Thompson, J. (2004). "Collaborative Research:
Research on the Learning and Teaching of Thermal Physics." Available at
<https://www.fastlane.nsf.gov/servlet/showaward?award=0406724>,
accessed 2/12/05.

4. Hake, R. (1998). "Interactive-engagement versus traditional methods:
A six thousand-student survey of mechanics test data for introductory
physics courses." American Journal of Physics 66(1): 64-74. Available
at <http://www.physics.indiana.edu/~sdi/ajpv3i.pdf>, accessed 2/12/05.

5. Saul, J. M. (1998). Beyond problem solving: Evaluating introductory
physics courses through the hidden curriculum. College Park, MD,
University of Maryland: 596 pages. Available at
<http://www.physics.umd.edu/rgroups/ripe/perg/dissertations/Saul/>,
accessed 2/12/05.



Date: Fri, 11 Feb 2005 10:14:53 -0500
From: Rick Tarara <rtarara@SAINTMARYS.EDU>
Subject: Re: Research on Student Response Systems

The problem here is one that faces almost all 'research' on educational
methods--and was alluded to in an earlier post. There are no
INDEPENDENT testing centers for any of these methods. The testing is
done by the inventors/advocates of the method. As such, they bring an
enthusiasm and dedication to the method that will not be duplicated in
general use. Hence, almost all published tests of new techniques are
positive.

With this one, the problem really is the
infrastructure/upkeep/record-keeping that goes along with effective
use. When new and novel, an instructor will put in the time and
effort. What about years two and three?

Rick

------------------------------

Date: Fri, 11 Feb 2005 10:45:58 -0600
From: Joseph Bellina <jbellina@SAINTMARYS.EDU>
Subject: Re: Research on Student Response Systems

I note that Lesley (or Leslie, I don't recall which) University, in
Boston I believe, seems to be a popular reviewer of educational
systems. I know that the Education Development Center in Newton has an
NSF grant to evaluate the effectiveness of standards-based science kits
in elementary classrooms. They have also assessed how well school
districts have sustained curricular change.

So it seems to me that there are examples of people doing meta-studies.
The problem seems to be finding funders who think enough of the issue
to pay someone to do the study.

cheers,

joe