-----Original Message-----
From: Forum for Physics Educators
[mailto:PHYS-L@list1.ucc.nau.edu] On Behalf Of Rick Tarara
Sent: Friday, February 11, 2005 11:31 AM
But how do you even do a meta-study? In the case of the
response systems--do you go out and pick ten instructors at
random working with a range of courses and say "OK, next year
you must use this tool. Keep these records, and then we'll
come in and test your students." Well of course not. You
can collect data from people using method A,B, or C but then
you are back to the 'enthusiast' bias.
In the end, I think this is just a very tough area to apply
'scientific' testing to. We can't do double blind
experiments in most cases. No placebos to put out there. No
really 'neutral' presenters. Way, way too many variables.
Teaching is still an art form. As Joel said, use your common
sense. If a tool or technique makes sense to you, and if you
are willing to put in the time and effort to learn how to
use it, it is worth trying [assuming you are unsatisfied
with what you are currently doing]. This particular example
may be on the fringe of that advice, however, because of the
cost involved, so I can understand
why someone looking at it would want as much information as
possible. However, 'research' here is unlikely to be very
meaningful, so in the end, I'd look to personal testimonials
of people you trust.
Rick