Put your question in the context of applicants for positions on

Jack Uretsky <jlu@hep.anl.gov> 11/27/09 10:21 PM >>>
Is anyone else bothered by the policy that reinforces the "smart" vs. "dumb" kids' scores? I can imagine that there are valid questions that otherwise higher-scoring students miss more frequently than otherwise lower-scoring students. If such questions exist, then systematically rejecting them reduces the validity of the entire test.
_______________________________________________
Daniel Crowe
Loudoun County Public Schools
Academy of Science
dan.crowe@loudoun.k12.va.us
Posted on PTSOS by Dean Baird:

Bernard Cleyet <bernardcleyet@redshift.com> 11/26/09 11:35 PM >>>
The process by which an item makes it on to the CST is ... involved.
Add a few more steps for those that get released.
Items are developed by ETS and screened by the California Department
of Education and by its Assessment Review Panel. If that all goes
well, the item is field-tested. The item must perform well on a
series of psychometric measures. Most importantly, the item must be
neither too easy nor too hard, based on student performance on the
item. And the item must discriminate well. That is, students who
perform well on the test overall should perform well on the item. And
students who don't perform well on the test overall should perform
poorly on the item. There are always some items that "smart" kids get
wrong and "dumb" kids get right. Those items are rejected.
<snip>
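The two screening statistics Dean describes can be sketched in a few lines. This is only an illustration of the general idea, not ETS's actual procedure; the function name, the 27% upper/lower split, and the toy data are all illustrative assumptions:

```python
# Illustrative sketch (NOT ETS's actual method) of the two item
# statistics described above: difficulty and discrimination.
def item_stats(responses, item):
    """responses: list of per-student 0/1 item-score lists; item: column index.

    Returns (difficulty, discrimination):
      difficulty      - proportion of students answering the item correctly
      discrimination  - upper-lower index: fraction correct among the top 27%
                        of total scorers minus the fraction among the bottom 27%
    """
    n = len(responses)
    # Difficulty: an item that is "too easy" has p near 1, "too hard" near 0.
    p = sum(r[item] for r in responses) / n
    # Rank students by total test score, then compare the extremes on this item.
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, round(0.27 * n))  # 27% split is a common convention, assumed here
    upper = sum(r[item] for r in ranked[:k]) / k
    lower = sum(r[item] for r in ranked[-k:]) / k
    return p, upper - lower
```

On this view, an item that "smart" kids get wrong and "dumb" kids get right shows up as a negative discrimination index, which is exactly the kind of item the screening process rejects.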
_______________________________________________
Forum for Physics Educators
Phys-l@carnot.physics.buffalo.edu
https://carnot.physics.buffalo.edu/mailman/listinfo/phys-l