
Re: [Phys-L] science education articles



Daniel

Thanks. I work with upwards of 1000 students per term. However, I
believe this reading log could be adapted into something that is
entered and graded electronically.

What guidance do you provide students on 'how to read a textbook'?

Thanks,
Dr. Roy Jensen
(==========)-----------------------------------------¤
Lecturer, Chemistry
W5-19, University of Alberta
780.248.1808




On Thu, 24 Sep 2015 13:39:35 -0400, you wrote:

I have noted similar issues with our freshmen (bright, semi-articulate, highly dependent on others, and very good at externalizing accountability for their own learning). One strategy we use to address this is to require freshmen to read the course text and write a minimally reflective reaction to each chapter via a two-sided, one-page reading log <http://physicsed.buffalostate.edu/danmac/ReflectiveWriting/ReadingLogV7.doc>,
with each log turned in weekly for fast, cursory grading. The 10-12 logs are worth about 10% of the final grade (about one letter grade), so the student's grade is visibly and solidly linked to this expectation of diligent performance. We do this in courses where students are expected to start developing strategies for making sense of technical writing, although the form does not actually require extended sense-making; that we work on through in-class activities.

A suggestion I have for real-time assessment of student understanding is whiteboard use. I'm long past the point where whiteboards are in any way invasive or time-absorbing in my class, because I use them for so many other purposes, but monitoring and guiding student thinking is certainly important to me. There are many videos, URLs, and blogs detailing whiteboard use, at least in high-school and introductory physics. I know whiteboards are used widely in high-school modeling chemistry, but I am unfamiliar with URLs supporting that.

Dan M

Dan MacIsaac, Associate Professor of Physics, SUNY-Buffalo State College
462SciBldg BSC, 1300 Elmwood Ave, Buffalo NY 14222 USA 1-716-878-3802
<macisadl@buffalostate.edu> <http://PhysicsEd.BuffaloState.edu>
Physics Graduate Coordinator & NSF Investigator for ISEP (MSP) and Noyce

On Sep 23, 2015, at 10:09 PM, rjensen@ualberta.ca wrote:

Active instructional strategies only work if the students are *not*
passive learners, and work better the higher the level of the learner.
This is exactly what I address in ComSci-3.pdf
www.RoguePublishing.ca/ComSci-3.pdf

As you might have guessed, I experimented with flipped instruction. I
flipped about 30% of the class material -- material students were
already exposed to in high school -- so as not to overwhelm them and
to maintain a diversity of instructional strategies. I teach at a
competitive institution. I created instructional videos, gave a quiz
on the flipped material before class, asked students what they found
confusing in the flipped material, gave a brief (10 min) summary of
the material at the beginning of class (focusing on where students
struggled on the quiz and what they said they found confusing), and
then continued with instructor-led and student-led real-world
problems, demonstrations, and current-events discussions. By the end
of the term, around 85% of the students were completing the
flipped-material quizzes (each worth ~1.5% of their total grade).
Those who were completing them were scoring close to 100%.

The most profound observation I made based on the end-of-term
comments: First-year is a big change from high school. At their level
of development, students do not take personal responsibility and look
for any reason to blame others for their poor performance. Since I was
the only one engaged in active instruction and they had never seen it
before, I was the reason they did poorly in chemistry.

Stupidly, I want to try this again. There are things I could do
better. I also want to work with my colleagues to assess the relative
increase in student understanding between sections of flipped and
traditional instruction. What non-invasive, non-time-consuming
strategies exist to assess understanding? I am familiar with pre/post
testing. Are there others?

Thanks,
Dr. Roy Jensen
(==========)-----------------------------------------¤
Lecturer, Chemistry
W5-19, University of Alberta
780.248.1808



On Wed, 23 Sep 2015 16:07:51 -0500, you wrote:

The flipped classroom has an obvious problem. The students have to view the
lectures before they come to class, and they have been accustomed to being
spoon-fed the lectures. Then the teacher has to get the students to do work
in class, but here the teacher has to be the guide on the side, something
teachers are unaccustomed to doing and usually do not know how to do well.

Why would the students bother to view the lectures before class? If
they won't read the book beforehand, why would they view a lecture
beforehand? In addition, they might look at a different source than the one
assigned. Mazur solves this problem by giving a reading quiz before each
class, and he has really competitive students.

As to PER, what exactly would you cite as being contradictory? I have
found it to be very consistent. I really do not expect an answer, because
those who diss PER have never answered any of my questions. No answer, in
my mind, is essentially an affirmation that they do not know what they are
talking about.

Now, as to the problem of education research: it is much harder than medical
research because you are dealing with students who may fight what you are
doing, and who deliberately work to maximize their own gains rather than do
what you want them to do. In addition, it is impossible to do a double-blind
study. Even a single-blind study is nearly impossible, because both the
recipient of a treatment and the deliverer of it know it is different. The
closest to a single-blind study was done by Merlyn Mehl, who delivered
different treatments to classes that had different languages; that way there
was minimal opportunity for students to compare what they were getting.

And then there is the study by Feuerstein, in which he delivered his
treatment while the other half of the students got a teacher-designed
enrichment. His treatment raised IQ substantially, to average normal, while
the other students remained much lower.

The big problem often comes in because results are not compared using
either an effect size or a normalized gain. Yes, there was improvement, but
is it big enough to be worth shooting for?
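To make the two comparisons above concrete, here is a minimal sketch (not from the original post; all scores are hypothetical) of the two standard measures: Cohen's d effect size and Hake's normalized gain <g> = (post - pre) / (max - pre), the fraction of the possible improvement actually achieved.

```python
# Illustrative sketch of effect size and normalized gain.
# All scores below are invented for the example.
import statistics

def cohens_d(group_a, group_b):
    """Effect size: mean difference in pooled-standard-deviation units."""
    na, nb = len(group_a), len(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

def normalized_gain(pre_mean, post_mean, max_score=100.0):
    """Hake's <g>: fraction of the possible improvement actually achieved."""
    return (post_mean - pre_mean) / (max_score - pre_mean)

# Hypothetical pre/post class averages on a 100-point concept inventory.
print(f"flipped <g>     = {normalized_gain(45.0, 72.0):.2f}")  # 0.49
print(f"traditional <g> = {normalized_gain(44.0, 58.0):.2f}")  # 0.25

# Hypothetical post-test scores for two small sections.
flipped_scores = [70, 75, 68, 80, 72]
trad_scores = [58, 60, 55, 62, 57]
print(f"effect size d   = {cohens_d(flipped_scores, trad_scores):.1f}")
```

The point of both measures is the same: a raw "average went up 14 points" tells you little until it is scaled, either by how much improvement was possible (<g>) or by the spread of the scores (d).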

John M. Clement
Houston, TX



This is why forums such as this one are invaluable.

People say things here that they can't or won't publish in
the "literature". Here's an example, or maybe two examples in one:

On 09/23/2015 11:40 AM, rjensen@ualberta.ca wrote:

people don't tend to publish negative results. Flipped instruction is
a current example. Going to conferences, I find many people have tried
some form of flipped instruction. For most, it fails and/or the
students revolt. And the event is buried, not published.

Note that such publication bias makes it impossible to do a
meaningful meta-analysis.

BTW ... I never said the literature was /uniformly/ bad. I
said the signal-to-noise ratio was poor.
There are occasional bright spots. For example, I disagree
with several parts of the following paper, but at least it
admits that there are validity problems and biases to worry about:
http://mazur.harvard.edu/sentFiles/Mazur_424102.pdf


============================

On 09/23/2015 09:26 AM, Joseph Bellina wrote:
You have often disparaged PER, but let me suggest that your
description below is exactly what PER is about.


_______________________________________________
Forum for Physics Educators
Phys-l@www.phys-l.org
http://www.phys-l.org/mailman/listinfo/phys-l