Peer instruction workshop: the post-mortem

About a week ago, my colleague Cyn Heiner (@cynheiner) and I ran an all-morning-and-into-the-afternoon workshop on effective peer instruction using clickers. I wrote about preparing for the workshop so it’s only fitting that I write this post-mortem.

If “post-mortem” sounds ominous or negative, well, the workshop was okay but we need to make some significant changes. For all intents and purposes, the workshop we delivered is, indeed, dead.

This was our (in hindsight, ambitious) schedule of events:

Schedule for our workshop, "Effective peer instruction using clickers."

The first part, demonstrating the “choreography” of running an effective peer instruction episode, went pretty well. The participants pretended to be students while I modelled the choreography for 3 questions and Cyn did colour commentary (“Did you notice? Did Peter read the question aloud? No? What did he do instead?”) The plan was, after the model instruction, we’d go back and run through the steps I took, justifying each one. It turned out, though, that the workshop participants were more than capable of wearing both the student hat and the instructor hat, asking good questions about what I was doing (not about the astronomy and physics in the questions). By the time we got to the end of the 3rd question, they’d asked all the right questions and we’d given all the justification.

We weren’t agile enough, I’m afraid, to then skip the next 15 minutes of ppt slides in which we ran through all the things I’d done and why.

Revised workshop: address justification for steps as they come up, then very briefly list the steps at the end, expanding only on the things no one asked about.

In the second part of the workshop, we divided the participants into groups of 2-3 by discipline — physics, chemistry, earth and ocean sciences — and gave them a topic about which they should make a question.

Topics for peer instruction questions. (Click to enlarge.)

We wrote the topics on popsicle sticks and handed them out. This worked really well because no time was wasted deciding on the concept each group should address.

We’d planned to get all those questions into my laptop by snapping webcam pix of the pages they’d written, and then have each group run an episode of peer instruction using their own question while we gave them feedback on their choreography. That’s where things went to hell in a handcart. Fast. First, the webcam resolution wasn’t good enough so we ended up scanning, importing smart phone pix, frantically adjusting contrast and brightness. Bleh. Then, the questions probed the concepts so well, the participants were not able to answer the questions. Almost every clicker vote distribution was flat.

One group created this question about circuits. A good enough question, probably, but we couldn't answer it in the workshop.
These are the votes for choices A-E in the circuits question. People just guessed. They were not prepared to pair-and-share, so the presenter did not have the opportunity to practice doing that with the “students.”

The presenters had no opportunity to react to 1 overwhelming vote, or a split between 2 votes, or any other distribution where they could practice their agility. D’oh! Oh, and they never got feedback on the quality of their questions — were the questions actually that good? We didn’t have an opportunity to discuss them.
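One way to see why the flat votes derailed the practice: the presenter’s “agility” amounts to branching on the vote distribution. Here’s a minimal Python sketch of that branching, with thresholds borrowed from a common peer-instruction rule of thumb (the `next_move` helper, the thresholds, and the sample numbers are my illustration, not something from our workshop):

```python
def next_move(votes, correct):
    """Suggest the presenter's next step after a clicker vote.

    Thresholds follow a common peer-instruction rule of thumb
    (an assumption here): roughly 30-70% correct is the sweet
    spot where pair-and-share discussion pays off.
    """
    frac = votes[correct] / sum(votes.values())
    if frac >= 0.70:
        return "brief wrap-up, then move on"
    if frac >= 0.30:
        return "pair-and-share, then revote"
    return "reteach or rephrase -- students are guessing"

# A near-flat vote like the circuits question's (choices A-E,
# correct answer assumed to be C purely for illustration):
flat_vote = {"A": 9, "B": 11, "C": 10, "D": 10, "E": 10}
print(next_move(flat_vote, "C"))
# -> reteach or rephrase -- students are guessing
```

With every distribution flat, the only branch the presenters ever hit was the last one — no chance to practice the others.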

We were asking the participants to create questions, present questions, answer their colleagues’ questions AND assess their colleagues’ peer instruction choreography. And it didn’t work. Well, d’uh, what were we thinking? Ahh, 20/20 hindsight.

With lots of fantastic feedback from the workshop participants, and a couple of hours of caffeine-and-scone-fueled brainstorming, Cyn and I have a new plan.

Revised workshop: Participants, still in groups of 2-3, study, prepare and then present a clicker question we created ahead of time.

We’ll create general-enough-knowledge questions that the audience can fully or partially answer, giving us a variety of vote distributions. Maybe we’ll even throw in some crappy questions, like one that’s way too easy, one with an ambiguous stem so it’s unclear what’s being asked, one with all incorrect choices… We’ll take advantage of how well we all learn through contrasting cases.

To give the participants feedback on their choreography, we’ll ask part of the audience to not answer the question but to watch the choreography instead. We’re thinking a simple checklist will help the audience remember the episode when the time comes to critique the presentation. And that list will reinforce to everyone what steps they should try to go through when running an effective peer instruction episode.

The participants unanimously agreed they enjoyed the opportunity to sit with their colleagues and create peer instruction questions. Too bad there wasn’t much feedback, though. Which leads to one of the biggest changes in our peer instruction workshop:

2nd peer instruction workshop: Creating questions

We can run another workshop, immediately after the (New) Effective peer instruction or stand-alone, about writing questions. We’re still working out the details of that one. My first question to Cyn was, “Are we qualified to lead that workshop? Shouldn’t we get someone from the Faculty of Education to do it?” We decided we are the ones to run it, though:

  • Our workshop will be about creating questions for physics. Or astronomy. Or chemistry. Or whatever science discipline the audience is from. We’ll try to limit it to one, maybe two, so that everyone is familiar enough with the concepts that they can concentrate on the features of the question.
  • We’ve heard from faculty that they’ll listen to one of their own. And they’ll listen to a visitor from another university who’s in the same discipline. That is, our physicists will listen to a physicist from the University of Somewhere Else talking about physics education. But our instructors won’t listen to someone from another faculty who parachutes in as an “expert.” I can sort of sympathize. It’s about the credibility of the speaker.

Not all bad news…

Cyn and I are pretty excited about the new workshop(s). Our bosses have already suggested we should run them in December, targeting the instructors who will start teaching in January. And I got some nice, personal feedback from one of the participants who said he could tell how “passionate I am about this stuff.”

And, most importantly, there’s a physics and astronomy teaching assistants training workshop going on down the hall. It’s for TA’s by TA’s and many of the “by TA’s” were at our workshop. Now they’re training their peers. These people are the future of science education. I’m proud to be a part of that.


CWSEI End of Year Conference

Every April, at the end of the “school year” at UBC, the Carl Wieman Science Education Initiative (CWSEI) holds a 1-day mini-conference to highlight the past year’s successes. This year, Acting-Director Sarah Gilbert did a great job organizing the event. (Director CW, himself, is on leave to the White House.) It attracted a wide range of people, from UBC admin to department heads, interested and involved faculty, Science Teaching and Learning Fellows (STLFs) like myself, and grad students interested in science education. The only people not there, I think, were the undergraduate students themselves. Given that the event was held on the first day after exams finished and the beginning of 4 months of freedom, I’m not surprised at all there weren’t any undergrads. I know I wouldn’t have gone to something like this, back when I was an undergrad.

Part 1: Overview and Case Studies

The day started with an introduction and overview by Sarah, followed by 4 short “case studies” where 4 faculty members who are heavily involved in transforming their courses shared their stories.

Georg Rieger talked about how adding one more activity to his Physics 101 classes made a huge difference. He’s been using peer instruction with i>Clickers for a while and noticed poor student success on the summative questions he asked after explaining a new concept. He realized students don’t understand a concept just because he told them about it, no matter how eloquent or enthusiastic he was. So he tried something new — he replaced his description with worksheets that guided the students through the concept. It didn’t take a whole lot longer for the students to complete the worksheets compared to listening to him but they had much greater success on the summative clicker questions. The students, he concluded, learn the concepts much better when they engage and generate the knowledge themselves. Nice.

Susan Allen talked about the lessons she learned in a large, 3rd-year oceanography class and how she could apply them in a small, 4th-year class. Gary Bradfield showed us a whole bunch of student-learning data he and my colleague Malin Hansen have collected in an ecology class (Malin’s summer job is to figure out what it all means). Finally, Mark MacLean described his approach to working with the dozen or so instructors teaching an introductory Math course, only 3 of whom had any prior teaching experience. His breakthrough was writing “fresh sheets” (he made the analogy to a chef’s specials of the week) for the instructors that outlined the coming week’s learning goals, instructional materials, tips for teaching that content, and resources (including all the applicable questions in the textbook). The instructors give the students the same fresh sheet, minus the instructional tips. [Note: these presentations will appear on the CWSEI website shortly and I’ll link to them.]

Part 2: Posters

All of my STLF colleagues and I were encouraged to hang a poster about a project we’d been working on. Some faculty and grad students who had stories to share about science education also put up posters.

My poster was a timeline for a particular class in the introductory #astro101 course I work on. The concept being covered was the switch from the Ptolemaic (Earth-centered) Solar System to the Copernican (Sun-centered) Solar System. The instructor presented the Ptolemaic model, described how it worked, and asked the students to make a prediction based on the model (a prediction that does not match the observations, hence the need to change models). The students didn’t get it. But he forged on to the Copernican model, explained how it worked, and asked them to make a prediction (which is consistent with the observations, now). They didn’t get that either. About a minute after the class ended, the instructor looked at me and said, “Well that didn’t work, did it?” I suggested we take a Mulligan, a CTRL-ALT-DEL, and do it again the next class. Only different this time. That was Monday. On Tuesday, we recreated the content, switching from an instructor-centered lecture to a student-centered sequence of clicker questions and worksheets. On Wednesday, we ran the “new” class. It took the same amount of time and the student success on the same prediction questions was off the chart! (Yes, they were the same questions. Yes, they could have remembered the answers. But I don’t think a change from 51% correct on Monday to 97% on Wednesday can be attributed entirely to memory.)

Perhaps the most interesting part of the poster, for me, was coming up with the title. The potential parallel between Earth/Sun-centered and instructor/student-centered caught my attention (h/t to @snowandscience for making the connection.) With the help of my tweeps, I wrestled with the analogy, finally coming to a couple of conclusions. One, the instructor-centered class is like the Sun-centered Solar System (with the instructor as the Sun):

  • the instructor (Sun) sits front and center in complete control while “illuminating” the students (planets), especially the ones close by.
  • the planets have no influence on the Sun,…
  • very little interaction with each other,…
  • and no ability to move in different directions.

As I wrote on the poster, “the Copernican Revolution was a triumph for science but not for science education.” I really couldn’t come up with a Solar System model for a student-centered classroom, where students are guided but have “agency” (thanks, Sandy), that is, the free will to choose to move (and explore) in their own directions. In the end, I came up with this (yes, it’s a mouthful, but someone stopped me later to compliment me specifically on the title):

Shifting to a Copernican model of the Solar System
by shifting away from a Copernican model of teaching

Part 3: Example class

When we were organizing the event, Sarah thought it would be interesting to get an actual instructor to present an actual “transformed” class, one that could highlight for the audience (especially the on-the-fence-about-not-lecturing instructors) what you can do in a student-centered classroom. I volunteered the astronomy instructor I was working with, and he agreed. So Harvey (and I) recreated a lecture he gave about blackbody radiation. I’d kept a log of what happened in class so we didn’t have to do much. In fact, the goal was to make it as authentic as possible. The class, both the original and the demo class, had a short pre-reading, peer instruction with clickers (h/t to Adrian at CTLT for loaning us a class set of clickers), the blackbody curves Lecture-Tutorial worksheet from Prather et al. (2008), and a demo with a pre-demo prediction question.

Totally rocked, both times. Both audiences were engaged, clicked their clickers, had active discussions with peers, and did NOT get all the questions and predictions correct.

At the CWSEI event, we followed the demonstration with a long, question-and-answer “autopsy” of the class. Lots of great questions (and answers) from the full spectrum of audience members between novice and experienced instructors. Also some helpful questions (and answers) from Carl, who surprised us by coming back to Vancouver for the event.

To top it off, we made the class even more authentic by handing out a few Canadian Space Agency stickers to audience members who asked good questions, just like we do in the real #astro101 class. You should have seen the glee in their eyes. And the “demo” students went all metacognitive on us (as they did in the real class, eventually) and started telling Harvey and me who asked sticker-worthy questions!

Canadian Space Agency (CSA) or Agence spatiale canadienne (ASC) logo

Part 4: Peer instruction workshop

The last event of the day was a pair of workshops. One was about creating worksheets for use in class. The other, which I led, was called “Effective Peer Instruction Using Clickers.” (I initially suggested, “Clicking it up to Level 2” but we soon switched to the better title.) The goal was to help clicker-using instructors take better advantage of peer instruction. So many times I’ve witnessed teachable moments lost because of poor clicker “choreography,” that is, conversations cut off, or not even started, because of how the instructor presents the question or handles the votes, among other things. Oh, and crappy questions to start with.

I didn’t want this to be about clickers because there are certainly ways to do peer instruction without clickers. And I didn’t want it to be a technical presentation about how to hook an i>clicker receiver to your computer and how to use igrader to assign points.

Between attending Center for Astronomy Education peer instruction workshops myself, which follow the “situated apprentice” model described by Prather and Brissenden (2008), my conversations with @derekbruff and the #clicker community, and my own experience using and mentoring the use of clickers at UBC, I easily had enough material to fill a 90-minute workshop. My physics colleague @cynheiner did colour commentary (“Watch how Peter presents the question. Did he read it out loud?…”) while I did a few model peer instruction episodes.

After these demonstrations, we carefully went through the choreography I was following, explaining the pros and cons. There was lots of great discussion about variations. Then the workshop turned to how to handle some common voting scenarios. Here’s one slide from the deck (that will be linked shortly.)

I’d planned on getting the workshop participants to get into small groups, create a question and then present it to the class. If we’d had another 30 minutes, we could have pulled that off. Between starting late (previous session went long) and it being late on a Friday afternoon, we cut off the workshop. Left them hanging, wanting to come back for Part II. Yeah, that’s what we were thinking…

End-of-Year Events

Sure, it’s hard work putting together a poster. And a demo lecture. And a workshop. But it was very good for sharing what the CWSEI is doing, especially the demo class. And I’ll be using the peer instruction workshop again. And it was a great way to celebrate a year’s work. And then move on to the next one.

Does your group hold an event like this? What do you find works?

Don't forbid phones in class, embrace them

It’s not uncommon to hear, as I wander the halls at UBC, faculty complaining about students preoccupied with their computers and phones in class. The most common solution is to just ignore it (“if they don’t want to pay attention to the class, it’s their loss…”) Can’t disagree with that, as long as students aren’t distracting others who are trying to pay attention. Another solution is to ban computers and phones. Well, some students legitimately need their computers (students with disabilities, for example) so I know of a few instructors who ask these students to sit over there, off to the side.

But here’s another solution: don’t forbid phones in class, embrace them.

Naive? Perhaps. Impossible to facilitate? Ye— Ah! Not so fast!

The April 2011 issue of The Physics Teacher contains an article by Angela M. Kelly that describes a collection of iPod Touch apps (which should also run on iPhone and iPad) and how to use them to teach Newton’s Laws of Motion. Cool idea: use the games the students are already playing to teach them physics.

I want to add to her list my own favourite physics app. This one’s not a game so it might not – no, who am I kidding, will not – have the same appeal. But xSensor (which, at the time I write this, is free!) is a great physics app because it gives a real-time readout of the accelerometer in the x-, y- and z-directions. The pix below are screenshots from my iPhone (captured with that magical “click the on/off and home buttons at the same time” feature) showing some cool physics. The app will also record the data in a log you can email yourself.

xSensor screenshot showing circular motion. The sinusoidal curves encode the constant centripetal force.

I made this one by putting my iPhone flat on my desk and swirling it around and around. The curves sweep across the screen, recording about 5 seconds of readings. The numbers on the screen, 0.02, -0.14 and -1.18, are the instantaneous accelerations measured in g’s. The z-acceleration is pretty constant at -1 g. Can’t get rid of gravity… The accelerations in the x- and y-directions show beautiful sinusoidal motion, 90 degrees out of phase, encoding the centripetal force of the phone’s circular motion. It’s shaky because I can’t swirl my phone smoothly.
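If you want to see where those two out-of-phase sinusoids come from, here’s a short Python sketch of the ideal readings for a phone swirled in a horizontal circle. The radius and angular speed are assumed values for illustration, not measurements of my actual swirling:

```python
import math

G = 9.81  # standard gravity, m/s^2


def accel_readings(r=0.1, omega=2 * math.pi, t_max=5.0, dt=0.25):
    """Ideal accelerometer readings (in g's) for a phone lying flat,
    swirled in a circle of radius r (m) at angular speed omega (rad/s).
    r and omega are assumed, not measured."""
    a_c = omega ** 2 * r / G  # centripetal acceleration, in g's
    samples = []
    t = 0.0
    while t <= t_max:
        ax = -a_c * math.cos(omega * t)  # one sinusoid...
        ay = -a_c * math.sin(omega * t)  # ...90 degrees out of phase
        az = -1.0                        # can't get rid of gravity
        samples.append((t, ax, ay, az))
        t += dt
    return samples


for t, ax, ay, az in accel_readings()[:3]:
    print(f"t={t:.2f}s  ax={ax:+.2f}g  ay={ay:+.2f}g  az={az:+.2f}g")
```

The x- and y-components trade places every quarter turn while z just sits at -1 g — exactly the pattern in the screenshot, minus my shaky hands.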

Okay, the “can’t get rid of gravity…” line was a strawman. Because you can. If you drop your phone. Which I did. Very carefully.

xSensor screenshot during free fall when, for a brief moment, the phone recorded zero acceleration.

These graphs show me holding my phone still. About halfway through the plot, I dropped it. For a short period of time, the acceleration in the z-direction snaps up to zero g’s: free fall! Then there’s a big blip as I clumsily catch my phone and take the screenshot. But there, just for that moment in free fall, my phone appeared to be force free. That’s Einstein’s Principle of Equivalence: floating free in deep space is just like freely falling in a gravitational field. (That NASA link includes the famous Apollo 15 hammer/feather drop video.) It’s not a Gedankenexperiment, though. It’s the real thing, right there in your hand! Well, you know what I mean.
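A quick back-of-envelope check on how long that zero-g moment can last, from h = ½gt². The half-metre drop height is an assumption (I didn’t measure mine), chosen to be safely catchable:

```python
import math

g = 9.81  # m/s^2


def free_fall_time(height_m):
    """Time to fall height_m from rest: h = (1/2) g t^2  =>  t = sqrt(2h/g)."""
    return math.sqrt(2 * height_m / g)


# An assumed, safely catchable drop of half a metre. The accelerometer
# measures proper acceleration, so it reads zero for this whole interval.
t = free_fall_time(0.5)
print(f"{t:.2f} s of zero-g readings")  # about 0.32 s
```

A third of a second isn’t much, but it’s plenty for the app’s sample rate to catch the z-trace snapping up to zero, just as in the screenshot.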

So, don’t ban phones from your physics, astronomy or science classrooms: embrace them! Better yet, chuck ’em across the room!

Do you have a favourite physics app? Have you discovered another cool experiment you can do with xSensor? Hope you’ll share it with us.
