
Motivation for pre-reading assignments

Image: chain by pratani on Flickr (CC)

For the next 4 months, I'll be working with an instructor in a 4th-year electromagnetism course. If you've taught or taken a course like this, let me just say, "Griffiths". If you haven't, this is the capstone course in E&M. It's the big, final synthesis of all the electricity and magnetism and math and math and math the students have been accumulating for the previous 3-1/2 years. This is where it all comes together and the wonders of physics are, at last, revealed. It's the course all the previous instructors have been talking about when they say, "Just learn it. Trust me, it will be really important in your future courses…" That's the promise, anyway.

The instructor came to us ("us" being the Carl Wieman Science Education Initiative) because he wasn't happy with the lecture style he's been using. Students are not engaging, if they even bother to come to class. He's trying to use peer instruction with clickers, but it's not very successful. He wants to engage the students by giving them worksheets in class but he's not sure how.

So much enthusiasm! So much potential! Yes, let's totally transform this course, flipping it from instructor- to student-centered! Yes, and I'm purposely using the word "flipping" with all its baggage!

Hold on there, Buckaroo! One thing at a time. Changing everything at once rarely works. It takes time for the instructor to make the changes and learn how to incorporate each one into his or her teaching.

So, we're tackling just a few things this term. The first is to create learning goals (or objectives) so we can figure out how to target our effort. In talking with the instructor, I learned there are very few new mathematical techniques introduced in the course. Instead, the course is about selecting the right sequence of mathematical tools to distill fundamental physics out of the math describing E&M. That led us to this draft of one of the course-level, big-picture goals:

While you are expected to remember basic relationships from physics like F=dp/dt and λ=c/ν, you do not have to memorize complicated formulas we derive in class because a list of formulas will be given. Instead, you will be able to select the applicable formula from the list and know how to apply it to the task you’re working on.

The biggest change we're making is introducing effective pre-reading assignments. Oh sure, the instructor always said things like "Pre-reading for Lecture 1: Sections 12.1.1 – 12.1.3" but that's not doing the trick. More and more of my colleagues are having success with detailed, targeted reading assignments. Rather than the "read the whole thing and learn it all" approach, we're going to help the students learn (ha! Imagine that!):

Reading assignment (prior to L1 on Thu, Jan 10)
==================

Read Section 12.1.1. Be sure you can define an "inertial reference frame"
and state the 2 postulates of special relativity.

Review Section 12.1.2 (these concepts were covered in previous courses)
especially the Lorentz contraction (iii) and write out the missing steps
of algebra at the top of p. 490 that let Griffiths "conclude" Eqn (12.9).
Be sure you can explain why dimensions perpendicular to the velocity are
not contracted.

Read Section 12.1.3. Look carefully at Figure 12.16 so you're familiar
with the notation for inertial frames at rest (S) and inertial frames in
motion (S with an overbar).

Now comes the hard part: getting the students to actually do it. It'll take effort on their part so they should be rewarded for that effort. A reading quiz, probably in-class using clickers, worth marks could be that reward. (An online quiz we can use for just-in-time teaching might be even better but one thing at a time.) A straightforward quiz-for-marks promotes sharing answers (that is, cheating) and clicking for students who aren't there (that is, cheating). I don't want them to participate for the sole reason that they'll be punished for not participating. I'd rather use a carrot than a stick.

How do we present the pre-reading assignment as something the students WANT to do? Here’s a chain of reasoning, developed through conversations with my more-experienced colleagues. It’s addressed to the students, so “you” means “you, the student sitting there in class today. Yes, you.”

link 1: Efficient. You have a very busy schedule full of challenging courses. You want to use your E&M time efficiently.

link 2: Effective. We want the time you have allocated to E&M to be effective, a good return on your investment.

link 3: Learning. We recognize that many of the concepts will be learned when you do the homework. But rather than using class time to simply gather information for future learning, what if you could actually learn in class? Then you'd be better able to follow along in class and you'd already be (partially, at least) prepared to tackle the homework.

link 4: Engagement. We’re going to create opportunities for you to learn in class through engaging, student-centered instructional strategies. But you need to be prepared to participate in those activities.

link 5: Preparation. To try to ensure everyone has neighbours prepared to collaborate and peer-instruct, we’re asking you to complete the pre-reading assignment. It will also save us from wasting valuable class time reviewing material that some (most?) of you already know.

link 6: Reward. This takes some effort so we're going to reward that effort. If you do the readings as we suggest, the reading quiz questions we ask will be simple, a 5-mark gimme towards your final grade. Oh sure, you'll be allowed to miss X of the quizzes and still get the 5%. Those marks are for getting into the habit of preparing for class, not a penalty for being sick or not being able to come to class. The quizzes are also continuous feedback for you: if you're not getting 80% or more on the reading quizzes, you're not properly preparing for class. Which means you're not link 5, 4, 3, 2, 1.

The big message should be: your effort in the pre-reading assignments will help you succeed in this course, not just with a higher grade but with a better grasp of the concepts and fewer all-nighters struggling with homework.

Is it all just a house of cards? I don’t think so. And I’ll find out in the next few weeks.

Effective professional development, Take 1

The other day, I participated in a webinar run by Stephanie Chasteen (@sciencegeekgirl on Twitter. If you don't follow her, you should.) It was called, "Teaching faculty about effective clicker use" and the goal was to help us plan and carry out meetings where we train faculty members to use peer instruction and clickers. Did you get that subtle difference? It was not about how to use clickers (though Stephanie can teach you that, too.) Rather, this webinar was aimed at instructional support people tasked with training their colleagues how to use peer instruction. This was a train-the-trainers webinar. And it was right up my alley because I'm learning to do that.

And if you think that’s getting meta-, just you wait…

In the midst of reminding us about peer instruction, Stephanie listed characteristics of effective professional development. She gave us the bold words; the interpretations are mine:

  • collaborative: it’s about sharing knowledge, experiences, ideas, expertise
  • active: we need to do something, not just sit and listen (or not!)
  • discipline-oriented: If we want to be able to share, we need some common background. I want to understand what you’re talking about. And I hope you give a damn about what I’m talking about. Coming from the same discipline, like physics or astronomy or biology, is a good start.
  • instructor-driven: I take this to mean “facilitated”. That is, there’s someone in charge who drives the activity forward.
  • respectful: So open to interpretation. Here’s my take: everyone in the room should have the opportunity to contribute. And not via the approach, “well if you’ve got something to say, speak up, dammit!” It takes self-confidence and familiarity and…Okay, it takes guts to interrupt a colleague or a conversation to interject your own opinion. Relying on people to do that does not respect their expertise or the time they’ve invested by coming to the meeting.
  • research-based: One of the pillars of the Carl Wieman Science Education Initiative (CWSEI) that I’m part of at UBC, and the Science Education Initiative at the University of Colorado where Stephanie comes from, is a commitment to research-based instructional strategies. We care about the science of teaching and learning.
  • sustained over time: We’d never expect our students to learn concepts after one exposure to new material. That’s why we give pre-reading and lectures and peer instruction and homework and midterms and…So we shouldn’t expect instructors to transform their teaching styles after one session of training. It requires review and feedback and follow-up workshops and…

Alright, time to switch to another stream for a moment. They’ll cross in a paragraph or two.

(image: Peter Newbury)

I've got a big box of shiny new i>clicker2 clickers to try out. I'm pretty excited. I'm also pretty sure the first thing instructors will say is, "What's with all the new buttons? I thought these things were supposed to be simple! Damn technology being shoved down our [grumble] [grumble] [grumble]" I want to be able to reply:

Yes, there are more buttons on the i>clicker2. But let me show you an amazing clicker question you can use in your [insert discipline here] classroom…

 

Good plan. Okay, let’s see: Clickers? Check. Amazing clicker questions? D’oh!

We use a lot of peer instruction here at UBC and there are CWSEI support people like me in Math, Chemistry, Biology, Statistics, Earth and Ocean Sciences, and Computer Science. If anyone can brainstorm a few good questions, it's this crew. And guess what? We get together for 90-minute meetings every week.

Can you feel the streams coming together? Just one more to add:

My CWSEI colleagues and I frequently meet with instructors and other faculty members. We dance a delicate dance between telling instructors what to do, drawing out their good and bad experiences, and getting them to discover for themselves what could work (psst: making them think they thought of it themselves). Their time is valuable so when we meet, we need to get things done. We need to run short, effective episodes of professional development. It's not easy. If only there were a way to practice…

A-ha! Our weekly meetings should be effective professional development led by one of us getting some practice at facilitating. The streams have crossed. I'll run the next meeting following Stephanie's advice, modeling Stephanie's advice, to gather questions so I will be able to run an effective workshop on taking advantage of the new features of the i>clicker2. It's a meta-meeting. Or a meta-meta-meeting?

It's not like I made any of this up, or like I couldn't have found it by talking with some people whose job is professional development. Well, I guess I did kind of talk with Stephanie. But there's a lot to be said for figuring it out for yourself. Or at least starting to figure it out for yourself, and failing, and then recognizing and appreciating what the expert has to say.

And you’ve read enough for now. Watch for another post about how it went.

Peer instruction workshop: the post-mortem

About a week ago, my colleague Cyn Heiner (@cynheiner) and I ran an all-morning-and-into-the-afternoon workshop on effective peer instruction using clickers. I wrote about preparing for the workshop so it’s only fitting that I write this post-mortem.

If “post-mortem” sounds ominous or negative, well, the workshop was okay but we need to make some significant changes. For all intents and purposes, the workshop we delivered is, indeed, dead.

This was our (in hindsight, ambitious) schedule of events:

Schedule for our workshop, "Effective peer instruction using clickers."

The first part, demonstrating the "choreography" of running an effective peer instruction episode, went pretty well. The participants pretended to be students while I modeled the choreography for 3 questions and Cyn did colour commentary ("Did you notice? Did Peter read the question aloud? No? What did he do instead?"). The plan was, after the model instruction, we'd go back and run through the steps I took, justifying each one. It turned out, though, that the workshop participants were more than capable of wearing both the student hat and the instructor hat, asking good questions about what I was doing (not about the astronomy and physics in the questions). By the time we got to the end of the 3rd question, they'd asked all the right questions and we'd given all the justification.

We weren't agile enough, I'm afraid, to then skip the next 15 minutes of ppt slides where we ran through all the things I'd done and why.

Revised workshop: address justification for steps as they come up, then very briefly list the steps at the end, expanding only on the things no one asked about.

In the second part of the workshop, we divided the participants into groups of 2-3 by discipline — physics, chemistry, earth and ocean sciences — and gave them a topic about which they should make a question.

Topics for peer instruction questions. (Click to enlarge.)

We wrote the topics on popsicle sticks and handed them out. This worked really well because there was no time wasted deciding on the concept the group should address.

We’d planned to get all those questions into my laptop by snapping webcam pix of the pages they’d written, and then have each group run an episode of peer instruction using their own question while we gave them feedback on their choreography. That’s where things went to hell in a handcart. Fast. First, the webcam resolution wasn’t good enough so we ended up scanning, importing smart phone pix, frantically adjusting contrast and brightness. Bleh. Then, the questions probed the concepts so well, the participants were not able to answer the questions. Almost every clicker vote distribution was flat.

One group created this question about circuits. A good enough question, probably, but we couldn't answer it in the workshop.
These are the votes for choices A-E in the circuits question. People just guessed. They were not prepared to pair-and-share, so the presenter did not have the opportunity to practice doing that with the "students."

The presenters had no opportunity to react to 1 overwhelming vote or a split between 2 votes or any other distribution where they could practice their agility. D'oh! Oh, and they never got feedback on the quality of their questions — were the questions actually that good? We didn't have an opportunity to discuss them.

We were asking the participants to create questions, present questions, answer their colleagues’ questions AND assess their colleagues’ peer instruction choreography. And it didn’t work. Well, d’uh, what were we thinking? Ahh, 20/20 hindsight.

With lots of fantastic feedback from the workshop participants, and a couple of hours of caffeine-and-scone-fueled brainstorming, Cyn and I have a new plan.

Revised workshop: Participants, still in groups of 2-3, study, prepare and then present a clicker question we created ahead of time.

We'll create general-enough-knowledge questions that the audience can fully or partially answer, giving us a variety of vote distributions. Maybe we'll even throw in some crappy questions, like one that's way too easy, one with an ambiguous stem so it's unclear what's being asked, one with all incorrect choices… We'd take advantage of how well we all learn through contrasting cases.

To give the participants feedback on their choreography, we’ll ask part of the audience to not answer the question but to watch the choreography instead. We’re thinking a simple checklist will help the audience remember the episode when the time comes to critique the presentation. And that list will reinforce to everyone what steps they should try to go through when running an effective peer instruction episode.

The participants unanimously agreed they enjoyed the opportunity to sit with their colleagues and create peer instruction questions. Too bad there wasn't much feedback, though. Which leads to one of the biggest changes in our peer instruction workshop:

2nd peer instruction workshop: Creating questions

We can run another workshop, immediately after the (new) effective peer instruction workshop or stand-alone, about writing questions. We're still working out the details of that one. My first question to Cyn was, "Are we qualified to lead that workshop? Shouldn't we get someone from the Faculty of Education to do it?" We decided we are the ones to run it, though:

  • Our workshop will be about creating questions for physics. Or astronomy. Or chemistry. Or whatever science discipline the audience is from. We’ll try to limit it to one, maybe two, so that everyone is familiar enough with the concepts that they can concentrate on the features of the question.
  • We’ve heard from faculty that they’ll listen to one of their own. And they’ll listen to a visitor from another university who’s in the same discipline. That is, our physicists will listen to a physicist from the University of Somewhere Else talking about physics education. But our instructors won’t listen to someone from another faculty who parachutes in as an “expert.” I can sort of sympathize. It’s about the credibility of the speaker.

Not all bad news…

Cyn and I are pretty excited about the new workshop(s). Our bosses have already suggested we should run them in December, targeting the instructors who will start teaching in January. And I got some nice, personal feedback from one of the participants who said he could tell how “passionate I am about this stuff.”

And, most importantly, there’s a physics and astronomy teaching assistants training workshop going on down the hall. It’s for TA’s by TA’s and many of the “by TA’s” were at our workshop. Now they’re training their peers. These people are the future of science education. I’m proud to be a part of that.

 
