This summer, my center is supporting a cohort of 24 graduate students who are teaching for the first time. They’ve participated in our teaching and learning class, The College Classroom, and we strongly encourage them to use evidence-based, student-centered instructional activities in their classes.
We work a lot on peer instruction (PI) with clickers, so that’s a natural choice. We’re thrilled that 12 of the 24 chose to use peer instruction with i>clickers, in physics, linguistics, engineering, philosophy, marketing, psychology, cognitive science, math, management, and economics.
There were some instructors in visual arts, communications, anthropology, and other disciplines who wanted to use PI but didn’t want to use clickers. Their reasons were understandable:
- the class is small (8-10 students), so the instructor doesn’t need the reward of participation points to get students to engage. The instructor can just “look ’em in the eye” when they’re not participating.
- the cost to students of buying a clicker
- the overhead of having to learn the software (and how to make it play nice with the UCSD course management system). They’re teaching for the first time, creating all content from scratch, without a TA to mark essays, in a compressed, 5-week course that meets twice a week for 3-hour classes.
- the desire to pose more open-ended questions where there is neither a right answer nor 3-5 common responses. Questions like, “Do you think the person who painted this picture was a woman or a man? Why?” (Sure, you could make that a clicker question: “Do you think a woman or a man painted this? A) woman B) man” but that’s just a survey, and you don’t need clickers for that.)
I met with each instructor before they started teaching to talk about their plans. One instructor in Visual Arts suggested using think-pair-share (TPS). That’s got a lot in common with peer instruction. Actually, since TPS has been around for ages, peer instruction has a lot to thank TPS for. In TPS, recall:
1. the instructor poses a thought-provoking question
2. students think on their own
3. students pair with neighbors to discuss their thoughts
4. students and the instructor share the thoughts in a class-wide discussion
Let’s compare that to a good episode of PI in a discussion-based class. That’s one where every choice in the question is plausible and the goal of the activity is to get students to pick a prompt they’re comfortable with and explain it to their neighbors, citing evidence when possible. That is, there’s no “convincing your neighbor you’re right” because all the answers are right. Okay, so here’s what PI looks like:
1. the instructor poses a thought-provoking question with 2-5 conversation starters for choices
2. students vote using their clickers
3. the instructor says, “Hmm, really interesting to see you choosing different prompts. Please turn to your neighbor and tell them why you picked the choice you made. Support your choice with evidence from the readings.”
4. the students pair and discuss
5. there is NOT a second vote. No one is expected to change their mind; the discussion was a chance to summon the evidence and practice putting together an argument.
6. the instructor leads a lively, class-wide discussion drawing out the students’ evidence for each of the prompts
My colleague, historian Heidi Keller-Lapp, adds one more step. When she’s preparing the class, she adds a slide after the PI question with a list of all the points she wants to cover via the question. After step 6, Heidi
7. flips to the discussion points slide and goes down the list: “Yep, we talked about this and this and this and, oh, we didn’t mention this. Okay, remember…. Good, and this and this. Great! Terrific discussion, everyone.” This can take 20 minutes in Heidi’s class. That’s 20 glorious minutes of students thinking critically and making arguments with evidence.
What makes peer instruction effective?
There are a couple of necessary, though not sufficient, components of effective peer instruction.
- students must think on their own and commit to an idea. That’s critical for learning because they need something to talk about, something to contribute to the “turn to your neighbor” and something to XOR their neighbor’s thinking against.
- students engage more when they know they’re accountable. Participation points – points for clicking – are a good way to support this. A few points go a long way.
And that’s what is often missing in TPS, unless the instructor has the presence, and the students’ respect, to get them all to engage every time. In TPS,
- students don’t need to commit: they can look at the prompts and think, “Hmm, a couple of those look plausible,” wait until their neighbor starts talking, and then respond, “Yeah, that’s totally what I was thinking, too.” They can get away with it.
- so what if a student doesn’t pick a prompt? What’s the instructor going to do about it? Cold-call on students? That’s not TPS anymore; it’s anxiety-inducing, imposter-syndrome-reinforcing arm-twisting. Ask students to raise their hands? Sure, and the same 3 students answer (and I don’t have to talk, ever, if I don’t want to).
Introducing TPS/cards
Okay, back to Vis Arts. When we brainstormed how to do peer instruction without clickers (What’s that you say, use ABCD voting cards? Two words: card fade. And see step 5 below), we stumbled onto a variation of TPS that, I believe, resolves these weaknesses by borrowing from PI:
1. the instructor poses a thought-provoking question. It can be open-ended. It can be multiple-choice. It can even be “Draw a picture of…” or “Sketch a graph of…” Whatever the instructor decides will provoke the best discussion.
2. students think on their own and write their thoughts on 3 x 5 inch index cards that the instructor distributes every day. By writing on the card, students commit to one of the choices. (Bonus: writing!)
3. students pair with neighbors to discuss their thoughts, referring to their index cards as necessary
4. students and the instructor share the thoughts in a class-wide discussion
5. at the end of class, students hand in their index cards (after writing their names on them). The instructor uses these cards to award participation points. Yes, this takes time that scales with the size of the class. But does flipping through a stack of cards and putting tally marks on a class list really take that much longer than syncing your clicker software with the course management system? (And don’t forget: with cards, there’s no frustrating, pull-your-hair-out battle with freakin’ Blackboard at the beginning of the term. Arrggghh!)
Super Bonus: Education Research
As with any experimental teaching and learning activity, we need to ask, “But did it work?” We have a post-course student survey that probes deeply how students perceived and learned from peer instruction, and we’re running essentially the same survey in these TPS/cards classes, with “peer instruction” search-and-replaced with “think-pair-share.” I’m really excited to see how the courses taught with TPS/cards turn out.
Double Super Bonus
The instructor kept all the index cards from her classes, in chronological order. She’s going to run some content analysis on the students’ thoughts to see if, for example, their thinking grew more sophisticated and expert-like as the course progressed. An awesome teaching-as-research project!
Your thoughts
What do you think? Have I missed something critical about PI or added something harmful to TPS? Is this something school teachers have been doing for decades that HigherEd is only now re-inventing? What research question would you try to answer if you had a record of what your students were thinking throughout the term? All ideas welcome!