Working with a diverse group? Try a card sort.

Education technology? Yep.

I went to a day-long retreat where the participants, about 20 of us, were deliberately selected to represent a wide range of backgrounds, experiences, and expertise – all the stakeholders in a big project. The retreat organizer suggested each person prepare a 5-10 minute presentation about what they’d bring to the project and what they were hoping to get out of it. I was there to represent the teaching and learning support my center provides to instructors.

I had nightmar—, uh, visions of participant after participant clicking through PPT after PPT. The educator in me didn’t want that to happen, so I decided to do something active to give my colleagues a better understanding of what I do. They would experience it rather than listen to me describe it. You know, active learning.

(For the record, PPT after PPT was NOT what happened. People talked and distributed some hand-outs. Better that I was prepared, though.)

That’s when I remembered a really interesting and engaging activity I did during a workshop led by Kimberly Tanner: card sorting. The idea is, you give each group of 2-4 participants a short stack of cards. Not playing cards but, for example, 9 index cards, one item on each card. In Kimberly’s workshop, the cards were 9 different superheroes. You ask the groups to sort the cards into categories — any categories they want — with just a couple of rules: there have to be at least 2 categories, and there can’t be 9 categories (i.e., you can’t put each card in its own category). Well, there are more rules but that’s all I needed for my version.

Then something interesting happens. You’ve carefully chosen the cards so that the items have both surface features (these are superheroes with primarily green costumes, these are mostly blue, these mostly red) and deep features (these are Marvel superheroes, these are DC.) How people sort the cards reveals their level of familiarity and expertise with the content, and gives each participant ample opportunity to share that knowledge with their group-mates.
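To make the surface/deep distinction concrete, here’s a toy Python sketch of the sort. The specific heroes and attributes are my invention, not Kimberly’s actual deck; the point is that the same cards land in completely different categories depending on which feature the sorters key on.

```python
# A toy card sort, assuming made-up heroes and attributes.
from itertools import groupby

cards = [
    {"hero": "Hulk",          "costume": "green", "publisher": "Marvel"},
    {"hero": "Green Lantern", "costume": "green", "publisher": "DC"},
    {"hero": "Mr. Fantastic", "costume": "blue",  "publisher": "Marvel"},
    {"hero": "Superman",      "costume": "blue",  "publisher": "DC"},
    {"hero": "Spider-Man",    "costume": "red",   "publisher": "Marvel"},
    {"hero": "The Flash",     "costume": "red",   "publisher": "DC"},
]

def sort_cards(cards, feature):
    """Group the cards by a single feature. Which feature a group picks
    (costume color vs. publisher) is what reveals surface vs. deep sorting."""
    keyed = sorted(cards, key=lambda c: c[feature])
    return {k: [c["hero"] for c in g]
            for k, g in groupby(keyed, key=lambda c: c[feature])}

print(sort_cards(cards, "costume"))    # surface sort: green / blue / red
print(sort_cards(cards, "publisher"))  # deep sort: Marvel / DC
```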

Back to my card sorting task: I made 9 cards, each one giving the name of a course, the course description, and the format of the course meetings (lectures, labs, discussions, seminars, online, etc.) Thanks, btw, to my colleague Dominique Turnbow for the great advice about what to put on the cards.

So, 9 courses. Please sort them into at least 2 but fewer than 9 categories:

Participants sorted these 9 cards into categories. Each card describes a different course. (Photo: Peter Newbury CC-BY)

There were lots of surface features that could be used:

  • STEM vs Social/Behavioral/Economic Sciences vs Arts & Humanities
  • those with discussion sections vs those with labs
  • which UC San Diego Division they fit in: Biological Sciences, Physical Sciences, Engineering, Social Sciences, Arts & Humanities, Medicine, etc.
  • (I forgot to put class size on the cards – d’oh! – but that would be another way to sort them: small, medium, large, ridiculous enrolment)

I was expecting some of those “surface” sorts but my colleagues blew through those surface features and quickly re-sorted based on deeper features. Honestly, the categories they invented and the categories I made up ahead of time (in case they needed an example) are mixed up in my memory, but here are some deeper features (analogous to “superhero publisher” rather than “color of superhero costume”):

  • technology enhanced
  • amount of active learning in typical classes
  • computationally-focused
  • amount of close reading required
  • use statistics
  • amount of writing required

Well?

We took 5-10 minutes to sort and then another 10 minutes to report out. Sure, I went over my 10-minute slot but the schedule was very flexible (by design).

I think the activity went great. It gave participants, many of whom were strangers to each other, an opportunity to share their backgrounds and expertise with each other. It revealed the breadth of knowledge in the room. And it gave everyone involved a reminder to look past the surface features of our meeting and project – who will be responsible for this or that, how many offices will be required, what budget will this come from – and look at the big picture: supporting learning.

Details about implementation

(These details are mostly for me so I’ll remember what to do next time. If you’re thinking about running a card-sorting activity, you might find them helpful, too.)

  • I started with a spreadsheet to help me select a set of courses that covered the surface and deeper features I wanted. I printed it out and had it with me during the activity so I could remember why I’d included the courses and what I anticipated as surface / deeper features.
  • I wrote the course descriptions in Word as 2″ x 4″ labels, printed the labels, and stuck them to index cards (a script could handle this step, too – see the sketch after this list). This made it easy to create as many stacks as I needed:
I made 8 sets of cards. What do you notice about the stacks? (Photo: Peter Newbury CC-BY)
  • What do you notice about the stacks? Right, the missing corners. Each stack has a different missing corner so I can easily reset the cards into stacks. Can you imagine the tedious task of sorting 8 x 9 = 72 virtually identical cards into stacks? No, thank you!
  • There were 2 main camps of people at the retreat, plus a number of important “third parties.” As I began the activity, I formed groups of 2-3 with at least one person from each camp.
  • I used some old fridge magnets to make 9 magnets, one for each course. When the groups reported out, I quickly arranged the magnets on a handy whiteboard so I could hold it up for the others in the room to see:
(Photo: Peter Newbury CC-BY)
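By the way, if you’d rather script the label step than build it in Word, here’s a rough Python sketch. The file courses.csv and its column names (course, description, format) are hypothetical stand-ins for my spreadsheet, not the actual file.

```python
# Hypothetical sketch: generate label text from a spreadsheet export.
import csv

with open("courses.csv", newline="") as f:  # hypothetical export of my spreadsheet
    for row in csv.DictReader(f):
        # one 2" x 4" label per course: name, description, meeting format
        print(row["course"])
        print(row["description"])
        print(f"Format: {row['format']}")
        print("-" * 40)  # cut line between labels
```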

Think-Pair-Share meets Peer Instruction

This summer, my center is supporting a cohort of 24 graduate students who are teaching for the first time. They’ve participated in our teaching and learning class, The College Classroom, and we strongly encourage them to use evidence-based, student-centered instructional activities in their classes.

We work a lot on peer instruction (PI) with clickers so that’s a natural choice. We’re thrilled that 12 of the 24 chose to use peer instruction with i>clickers, in physics, linguistics, engineering, philosophy, marketing, psychology, cognitive science, math, management, and economics.

There were some instructors in visual arts, communications, anthropology and other disciplines who wanted to use PI but didn’t want to use clickers. Their reasons were understandable:

  • it’s a small class (8-10 students) so the instructor doesn’t need the reward of participation points to get students to engage. The instructor can just “look ’em in the eye” when they’re not participating.
  • the students’ cost of buying a clicker
  • the overhead of having to learn the software (and how to make it play nice with the UCSD course management system). They’re teaching for the first time, creating all content from scratch, without a TA to mark essays, in a compressed, 5-week course that meets twice a week for 3-hour classes.
  • the desire to pose more open-ended questions where there is neither a right answer nor 3-5 common responses. Questions like, “Do you think the person who painted this picture was a woman or a man? Why?” (Sure, you could make that a clicker question – “Do you think a woman or a man painted this? A) woman B) man” – but that’s just a survey and you don’t need clickers for that.)

I met with each instructor before they started teaching to talk about their plans. One instructor in Visual Arts suggested using think-pair-share. That’s got a lot in common with peer instruction. Actually, since TPS has been around for ages, peer instruction has a lot to thank TPS for. In TPS, recall

  1. the instructor poses a thought-provoking question
  2. students think on their own
  3. students pair with neighbors to discuss their thoughts
  4. students and the instructor share the thoughts in a class-wide discussion

Let’s compare that to a good episode of PI in a discussion-based class. That’s one where every choice in the question is plausible and the goal of the activity is to get students to pick a prompt they’re comfortable with and explain it to their neighbors, citing evidence when possible. That is, there’s no “convincing your neighbor you’re right” because all the answers are right. Okay, so here’s what PI looks like:

  1. the instructor poses a thought-provoking question with 2-5 conversation starters for choices
  2. students vote using their clickers
  3. instructor says, “Hmm, really interesting to see you choosing different prompts. Please turn to your neighbor, tell them why you picked the choice you made. Support your choice with evidence from the readings.”
  4. the students pair and discuss
  5. there is NOT a 2nd vote – no one is expected to change their minds. The discussion was a chance to summon the evidence and practice putting together an argument.
  6. the instructor leads a lively, class-wide discussion drawing out the students’ evidence for each of the prompts

My colleague and historian Heidi Keller-Lapp adds one more step. When she’s preparing the class, she adds a slide after the PI question with a list of all the points she wants to cover via the PI question. After step 6, Heidi

  7. flips to the discussion points slide, goes down the list, “Yep, we talked about this and this and this and, oh, we didn’t mention this. Okay, remember…. Good, and this and this. Great! Terrific discussion, everyone.” This can take 20 minutes in Heidi’s class. That’s 20 glorious minutes of students thinking critically and making arguments with evidence.

What makes peer instruction effective?

There are a couple of necessary, though not sufficient, components of effective peer instruction.

  • students must think on their own and commit to an idea. That’s critical for learning because they need something to talk about, something to contribute to the “turn to your neighbor” and something to XOR their neighbor’s thinking against.
  • students engage more when they know they’re accountable. Participation points – points for clicking – are a good way to support this. A few points go a long way.

And that’s what is often missing in TPS, unless the instructor commands enough presence and respect to get all the students to engage every time. In TPS,

  • students don’t need to commit: they can look at the prompts and think, “Hmm, a couple of those look plausible,” wait until their neighbor starts talking, and then respond, “Yeah, that’s totally what I was thinking, too.” They can get away with it.
  • so what if a student doesn’t pick a prompt? What’s the instructor going to do about it? Cold-call on students? That’s not TPS anymore; it’s anxiety-inducing, imposter-syndrome-reinforcing arm-twisting. Ask for students to raise their hands? Sure, and the same 3 students answer (and I don’t have to talk, ever, if I don’t want to.)

Introducing TPS/cards

Okay, back to Vis Arts. When we brainstormed how to do peer instruction without clickers (What’s that you say, use ABCD voting cards? Two words: card fade. And see step 5 below), we stumbled onto a variation of TPS that, I believe, resolves these weaknesses by borrowing from PI:

  1. the instructor poses a thought-provoking question. It can be open-ended. It can be multiple-choice. It can even be “Draw a picture of…” or “Sketch a graph of…” Whatever the instructor decides will provoke the best discussion.
  2. students think on their own and write their thoughts on 3 x 5 inch index cards that the instructor distributes every day. By writing on the card, students commit to one of the choices. (Bonus: writing!)
  3. students pair with neighbors to discuss their thoughts, referring to their index cards as necessary
  4. students and the instructor share the thoughts in a class-wide discussion
  5. at the end of class, students hand in their index cards (after writing their names on them). The instructor uses these cards to award participation points. Yes, this takes time that scales with the size of the class. But does flipping through a stack of cards and putting tally marks on a class list really take that much longer than syncing your clicker software with the course management system? (And don’t forget: there’s no frustrating, pull-your-hair-out battle with freakin’ Blackboard at the beginning of the term. Arrggghh!) A tallying sketch follows this list.
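Here’s that tallying sketch: a minimal Python version of the flip-through-the-stack step, with made-up names and meetings.

```python
# Count how many cards each student handed in across the term.
# One list of names per class meeting, transcribed from the handed-in cards.
from collections import Counter

cards_by_meeting = [
    ["Ana", "Ben", "Chris"],         # meeting 1
    ["Ana", "Chris", "Dev", "Ben"],  # meeting 2
]

points = Counter(name for meeting in cards_by_meeting for name in meeting)
for student, n in points.most_common():
    print(f"{student}: {n} participation point(s)")
```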

Super Bonus: Education Research

As with any experimental teaching and learning activity, we need to ask, “But did it work?” We have a post-course student survey that probes deeply how students perceived and learned from peer instruction, and we’re running essentially the same survey in these TPS/cards classes with “peer instruction” search-and-replaced with “think-pair-share.” I’m really excited to see how the courses taught with TPS/cards turn out.

Double Super Bonus

The instructor kept all the index cards from her classes, in chronological order. She’s going to run some content analysis on the students’ thoughts to see if, for example, their thinking grew more sophisticated and expert-like as the course progressed. An awesome teaching-as-research project!
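If you wanted a quick, quantitative first pass before doing proper qualitative coding, here’s a naive Python sketch using type-token ratio as a (very crude) proxy for how varied the language on the cards is over time. The card texts are invented; a real content analysis would code the responses properly.

```python
# Type-token ratio (distinct words / total words) per meeting, a crude
# first proxy for vocabulary growth. Card texts here are invented.
def type_token_ratio(text):
    words = text.lower().split()
    return len(set(words)) / len(words) if words else 0.0

cards_in_order = [
    "I think the painting is nice and I like the colors",
    "the asymmetric composition and muted palette create deliberate tension",
]

for meeting, text in enumerate(cards_in_order, start=1):
    print(f"meeting {meeting}: TTR = {type_token_ratio(text):.2f}")
```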

Your thoughts

What do you think? Have I missed something critical about PI or added something harmful to TPS? Is this something school teachers have been doing for decades that HigherEd is only now re-inventing? What research question would you try to answer if you had a record of what your students were thinking throughout the term? All ideas welcome!
