First, Mazur quickly summarizes the key concepts in the lesson that students were to learn. Then, he poses a question to the class that forces them to think about the information and apply it in a whole new way. Students consider their answer, then respond using a personal response system.
Next, Mazur has students discuss their answers with their peers who are sitting nearby, with students taking turns defending their choices and the reasoning behind them. Then, he poses the question again and has students re-answer—and he finds the percentage of correct answers nearly always increases the second time around, once students have had a chance to discuss the problem with their peers. Finally, he shares the correct answer and the explanation.
The discussion part is key to the success of Mazur’s strategy. It allows the better students in the class to help teach the others—and everyone benefits in the process, he said, citing research from Carnegie Mellon University and elsewhere that suggests peer instruction can lead to better retention.
Mazur demonstrated the process with conference attendees, using a typical problem he might pose to his physics students. After briefly explaining why metal expands when it's heated (the atoms move more vigorously and spread out because they need more space in which to move), he asked attendees to imagine a rectangular piece of metal with a circular hole in the middle: Would the diameter of the hole increase, decrease, or stay the same if the metal were heated uniformly?
Participants logged their responses, and then they huddled to confer with their colleagues. Those who thought the hole would shrink explained that the atoms in the metal around the edge of the hole would want to move away from the atoms in the rectangle’s interior, thereby contracting the hole. But others correctly argued that the atoms around the hole’s edge would not move toward the hole’s center, because that would create even more crowding among themselves; instead, those atoms would move away from the hole’s center in an attempt to create more space for themselves, thereby expanding the circle’s diameter.
When Mazur posed the question a second time, the number of correct responses nearly doubled as a result of this discussion. And when he shared the correct answer, many attendees affirmed they’ll remember this concept much more vividly for having participated in the discussion.
If more than 70 percent of the class gets a question right initially, Mazur moves directly to the answer and the explanation, then poses a more difficult question—because there’s no point in going through the peer instruction process if most students already understand the concept, he explained. Similarly, if fewer than 30 percent of the class is right the first time, the correct answer won’t spread throughout the classroom—and so he revisits the concept, then tries an easier question.
To prepare for class, Mazur develops a set of questions at various levels of difficulty, with the goal of having 30 percent to 70 percent of the class able to answer each question correctly. But there are several challenges to implementing this strategy effectively, he acknowledged—including how to design good questions, optimize the discussion, and manage class time.
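The decision rule Mazur describes can be summarized in a few lines of code. This is a hypothetical sketch of the logic as reported above, not anything from Learning Catalytics; the function name and return strings are illustrative.

```python
# Sketch of the decision rule described above: peer discussion runs
# only when 30-70 percent of the class answers correctly on the first try.
def next_step(fraction_correct: float) -> str:
    """Decide what to do after the first round of responses."""
    if fraction_correct > 0.70:
        # Most students already understand: explain, then ask a harder question
        return "explain and pose a harder question"
    if fraction_correct < 0.30:
        # Too few correct answers to spread through discussion:
        # revisit the concept, then try an easier question
        return "revisit the concept and pose an easier question"
    # In the 30-70 percent band, peer discussion is most productive
    return "run peer discussion and re-poll"

print(next_step(0.85))  # explain and pose a harder question
print(next_step(0.20))  # revisit the concept and pose an easier question
print(next_step(0.55))  # run peer discussion and re-poll
```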
To address these challenges, Mazur and two Harvard colleagues have developed a software-based system called Learning Catalytics.
The software uses intelligent algorithms and data analytics to improve the quality of questions that instructors can pose. It also helps instructors pair students who gave right and wrong answers during the discussion phase, and it helps instructors know when it’s time to wrap up each phase of the process and move on.
The software platform is device-agnostic, meaning students can log in with whatever mobile device they already own. And it supports many types of questions, so instructors aren’t limited to multiple-choice queries. For instance, responses to open-ended questions can be analyzed by creating a word cloud, and the system also supports numerical or ranking questions, as well as those involving diagrams—such as selecting a point on an image, or drawing a graph of a function.
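The word-cloud idea rests on a simple word-frequency count across students' open-ended responses. The sketch below illustrates that underlying computation; it is a hypothetical example, not Learning Catalytics code, and the stop-word list is a placeholder.

```python
from collections import Counter

# Minimal stop-word list for illustration only
STOPWORDS = {"the", "a", "of", "and", "to", "so", "because"}

def word_frequencies(responses):
    """Count word occurrences across open-ended responses --
    the raw data behind a word-cloud display."""
    words = []
    for text in responses:
        for word in text.lower().split():
            w = word.strip(".,!?")
            if w and w not in STOPWORDS:
                words.append(w)
    return Counter(words)

responses = [
    "The hole expands because the atoms need more space",
    "Atoms move apart, so the hole expands",
]
print(word_frequencies(responses).most_common(3))
```

A word-cloud renderer would then scale each word's display size by its count.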
In addition, the software shows the relative location of everyone in the room, so the instructor can see who gave a right or wrong response, as displayed by red or green icons on the instructor’s computer screen. This allows the instructor to pair students who gave right and wrong answers more easily, which facilitates the peer instruction process, Mazur said.
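The pairing step can be sketched as matching each student who answered correctly with one who did not. This is a hypothetical illustration of the idea, under the assumption of a simple one-to-one matching; it is not how Learning Catalytics actually implements it.

```python
# Hypothetical sketch: pair students who gave right answers with
# students who gave wrong answers for the discussion phase.
def pair_for_discussion(answers, correct_answer):
    """answers: dict mapping student name -> submitted answer.
    Returns (pairs, unpaired), where each pair holds one correct
    and one incorrect responder."""
    right = [s for s, a in answers.items() if a == correct_answer]
    wrong = [s for s, a in answers.items() if a != correct_answer]
    pairs = list(zip(right, wrong))
    # Students left over when the two groups are uneven
    # can discuss among themselves
    unpaired = right[len(pairs):] + wrong[len(pairs):]
    return pairs, unpaired

answers = {"Ana": "expands", "Ben": "shrinks", "Cam": "expands", "Dee": "same"}
pairs, unpaired = pair_for_discussion(answers, "expands")
print(pairs)     # [('Ana', 'Ben'), ('Cam', 'Dee')]
print(unpaired)  # []
```

A real system would also use seating locations (the red and green icons mentioned above) so that paired students are sitting near each other.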
Developed with funding from the National Science Foundation, Learning Catalytics is being used at Harvard as well as a large state school, a high school, and a medium-sized university, he said. It’s currently available by invitation only, but queries can be directed to Mazur at firstname.lastname@example.org.