For almost a decade, researchers at the University of Michigan (U-M) have been working on ways to help our students reach their fullest potential by combining social psychological interventions with cutting-edge educational technology. Our new paper, Real-World Effectiveness of a Social-Psychological Intervention Translated from Controlled Trials to Classrooms, published this week in npj Science of Learning (a Nature journal), summarizes one strand of our collective efforts.
In 2013, as a doctoral student in psychology at U-M, Patricia Chen applied her expertise in psychological theory to design precise interventions that promote goal achievement in education, work, and health. In partnership with Dr. Brenda Gunderson – the Golden Apple Award-winning instructor of Stats 250, the single largest course at the University of Michigan – Chen began testing early versions of a metacognitive intervention that helped students choose which resources they would use to study for an exam, reflect on why those resources would help them, and make a plan for using them.
Meanwhile, U-M physics professor Tim McKay (now Associate Dean for Undergraduate Education in the College of Literature, Science, and the Arts at U-M) was overseeing the development of ECoach, a tool that uses theory and techniques inspired by health behavior interventions to support student success, including expert-tailored messages and just-in-time interventions. ECoach combines campus data and instructor insights to help students succeed in courses that are typically large and impersonal, but necessary for advancing toward STEM careers. ECoach is one of a suite of technologies developed by the U-M Center for Academic Innovation.
These parallel pursuits led to a natural collaboration. Dr. Chen showed that a simple metacognitive intervention could help students better prepare for exams, boosting their performance by an average of a third of a letter grade (Chen et al., 2017, Psychological Science). With this research in hand, Ben Hayward (lead developer) and Holly Derry (lead behavioral scientist) of the ECoach team partnered closely with her to build a user-friendly version of her intervention, called the Exam Playbook, into the ECoach platform (Huberth, Chen et al., 2015, PLOS ONE).
Since then, Chen and the ECoach team have deployed the intervention in 7 different courses that use high-stakes exams as an assessment mechanism. Each course provides its own unique context in which to understand how the intervention works. Some instructors offer extra credit for Exam Playbook completion, and some don't. Some courses offer Playbooks for two, three, or four exams. All of this reflects the messy nature of real-world (as opposed to lab) research, and why it's so hard and so important to conduct rigorous effectiveness research in these varied, natural contexts. Our new paper reports the results of this cross-domain, multi-year scale-up, analyzing data from more than 12,000 students enrolled in large STEM courses: Introductory Statistics, Introductory Biology, General Chemistry, General Physics, Introductory Programming, and Introductory Economics.
The Exam Playbook asks students to select, from a list of resources, the ones they plan to use to study for an upcoming exam, and to write out why they believe each resource will be useful. Across all classes, an average of 44% of students engaged with the Exam Playbook to prepare for at least one exam in the course. After controlling for standardized test scores, students who engaged with the intervention scored an average of 2 points higher on their exams. The effect was positive in 13 of the 14 classes studied (one semester of the Physics course did not show a positive effect), and was particularly strong in Statistics, where users' exam grades averaged almost 6 points higher than those of peers who chose not to use the tool. Students who used the tool for one exam and then skipped it for the next scored, on average, just under 2 points worse than students who used it again – a sign of how valuable it can be to pause and make an intentional plan for approaching high-stakes exams.
Key to our collaboration has been our shared goal: combining research, data, and technology to support students at scale. We have worked closely to bring together our respective expertise across teaching, research, motivation theory, behavioral science, and software development. This cross-functional collaboration allows us to tackle messy, real-world problems identified by teachers, create research- and theory-based interventions to address them, and build cutting-edge technology that delivers those interventions to the students who need them most, at the right time.