
IPR Researchers Lead Summer Workshops for Education Scholars

Larry Hedges and Beth Tipton share latest education research methods




Attendees from the Research Training Institute on Cluster-Randomized Trials workshop gather for a photo on July 25, 2024. 

The Statistics for Evidence-Based Policy and Practice (STEPP) Center was founded in 2019 to bridge the gap between statisticians, researchers, and policymakers and better translate scientific evidence to improve education policy. Part of STEPP’s mission is to share rigorous research methods to improve the quality of scholarship on education interventions.

Over the summer, the STEPP Center ran two workshops to train education researchers in contemporary evaluation methods. More than 60 researchers from around the country attended the workshops on Northwestern’s campus, led by IPR statisticians Larry Hedges and Elizabeth Tipton, along with evaluation researcher Eric Hedberg.

Improving Evaluations of R&D in STEM Education 2024 Summer Institute

Between July 8 and July 12, STEM education researchers who attended the Improving Evaluations of R&D in STEM Education Summer Institute immersed themselves in a variety of modern research methods and explored how they might use them in their own studies. The National Science Foundation funded the workshop, which was led by Hedges and Hedberg.

The workshop focused specifically on smaller studies, such as those involving 10 classrooms, in which large-scale experimental methods become difficult to implement. Sessions examined methods for inferring cause and effect, confounding (when other factors affect the results), and how to ensure that research is valid. They also covered ways to reduce selection bias, as well as pretest, posttest, and difference-in-differences study designs, with a focus on their use in small field experiments and pilot studies.
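The difference-in-differences logic mentioned above can be shown with a few lines of code. This is an illustrative sketch with made-up numbers, not material from the workshop: the estimate subtracts the comparison group's pre-to-post change from the treatment group's change, removing trends common to both groups.

```python
# Minimal difference-in-differences sketch using hypothetical
# pretest/posttest classroom means (illustrative values only).

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Return the difference-in-differences estimate of the effect:
    the treatment group's change minus the comparison group's change."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical achievement-test means before and after an intervention
effect = diff_in_diff(treat_pre=50.0, treat_post=58.0,
                      control_pre=51.0, control_post=54.0)
print(effect)  # 5.0: treated classrooms gained 5 points beyond the comparison trend
```

Because both groups' changes are differenced, any improvement that would have happened anyway (for example, normal growth over a school year) is netted out, which is why the design suits small field studies without randomization.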

Educational researchers also discussed the concepts of trade-offs, or how researchers balance competing goals in study design; design sensitivity, or how well a study can detect an effect; effect size reporting, or showing the strength of a study’s results; and questionable research practices that might make a study invalid. Workshop attendees explored these topics in lectures, group projects, individual assignments, and discussions in an online community, which will continue after the workshop to help researchers implement the new methods they learned.
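Effect size reporting, one of the topics above, is often done with a standardized mean difference such as Hedges' g (named for the same Larry Hedges), which corrects Cohen's d for small-sample bias. The sketch below uses the standard formulas with hypothetical numbers; the function name and inputs are illustrative, not drawn from the workshop materials.

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Hedges' g): Cohen's d scaled by a
    small-sample bias-correction factor, as commonly reported in
    education research and meta-analysis."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    # Bias-correction factor J ~= 1 - 3 / (4 * df - 1), df = n_t + n_c - 2
    j = 1 - 3 / (4 * (n_t + n_c - 2) - 1)
    return d * j

# Hypothetical study: 20 students per group, 5-point mean difference, SD of 10
g = hedges_g(mean_t=75, mean_c=70, sd_t=10, sd_c=10, n_t=20, n_c=20)
print(round(g, 3))  # 0.49: d = 0.5 shrunk slightly by the small-sample correction
```

Reporting g alongside raw means lets readers compare the strength of results across studies that use different outcome measures.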

“This training has allowed me to think about applied research differently. Special education rarely utilizes [randomized controlled trial] research, as interventions are so individualized and our populations are so small that getting enough power is difficult,” participant Amanda Duncan said. “However, Dr. Hedges and Hedberg gave me a different perspective as to how this can be achieved. Thanks to this workshop, I will be applying for funding to conduct a school-based RCT shortly.”

Hedberg stressed how important it is to make these research concepts available to everyone working in education research to build capacity and shed light on what’s happening in schools nationwide.

“I am proud to be working with Drs. Hedges and Tipton on workshops like this that educate a wider slice of the field, and I am confident that efforts such as this only add to the potential to strengthen our knowledge base around education practices,” Hedberg said.

Research Training Institute on Cluster-Randomized Trials

To prepare scholars to better evaluate the impact of education policies and interventions, Hedges and Tipton co-organized a two-week in-depth training institute, supported by a grant from the National Center for Education Research in the Institute of Education Science of the U.S. Department of Education. 

The institute ran from July 15 to July 25 and focused on training researchers to plan, carry out, and analyze data from cluster-randomized trials. These trials assign whole groups of people, such as classrooms or schools, to different conditions rather than assigning individuals, which allows researchers to account for group effects when examining how a given educational intervention affects student achievement.
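The two core ideas here, randomizing whole clusters and accounting for group effects, can be sketched in a few lines. This is an illustrative example, not code from the institute: the function names are hypothetical, but the design-effect formula DEFF = 1 + (m - 1) × ICC is the standard variance-inflation adjustment for cluster designs.

```python
import random

def assign_clusters(clusters, seed=0):
    """Randomly assign whole clusters (e.g., schools) to treatment or
    control, splitting the clusters evenly between the two arms."""
    rng = random.Random(seed)
    shuffled = clusters[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

def design_effect(avg_cluster_size, icc):
    """Variance inflation from clustering: DEFF = 1 + (m - 1) * ICC,
    where m is the average cluster size and ICC is the intraclass
    correlation (how similar students within the same school are)."""
    return 1 + (avg_cluster_size - 1) * icc

schools = [f"school_{i}" for i in range(10)]
arms = assign_clusters(schools)

# With 25 students per school and an ICC of 0.2, estimates are about
# 5.8 times noisier than if students had been randomized individually.
deff = design_effect(avg_cluster_size=25, icc=0.2)
```

The design effect is why cluster-randomized trials need many more students than individually randomized ones to detect the same effect, a central planning consideration the institute covers.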

“Very few education schools offer full courses in the design and analysis of randomized trials,” Tipton said. “Since 2007, this workshop has allowed us to provide education researchers with this knowledge, while also helping to build a community of researchers who can support one another in conducting such evaluations.”

By the end of the training, participants had an in-depth understanding of how randomized trials work and how they enable researchers to assess cause and effect. They were able to explain how populations in education are organized within levels (for example, students within classrooms within schools) and how that affects research. They also learned how to choose the best ways to measure results, check if a program is being carried out as planned, and track important details about the process. With this knowledge, participants were ready to implement a cluster-randomized trial with confidence.

“The workshop provided valuable information for every step in the process of directing a CRT or RBD evaluation,” said Dan Cullinan, who attended the institute. “Just in the few months since completing the workshop, I have drawn on the lessons and resources numerous times in my project work.”

Above all, Hedges said that the workshop allows alumni of the institute to apply the concepts they learned in their own fields and to spread the ideas throughout their networks.

“Many of the alumni of the institute have designed their own field experiments and obtained the funding to carry them out,” Hedges said. “The alumni also tell us that they frequently use what they have learned in the institute in their work, in classes they teach, and in their discussions with colleagues. For many of the participants, the institute has helped them broaden their professional skills and provide new opportunities to pursue research.”

Find out more about next year's Summer Research Training Institute on Cluster-Randomized Trials, scheduled for July 14–24.

Larry Hedges is Board of Trustees Professor of Statistics and Education and Social Policy. Elizabeth Tipton is Professor of Statistics and Data Science, and Education and Social Policy (by courtesy). Both are IPR fellows.

Photo credit: Lily Schaffer

Published: December 2, 2024.