IPR Scholars Led Workshop on R&D in STEM Education
Larry Hedges and Elizabeth Tipton trained researchers in the latest evaluation methods
IPR statistician Larry Hedges leads a training during the Improving Evaluations of R&D in STEM Education Summer Institute.
How can researchers conduct small education studies when large-scale experimental methods are difficult to implement?
This was one of the key questions IPR statisticians Larry Hedges and Elizabeth Tipton addressed during the Improving Evaluations of R&D in STEM Education Summer Institute, held July 11–15, 2022. Funded by the National Science Foundation, the five-day workshop on Northwestern’s Evanston campus focused on methods for conducting small studies and drew 24 scholars from across the country.
This was the second year Hedges and Tipton hosted the institute, but the first time the workshop met in person since the pandemic began. Sessions introduced causal inference methods, including randomized experiments and matching designs that control for selection bias. Causal inference methods are often used in medical and education policy research, but they are less common in the small studies typical of STEM education research.
“This workshop addresses a neglected area of research design,” Hedges said. “For many years we have conducted a workshop on the design and analysis of large-scale randomized experiments, but there has been less emphasis on the design of rigorous small-scale evaluation studies.”
Hedges and Tipton run the Statistics for Evidence-Based Policy and Practice (STEPP) Center, whose mission is to “improve lives through methodological innovation in research.” Part of that mission involves training researchers and making methodological research and knowledge more accessible to scholars.
“In order to improve the information available to education decision-makers, we need for researchers to be able to conduct high quality studies,” Tipton said. “Our workshops provide the scaffolding and community that researchers need to learn these state-of-the-art methods.”
Workshop attendees collaborated on group projects and joined an online community of scholars that will continue after the workshop. Hedges and Tipton explained that this community of researchers can offer mutual support and collaboration as members carry out their evaluation work.
Motunrayo Olaniyan, a senior research associate at the Hope Center for College, Community, and Justice at Temple University, said she attended the institute to refine her research skills and guide her work as she prepares for an upcoming STEM intervention at her own center. Olaniyan’s research focuses on racial equity in higher education, including factors that influence students’ likelihood of enrolling in and completing college.
“The training completely reshaped my understanding of how to conduct randomized controlled trials (RCTs),” Olaniyan said. “I developed an understanding of how to estimate the minimum detectable effect size for each type of randomized design. I also learned the value of utilizing a quasi-experimental design and statistical adjustments for when randomization is not feasible.”
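To give a sense of the calculation Olaniyan describes, the sketch below shows a standard approximation of the minimum detectable effect size (MDES) for a simple two-arm, individually randomized trial with no covariates. It is a minimal illustration rather than workshop material, and the sample size, significance level, and power shown are illustrative assumptions.

```python
# A minimal sketch (not from the workshop materials) of a standard
# approximation for the minimum detectable effect size (MDES) in a
# two-arm, individually randomized trial with no covariates.
# The sample size, significance level, and power are illustrative assumptions.
from scipy.stats import t


def mdes(n_total, p_treated=0.5, alpha=0.05, power=0.80):
    """Approximate MDES in standard-deviation units (two-tailed test)."""
    df = n_total - 2
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    std_error = (1.0 / (p_treated * (1 - p_treated) * n_total)) ** 0.5
    return multiplier * std_error


# Example: 60 students split evenly between treatment and control
print(round(mdes(60), 2))  # roughly 0.74 SD with these assumptions
```

With only 60 students, effects smaller than about three-quarters of a standard deviation would likely go undetected, which is why careful design choices matter so much in the small studies the workshop targeted.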
For Sutandra Sarkar, an instructor and doctoral candidate at Georgia State University, the workshop offered the opportunity to learn how to design a research study and to collect and analyze data for small efficacy studies. Sarkar monitors the performance of a math tutoring lab at the university, and she plans to use insights from the training to design and scale studies evaluating student success in the program. The group work also allowed her to build a network of researchers from various fields and backgrounds.
“I found the group work to be valuable in skill building and getting to know some of the participants more closely,” Sarkar said. “The presentations from each group were very informative and helped us learn the concepts of effect size, threats to validity, and the design selection process in a meaningful way.”
Olaniyan noted that she also benefited from meeting researchers from various backgrounds, and she highly recommends the training for scholars seeking to engage in rigorous discussion of RCTs.
“I found that the institute helped me better navigate my existing evaluations,” Olaniyan said. “Meeting in person also allowed me to meet researchers from different fields, which facilitated idea sharing to generate new topics for future evaluations.”
Larry Hedges is Board of Trustees Professor of Statistics and Education and Social Policy. Elizabeth Tipton is associate professor of statistics. Both are IPR fellows.
Photo credit: IPR
Published: September 7, 2022.