In April, Annenberg Institute Postdoctoral Fellow Kathleen Lynch presented “Improving STEM Teacher Professional Development and Curriculum: A Meta-Analysis” as part of the Education Department Spring 2019 Speaker Series.
Professional development and curriculum materials constitute two major vehicles for instructional innovation and for improving student outcomes, Lynch noted. Following calls in the early 2000s by influential scholars for stronger research into the impact of educational interventions, research portfolios at the Institute of Education Sciences (IES) and the National Science Foundation (NSF) began to reflect a growing interest in research methods that allow causal inference and in using student outcomes as an indicator of program success. Dollars’ and scholars’ turn in this direction has produced a wealth of new studies over the past 15 years that permit rigorous empirical analyses linking program characteristics to student outcomes. Lynch presented a meta-analysis of preK-12 STEM instructional improvement programs, noting a serious need for better STEM instruction (only 34% of U.S. eighth graders perform at or above proficiency in STEM courses) and seeking to understand which content, formats, and activities lead to stronger student outcomes. Lynch set out to address two major questions: How effective are STEM teacher professional development and curriculum materials at improving student outcomes, and can we identify specific program characteristics associated with better or worse outcomes? This work is particularly timely, as the Every Student Succeeds Act requires that districts receiving Title I funds adopt “evidence-based interventions,” meaning programs and strategies proven effective in raising student achievement.
Lynch walked her audience through the meta-analysis, a systematic review of the research evidence that collected 95 qualifying studies yielding hundreds of effect sizes. To be included, a study had to examine the impact of STEM professional development or curriculum for PK-12 teachers, be published after 1988, report student outcome data, and meet a list of additional criteria. The analysis coded each program for duration, format, focus, activity, implementation guidance, lab experience, curriculum dosage, and proportion of curriculum replaced. Seventy-five percent of the studies examined a combination of professional development and curriculum; 64% focused on math and 36% on science. Professional development averaged 45 contact hours spread over four months to a year, and the most common activity type (42%) was solving problems and working through curriculum materials; others included observing demonstrations, developing lessons, reviewing sample student work, and reviewing teachers’ own students’ work.
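For readers unfamiliar with how a meta-analysis aggregates results, the core arithmetic pools each study’s effect size weighted by its precision. The sketch below is purely illustrative, not Lynch’s actual method or data: it shows a minimal fixed-effect, inverse-variance pooling in Python, with made-up effect sizes.

```python
def pooled_effect(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error.

    Each study contributes weight 1/variance, so more precise studies
    (larger samples, smaller variance) count more in the pooled estimate.
    """
    weights = [1.0 / v for v in variances]
    total_w = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total_w
    se = (1.0 / total_w) ** 0.5
    return mean, se

# Hypothetical standardized mean differences from three studies
effects = [0.21, 0.05, 0.30]
variances = [0.01, 0.04, 0.02]
mean, se = pooled_effect(effects, variances)
```

In practice, meta-analyses like the one described typically use random-effects models and moderator (meta-regression) analyses to relate program characteristics to effect sizes, which adds a between-study variance term to these weights.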
Lynch broke down the average overall impact of STEM instructional programs along four dimensions: assessment type (the largest impacts appearing on researcher-developed assessments, followed by state standardized assessments and other standardized tests), program type (the largest impact coming from a combination of professional development and curriculum), program focus (the largest impact from integrating technology, followed by content-specific formative assessment), and program format (the largest impact from implementation meetings). The analysis challenged two pieces of conventional wisdom: it found positive average effects on student achievement overall, and it found that more professional development contact hours do not by themselves lead to better student outcomes. It confirmed three others: programs that have teachers study curriculum materials, that build in teacher collaboration, and that improve teachers’ content and pedagogical knowledge and their understanding of how students learn are associated with stronger outcomes.
Audience members asked Dr. Lynch several follow-up questions, including whether contact hours (time reported spent on professional development activities such as workshops and coaching) or the type of work matters more for student outcomes. Lynch responded that the studies suggest coaching matters more because it targets a particular challenge in a particular school with particular teachers, and that the benefits appear to flow through studying the curriculum, collaborating with other teachers, and improving pedagogical and content knowledge. Dr. Lynch’s study had limitations, chiefly missing data: future studies will need detailed reporting on district and school context, research on the “typical” practices occurring on the ground, and experiments that directly manipulate the variables associated with improved student outcomes in the moderator analyses. In response to a question about cost, Dr. Lynch acknowledged that she could not code for cost due to a lack of information but noted that implementation cost data would be valuable in future studies. She would also like to gather data on teacher experience and years worked, in response to a question about controlling for those factors.
If you missed Dr. Lynch’s talk, you can watch it online.