This report was prepared for Folkstone: Evaluation Anthropology under contract number 26-003 by Youth Development Evaluation Alliance (currently The Strategy Group). The content of the publication does not necessarily reflect the views or policies of the New Mexico Public Education Department, nor does the mention of trade names, commercial products, or organizations imply endorsement by the New Mexico Public Education Department.
This report is in the public domain. While permission to reprint this publication is not necessary, it should be cited as: Moore, W. P., McGregor, H. A., & Newbill, S. L. (2009). Investing in teacher capacity: Results from the impact evaluation of the New Mexico Math-Science Partnership. Kansas City, MO: Youth Development Evaluation Alliance.
Does the provision of content-specific mathematics professional development lead to changes in classroom practice that affect the mathematics achievement of students? What levels of duration, intensity, and focus must in-service professional development reach before it begins to influence classroom practice enough for changes in achievement to be detected?
A number of studies have documented the relationship between the quality of instruction and student outcomes. Teacher training and experience have been found to be associated with student achievement, and in-service teacher professional development appears to positively influence the achievement outcomes of elementary school students. Researchers have noted a distinct curvilinear relationship between teacher experience and student achievement: achievement increases as teacher experience increases during the first 2 to 5 years, supplemented by in-service training, and then levels off (Ferguson, 1991; Darling-Hammond, 2000; Rockoff, 2003). A recent rigorous review of studies of teacher professional development (Yoon, Duncan, Lee, Scarloss, & Shapley, 2007) found that teachers who participate in more than 14 hours of professional development have a significant and positive impact on student achievement. After reviewing nine rigorous studies of the effect of teacher professional development on student achievement, Yoon et al. concluded that students whose teachers received substantial professional development saw improvements of 21 percentile points in achievement (p. 1).
This report describes the results of an external impact evaluation of the New Mexico Math‐Science Partnerships (MSP) program after two years of implementation. The results of analyses of New Mexico state mathematics assessment scores, teacher professional development logs, content assessments, and student and teacher background and demographic information revealed statistically significant but practically non‐significant improvements in latent growth trajectories of middle school students’ mathematics achievement after two years of exposure to teachers participating in the MSP program.
The findings suggest that 1) the amount, type, or focus of professional development support received by MSP teachers was not sufficient to bring about meaningful changes in instructional practice that would have a substantial and policy-relevant impact on student mathematics achievement; and 2) multiple advanced statistical methods yielded a coherent and consistent message: student achievement did not change dramatically over the period studied by the evaluators, even with the provision of professional development targeted at mathematics content-specific knowledge and pedagogy. While several key hypotheses could not be tested because of missing data, the two most important evaluation questions were answered: Does increased professional development make a difference in the mathematics outcomes of students? And does exposure to MSP teachers provide an incremental advantage in mathematics performance beyond that of non-MSP-participating teachers? In both cases, the results from this study provide no compelling evidence that MSP has dramatically changed the mathematics achievement trajectories of students. The estimated mean rate of change was 0.09 units per year, or about a tenth of a proficiency category a year. While this rate of change was statistically significant, it does not appear to be practically significant. Whether one considers the most policy-relevant outcome (moving students into the proficient-and-above category) or the movement of students into the next highest proficiency category, MSP does not appear to have meaningfully improved the odds that students will progress beyond the trajectories of students who were not exposed to MSP.
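For readers less familiar with latent growth modeling, the following is a minimal sketch of the kind of linear growth model underlying such estimates; the notation is illustrative only and the evaluators' exact specification (time coding, covariates, estimation method) may differ:

\[
y_{ti} = \eta_{0i} + \eta_{1i}\,t + \varepsilon_{ti},
\qquad
\eta_{0i} = \mu_0 + \zeta_{0i},
\qquad
\eta_{1i} = \mu_1 + \zeta_{1i},
\]

where \(y_{ti}\) is student \(i\)'s mathematics score at occasion \(t\), \(\eta_{0i}\) and \(\eta_{1i}\) are the student-specific intercept and slope, and \(\mu_1\) is the mean growth rate across students. Read this way, a mean rate of change of about 0.09 proficiency-category units per year implies an expected gain of roughly 0.18 units over the two-year study period, which is consistent with the characterization of the improvement as statistically significant but practically non-significant.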
The evaluation team notes that the complex analyses used to discern effects of professional development on student outcomes required a commitment by data owners to fulfill a highly structured and extensive data request, one that spanned many years, many teachers, many schools, and many students. A relevant observation from the evaluation team's experience on this project is that the New Mexico Public Education Department lacked the capacity and resources necessary to provide the extensive, high-quality data sets required, and both data quality and the fulfillment of data requests suffered as a result. Data files received by the evaluation team contained duplicate student records, missing data elements, and coding errors, contributing to significant delays in initiating analyses. Entire years of student data were never provided, and some data requests went unfulfilled for months. The original scope of this evaluation included three MSP partnerships; only one was responsive to the external evaluator's calls to participate. The results obtained in this evaluation would have been more robust, compelling, and meaningful had all three partnerships contributed to the evaluation, had the NM PED been consistently responsive and able to fulfill data requests, and had it actively supported and strongly urged the participation of the MSP partnerships in this important evaluation of the impact of professional development on student mathematics achievement.