To the Editor:
As a former director of one of the National Science Foundation’s Local Systemic Change Through Teacher Enhancement programs, I am writing to clarify and expand upon several key points regarding the Horizon Research Inc. evaluation study of the NSF program (“NSF Educator-Training Effort Seen as Helpful,” March 8, 2006).
Contrary to the comments you quote from Michael Marder, a co-director of the U-Teach program at the University of Texas at Austin, the evaluation directly addressed the impact of professional development on teachers’ content knowledge and pedagogy. It included more than 1,600 observations of classroom practice. Evaluators for each project rated lessons of a random sample of teachers in their project, using a common observation protocol.
These cross-site, independent, standardized classroom observations showed that the program’s professional development produced improvements in some key instructional areas, including the quality of teacher questioning and the inclusion of sense-making in lessons.
The study also produced valuable tools for practitioners: the classroom-observation protocol discussed above, as well as an observation protocol for determining the quality of professional-development sessions. These protocols are useful both in establishing a common vision of quality instruction and professional development and as formative-assessment tools to improve the quality of instruction and professional development.
The NSF program’s cross-site evaluation study has produced a wealth of information, as well as tools useful to those of us who design, implement, and evaluate professional-development programs. The complete study, available on Horizon Research’s Web site, is worth careful review.
Diane J. Briars
Senior Program Officer
Mathematics and Science Education
Pittsburgh Public Schools
A version of this article appeared in the March 29, 2006 edition of Education Week as NSF Program Evaluation Reaped Many Benefits