To the Editor:
In his Commentary titled “The What Works Clearinghouse: Time for a Fresh Start” (Dec. 19, 2007), Robert E. Slavin gives specific data on curricular programs he feels got a “pass” from the federal clearinghouse based on poor research. If Mr. Slavin can give such data against what appear to be more traditionally oriented programs, why doesn’t he offer similar data to support his assertion that, with respect to the clearinghouse’s review of math programs, “instructional-process programs excluded by the clearinghouse have the strongest positive effects in the most rigorous evaluations”?
I keep hearing that such studies exist, but I can never get their details. Mr. Slavin’s effort to influence a thinking reader gets lost in the shuffle of his own selective use of data.
In fact, why doesn’t he write about those unrewarded, thoroughly researched instructional-process programs in greater detail, rather than pointing out bad programs that got good reports? Doing so would give better “advertising” to the ones Mr. Slavin likes.
As a retired elementary school principal and high school math teacher, I get really tired of the “figures don’t lie; liars figure” routine that those in the trenches have to decipher from those outside the schools. And people wonder why teachers don’t use “researched” methods.
A version of this article appeared in the January 09, 2008 edition of Education Week as Were Data Used Selectively In Clearinghouse Essay?