Published Online: August 30, 2005


How Do We Know When to Believe Testing Data?


To the Editor:

Your July 27, 2005, issue has two front-page articles that seem to contradict one another. In “Efforts Seek Better Data on Graduates,” you report that statistics on graduation rates reported by different states are unreliable. In one instance, a state reported a graduation rate of 97 percent, but researchers estimated a rate closer to 64 percent. This is a glaring discrepancy that suggests outright cheating.

Yet, in another article, "South Posts Big Gains on Long-Term NAEP in Reading and Math," you seem to accept as fact that “a generation of reform measures in the Southeastern states appears to be paying off in higher student achievement.”

How do you know? Surely you must realize that the No Child Left Behind Act has imposed tremendous pressures on educators across the country. Principals, in particular, are in danger of losing their jobs when test scores do not improve. Many, if not all, schools are therefore teaching to the test—or actually teaching the test itself—from September to May.

Are the higher scores reported by the National Assessment of Educational Progress “proof that No Child Left Behind is working,” to quote U.S. Secretary of Education Margaret Spellings? Or is this another cheating scandal like the “Texas miracle”?

Linda Mele Johnson
Long Beach, Calif.

