'No Effects' Reading Study Was Poorly Designed

To the Editor:

Once again, Education Week has published a summary of a study that, with one small exception, suggests “no effects” for programs designed to improve student engagement and depth of reading comprehension ("Study of Reading Programs Finds Little Proof of Gains in Student Comprehension," May 12, 2010). This follows in a long line of “no effects” studies coming from the federal Institute of Education Sciences ("'No Effects' Studies Raising Eyebrows," April 1, 2009).

These reports are frustrating and misleading. A null-hypothesis test in statistics permits a probabilistic conclusion only when the null is rejected; a nonsignificant result does not demonstrate that a program has no effect. Unfortunately, the data from these null-result experiments are often reported with the implication that the programs are ineffective, when it is more likely that the design and measurements were poorly controlled and failed to provide a genuine test of the reading programs. There are many ways to obtain no results in large-scale “quasi-scientific” experiments.

Suppose we want to examine the effectiveness of the drug Prozac. Half the subjects are assigned to a treatment condition in which they take Prozac daily, and half to a placebo condition. If we find no significant difference between the experimental and the control group, can we conclude that Prozac is not effective in treating depression? What if we also learn that only 30 percent of the subjects in the treatment group actually took Prozac, and that several subjects in the control condition took Prozac?

Such a study clearly would not allow us to make conclusions about the effectiveness of the drug. Yet this example of “research” is almost a direct parallel to the latest large-scale studies of reading-program effectiveness.
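The dilution described in this hypothetical can be illustrated with a short simulation. The numbers below (a true effect of 1 point, 30 percent treatment-group compliance, 10 percent control-group crossover) are assumptions chosen for illustration, not figures from any actual study:

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 1.0   # assumed improvement for subjects who actually take the drug
N = 500             # subjects per group

def outcome(took_drug):
    """Simulated improvement score: random noise plus the effect if the drug was taken."""
    return random.gauss(0, 2) + (TRUE_EFFECT if took_drug else 0.0)

# Ideal experiment: everyone complies with their assignment.
ideal_treat = [outcome(True) for _ in range(N)]
ideal_ctrl = [outcome(False) for _ in range(N)]

# Poorly controlled experiment: only 30% of the treatment group takes the
# drug, while 10% of the control group obtains it anyway (crossover).
sloppy_treat = [outcome(random.random() < 0.30) for _ in range(N)]
sloppy_ctrl = [outcome(random.random() < 0.10) for _ in range(N)]

def group_diff(a, b):
    return statistics.mean(a) - statistics.mean(b)

# The ideal comparison recovers roughly the full 1-point effect; the diluted
# comparison is expected to show only about (0.30 - 0.10) * 1.0 = 0.2 points,
# which is easily mistaken for "no effect."
print(f"ideal difference:   {group_diff(ideal_treat, ideal_ctrl):.2f}")
print(f"diluted difference: {group_diff(sloppy_treat, sloppy_ctrl):.2f}")
```

Even with a genuinely effective treatment, the measurable group difference shrinks toward zero when few assigned subjects receive the treatment and some controls receive it anyway.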

Your newspaper must stop reporting poorly designed, null-result studies with inflammatory headlines such as "Supplementary Reading Programs Found Ineffective" (May 13, 2009). These headlines should instead read “Millions of Federal Dollars Wasted on Poorly Designed Study of Reading.”

Carol M. Santa
John L. Santa
Kalispell, Mont.

Carol M. Santa is the founder and co-owner of Project CRISS, one of the reading programs included in the Institute of Education Sciences study, and a past president of the International Reading Association.

Vol. 29, Issue 35, Page 37

Published in Print: June 16, 2010, as 'No Effects' Reading Study Was Poorly Designed