Are Test Scores Rising?
Charges of ‘Attribution Confusion’ and ‘Commando Research’
To the Editor:
As a co-founder in 1983 of PACE, Policy Analysis for California Education, I want to make clear that the report on state test trends by Bruce Fuller, highlighted in his Education Week Commentary ("Are Test Scores Really Rising?" Oct. 13, 2004), was not published by PACE. Indeed, the Web site (http://pace.berkeley.edu) cited in Mr. Fuller’s essay hosts various views on this issue, with Mr. Fuller’s research presented as one of several perspectives. In addition, PACE takes no position on the various assertions in his Commentary.
As someone who has studied education policy for over 40 years, I believe it is premature to make definitive judgments on the success of the No Child Left Behind Act, a federal law that is not quite 3 years old. I participated in several studies of the Elementary and Secondary Education Act’s Title I program and co-authored a 16-year longitudinal summary of its implementation. Multifaceted laws like No Child Left Behind require comprehensive studies of many components before any conclusions can be reached on their impact.
Moreover, a valid analysis of the law’s test-score impact requires data from all states and several analytical approaches. For example, the achievement gap must be measured in a different manner from overall testing trends. Different states define pupils’ proficiency in different ways, making state test trends very difficult to interpret.
PACE is a nonpartisan, nonprofit organization. It is regrettable that confusion arose over Mr. Fuller’s dual role as a professor at the University of California, Berkeley, and a co-director of PACE, especially during a highly charged political debate over the law’s efficacy.
To the Editor:
Presidential elections appear irresistible to some academics who—dying to enter into the political fray—discard ideas of honest scholarship for the attempted quick press hit. In October 2000, a hopelessly bad study released under the RAND Corp. imprimatur tried for an 11th-hour strike at schools in Texas. This October sees Bruce Fuller, a co-director of PACE, Policy Analysis for California Education, producing an outrageously biased and misleading political statement against the No Child Left Behind Act with no apparent concern for its impact on his reputation or that of PACE.
Academics best influence policy debates when they become known for following certain scientific rules that guard against pure advocacy statements designed to pander to one political view or another. Mr. Fuller has abandoned this position in his rush to inject results into the election debates. The appearance of his Commentary on the day the final presidential debate aired on national television underscores its purpose.
In the garbled presentation of various “facts” and conclusions about the No Child Left Behind law and student performance, the study by unnamed Stanford and Berkeley researchers violates virtually all rules of science.
Mr. Fuller states that “4th grade reading scores have faltered or declined in 11 of the nation’s 15 most populous states that have published trend data over the past three years.”
Trend analysis, as any college student learns, uses consistent data for a consistent span of years and applies consistent evaluations to each observation. His study of trends in student test scores does none of these.
The study selects a biased sample and misrepresents it. The 15 “most populous” states for which Mr. Fuller collects data in fact include only nine of the 15 largest states, reaching down as far as the 31st-largest state. The smallest of this group, Iowa, not only has produced sporadic data but has also been one of the slowest states to adopt school accountability.
The study is selective in the data employed for each state. Different grades are used across the states. For some states, five years of experience are used, but for others three or four.
Nonetheless, the most outrageous part is the conclusion that 11 of the 15 states show declines or no gains since the No Child Left Behind Act took effect. The evaluations of each state’s experience are whimsical, shaped by whatever suits the desired conclusion. Yet, by the researchers’ own data, the conclusion should be exactly the opposite. Among the states the study concludes are flat or declining, the researchers’ own data (which have not been verified) show noticeable gains in California, Illinois, Missouri, New York, Tennessee, Texas, Virginia, and Wisconsin. Add in the four states that even the researchers acknowledge have shown improvement (Florida, Michigan, Minnesota, and Washington), and we see that 12 of the 15 states improved by the data cited. The remaining three states (Colorado, Iowa, and Massachusetts) are less clear-cut, but nothing suggests that they lost ground, and they may actually have improved.
Finally, the study follows an unfortunate recent practice (call it commando research) of rushing findings to the press before any other researchers can evaluate the study itself. Now that the desired press coverage has been obtained, I doubt we will ever see the full study. As currently presented, the analysis clearly will not appear in a scientific publication.
Vol. 24, Issue 09, Page 43