State Tests, NAEP Often a Mismatch
Bars Defining 'Proficient' Unaligned, Study Shows
Many of the states that claim to have large shares of their students reaching proficiency in reading and mathematics under the No Child Left Behind Act have set less stringent standards for meeting that threshold than lower-performing states, a new federal study finds.
The study drew an immediate and strong reaction from many public officials and education advocates, who said it laid bare states’ vastly divergent standards for testing students.
The report judges states’ reading and math tests against a common yardstick: the proficiency standards used by the National Assessment of Educational Progress, often referred to as “the nation’s report card.”
Released last week by the National Center for Education Statistics, the analysis appears to back up the suspicions of those who have cast a skeptical eye on state data showing high percentages of students reaching the “proficient” level in reading and math.
But researchers who were asked by the Council of Chief State School Officers to review the study’s methodology cited what they see as flaws in comparing two dissimilar sets of exams: NAEP and those administered by states.
Even so, U.S. Secretary of Education Margaret Spellings called it “sobering news” as the nation seeks to raise academic demands on students.
States “must do their part by setting high standards and expectations,” she said in a statement. “I hope this report will be a catalyst for positive change.”
The study was issued June 7, two days after a separate report by an education policy group showing that student scores on state tests have risen since the enactment of the No Child Left Behind Act, which President Bush signed into law in January 2002. ("State Tests Show Gains Since NCLB," June 6, 2007.)
The NCES study, “Mapping 2005 State Proficiency Standards Onto the NAEP Scales,” doesn’t attempt to compare the difficulty of individual state tests in reading and math, or how they compare with the national assessment, which is given to representative samples of students.
Instead, it compares where states set minimum scores for determining whether students are proficient, under the mandates of the No Child Left Behind Act, against the bar set by NAEP on the 4th and 8th grade reading and math tests. The study uses NAEP data from the 2004-05 administration of the assessment and gives states a “NAEP score equivalent.”
Under the NCLB law, states are required to test students annually in reading and math in grades 3-8 and once in high school, and report the percentages of students achieving proficiency on those exams. Schools must make adequate yearly progress in those subjects or face increasingly stiff penalties.
The study found that states' cutoff scores for deeming students proficient, when placed on NAEP's 500-point scale, varied widely, by as much as 60 to 80 points from state to state.
“That’s absolutely huge,” said Eric A. Hanushek, a senior fellow at the Hoover Institution at Stanford University who co-wrote the report on student tests for the Center on Education Policy. “You’re talking about very large differences in how states are measuring the performance of kids.”
In 4th grade reading, the NCES study found wide variation in states’ standards for proficiency when judged against the NAEP scale.
Massachusetts, South Carolina, Wyoming, Arkansas, and Connecticut, in that order, had the five highest "NAEP score equivalents." In other words, Massachusetts set a more stringent proficiency standard for 4th grade reading, relative to the NAEP scale, than any other state. Mississippi had the lowest score equivalent in that subject and grade, followed by Tennessee, Georgia, Alaska, and Oklahoma.
Call to Candidates
Reaction to the federal study came from many fronts.
The Education Trust, an influential Washington group that advocates high standards and vigorous efforts to help disadvantaged children meet them, said “students and their parents are being given a false sense of promise,” which is that “children are being prepared to meet the real-world challenges of college and careers.”
The response took on a political, but bipartisan, dimension when Ken Mehlman, the former chairman of the Republican National Committee, and Roy R. Romer, a former general chairman of its Democratic counterpart, issued a joint statement urging the 2008 presidential candidates to make education a priority in their campaign platforms.
But two researchers who reviewed the study’s methods for the Council of Chief State School Officers questioned the reliability of linking NAEP with state exams, in two separate papers.
Andrew Ho of the University of Iowa and Edward Haertel of Stanford University cited the "large body of literature" pointing to differences in the purposes and design of state tests and NAEP. As many researchers have noted, states craft tests and set achievement levels for them based on where most students are likely to score, and on realistic goals for improvement, they said. NAEP tests, by contrast, serve "lofty, long-term goals," they wrote, with much higher achievement levels.
One of the authors of the NCES study, Henry Braun, said in an interview that even though NAEP is “an imperfect” standard for judging state proficiency standards, it is still a useful one.
“We need to ask if [this] kind of variation is in the interest of the nation and its schools,” said Mr. Braun, a professor of education at Boston College.
Officials in Tennessee, which scored poorly in the NCES study, were not surprised by the findings, said Rachel Woods, a spokeswoman for the state education department. State leaders already recognize that the state's bar for judging proficiency is too low, and they are revising Tennessee's content standards and exams to fix it, she said.
“This is stuff we already figured out two years ago, and we’re working on it,” Ms. Woods said.
Jim Rex, the superintendent of schools in South Carolina, among the highest-scoring states in the NCES study, said the result might help the public in his state better understand why the number of students reaching the proficient level on state exams is relatively low.
“We’ve had trouble in this state that proficiency scores were low because our standards were so high,” Mr. Rex said. “I’d like to see this information given to parents annually in all states. They can draw the conclusions they want to.”
Vol. 26, Issue 41, Pages 1,23