By guest blogger Catherine Gewertz
The PARCC assessments are generally tougher than the Smarter Balanced tests, ACT Aspire, and the other tests states are using, according to an analysis released Monday.
For his study, author Gary Phillips, a vice president and fellow at the American Institutes for Research, examined the achievement levels, or cut scores, that are intended to show that students are on track in 4th and 8th grade to be college-ready by the time they graduate from high school.
He set out to gauge the difficulty of those cut scores by mapping them to the cut scores used on the National Assessment of Educational Progress, or NAEP. The NAEP offers a common metric for making comparisons because it’s taken on a regular basis by nationally representative groups of students in grades 4, 8, and 12, and it intentionally sets a high bar for student achievement.
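The article does not detail Phillips' statistical procedure, but one common way to map a cut score from one test's scale onto another's is percentile matching (often called equipercentile linking): find what share of students fall below the cut on the state test, then locate the NAEP score at that same percentile. The sketch below is a simplified illustration of that general idea with made-up toy data, not a reconstruction of the AIR study's actual method.

```python
# Hypothetical sketch of percentile-matching ("equipercentile") score linking.
# Toy data only; this is NOT Phillips' actual methodology.
from bisect import bisect_left

def percentile_rank(scores, cut):
    """Fraction of examinees scoring below the cut score."""
    ordered = sorted(scores)
    return bisect_left(ordered, cut) / len(ordered)

def equivalent_score(source_scores, source_cut, target_scores):
    """Target-scale score at the same percentile rank as source_cut."""
    p = percentile_rank(source_scores, source_cut)
    ordered = sorted(target_scores)
    index = min(int(p * len(ordered)), len(ordered) - 1)
    return ordered[index]

# Toy data: a 0-100 'state test' scale and a 0-500 'NAEP-like' scale
state = [40, 55, 60, 62, 70, 75, 80, 85, 90, 95]
naep = [180, 200, 215, 230, 240, 250, 262, 275, 290, 310]

# A state cut score of 75 sits at the 50th percentile of the state
# distribution, so it maps to the NAEP-scale score at that percentile.
print(equivalent_score(state, 75, naep))  # -> 250
```

In practice such linking is done with large, smoothed score distributions rather than raw lists, but the percentile-matching logic is the same.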
PARCC’s Level 4 cut scores, which mean students are on track to be college ready, are comparable in difficulty to the NAEP proficiency level in mathematics at 4th and 8th grades, the AIR study found. But in English/language arts, PARCC’s “on track to college ready” cut scores fall in the NAEP basic range. (NAEP has three achievement levels in all: basic, proficient, and advanced.)
The chart below summarizes Phillips’ findings for the three tests, and offers the cut scores for each of NAEP’s achievement levels for comparison. PARCC’s test has five scoring levels, with 4 connoting “on track to college readiness.” Smarter Balanced and ACT Aspire have four achievement levels, with 3 signifying “on track to college readiness.”
Looking at how other states' scoring expectations measured up to NAEP standards, Phillips found that states were all over the map, and many set cut scores at levels comparable to NAEP's "below basic" level.
But it's noteworthy that a handful of states that opted to design or buy their own tests instead of using PARCC or Smarter Balanced set "on track to be college ready" cut scores that are as tough as NAEP's proficient standard.
Florida’s assessment, for instance, was found to have expectations comparable to NAEP proficiency in both grades and in both subjects. New York hits the NAEP proficiency benchmark for everything but grade 4 math. Florida’s test was designed by AIR; New York’s was designed by Pearson. Kansas’ test, designed by the University of Kansas, reached NAEP proficiency levels in everything but 4th grade English/language arts.
Phillips said in an interview that of all the cut scores he examined, about 20 percent were set at NAEP’s proficient level.
“What my report shows is that states still have a way to go” if they aspire to align their proficiency standards with those of NAEP, he said.
Phillips steered clear of saying whether states should aim for NAEP-like proficiency on their tests. But state assessment leaders have been asking how the various assessments' expectations compare with NAEP's, he said, so he "fast-tracked" his study to make that information available to them.
“That’s what people were asking, so now they know,” Phillips said. “It’s information they can use in deciding where to set their standards, and whether they want to go back and revise them.”
ACT said in an email that it grounded ACT Aspire’s cut scores in its own college readiness benchmarks, the scores on its college-entrance exam that correlate with a good chance of success in entry-level credit-bearing college courses. Speaking for PARCC, New Mexico Secretary of Education Hanna Skandera said in an email that the AIR report “reaffirms that PARCC states have set the higher performance expectations that students deserve and are committed to providing them the tools they need to be prepared for success.”
Tony Alpert, the executive director of Smarter Balanced, said the group’s cut scores were the result of work by more than 2,500 educators who made their decisions “based on a comprehensive review of the content standards and the content of the assessments” and appropriate expectations “for all students.”
The study should not be interpreted as advocating cut scores at the NAEP proficiency level, however, Phillips said. There is no evidence that 4th and 8th grade proficiency on NAEP predicts college readiness, he writes in the report.
But he did note that a 2013 validity study showed that fewer than 4 in 10 students who scored close to NAEP's 12th grade proficiency cut scores had the skills necessary for college-level math or reading.
Marianne Perie, the director of the Center for Educational Testing and Evaluation at the University of Kansas, which designed tests for Kansas and Alaska, and helped them set cut scores, said that increasingly, states are considering proficiency levels of national tests such as the NAEP, the SAT, and the ACT when they set their cut scores.
“What they tend to do is start at the high school level with expectations, and work backwards, so they know they’re being consistent” through the grades, she said.
A flurry of other recent studies has focused on analyzing how tough the PARCC and Smarter Balanced tests are, and how they stack up against other assessments. Harvard researchers analyzed states' cut scores over time and found that states are raising the rigor of their tests.
A pair of studies by the Thomas B. Fordham Institute and the Human Resources Research Organization, or HumRRO, compared the depth and complexity of thinking required on PARCC and Smarter Balanced to that gauged on the ACT Aspire and Massachusetts’ state test, the MCAS, and concluded that PARCC and Smarter Balanced plumb deeper cognitive territory than do the others.
An analysis by Achieve compared proficiency rates on state tests to those on NAEP, and concluded that while many states had raised their cut scores closer to NAEP levels—reporting much lower proficiency rates in the process—many states still had a big “honesty gap,” meaning their tests are much easier to pass than is NAEP.
A version of this news article first appeared in the Curriculum Matters blog.