The Massachusetts Department of Education commissioned Mathematica Policy Research to do the study last year, as it was considering whether to use PARCC in 2017 or keep using its longtime test, the Massachusetts Comprehensive Assessment System, or MCAS. Massachusetts decided to create a hybrid of the two tests. A summary of the study, which compares PARCC and MCAS, was published Tuesday in a peer-reviewed journal, Education Next. (The full study can be downloaded here.)
Researchers at Mathematica wanted to know how closely a “college-ready” score on PARCC and a “proficient” score on MCAS correlate with a good grade-point average in the freshman year of college, and with the need to take remedial courses. But since it was too soon to follow high school students who took those tests into their freshman year in college—PARCC made its debut in the 2014-15 school year—they devised a “concurrent” design: They had freshmen in Massachusetts state colleges and universities take the PARCC and the MCAS in the spring of 2015 and examined how those scores, from 847 students, correlated with their grades and remediation patterns at the time.
For Massachusetts policymakers, the Mathematica study offered important comparisons, finding that the MCAS had about the same power as PARCC and the SAT college-admissions exam to predict freshman-year grades, and about the same power as PARCC to predict students’ need to take remedial classes. (The SAT wasn’t included in the remediation comparison.) In an interview, Ira Nichols-Barrer, the lead researcher on the study, characterized all the correlations as “modest,” since none was higher than .43.
Nichols-Barrer and his team also found that students who scored proficient on the MCAS were not performing as well in college as those who scored college-ready on PARCC, especially in math. The report said that difference could be resolved by raising the proficiency cut score on MCAS.
PARCC Cut Score ‘Exceeds Its Stated Target’
The Mathematica analysis isn’t the “predictive validity” study that the assessment field is eager to see, since it’s too soon to have completed a study that follows students who took PARCC (or Smarter Balanced) into college. PARCC officials said they knew of no such study so far.
But for policymakers across the country, the Mathematica study offers the first early glimpse into how well PARCC performs as a predictor of college success—at least in Massachusetts. (The sample isn’t nationally representative.) It also offers an early look at how well PARCC did when it set a cut score—level 4 or higher on a 5-level test—that is supposed to indicate college readiness.
The Mathematica researchers found that PARCC has, in Nichols-Barrer’s words, a “modest” power to predict students’ grade-point average and need for remedial classes. But when it comes to reflecting the rigor of college work, PARCC “exceeds its stated target,” the study said.
PARCC set out to create a college-ready cut score that would translate into a 75 percent likelihood of earning at least a C in entry-level college courses. Nichols-Barrer and his team found, however, that students who scored college-ready on PARCC’s English/language arts section had an 89 percent probability of earning at least a C average across all their freshman-year courses, and those who met that threshold on the math section had an 85 percent probability of doing so.
Examining how the students performed in the tested subjects only, the Mathematica researchers found that those who scored Level 4 or higher on PARCC’s English/language arts exam had an average English GPA of 2.76, and those who scored similarly on the math section had an average math GPA of 2.81, the study said. Fifteen percent of the students who scored college-ready on PARCC needed remediation in English, and 12.6 percent needed remediation in math, the study said.
“It’s a strong signal that in terms of that aspect of what PARCC was designed to do—to give a strong indication of college readiness—it succeeded in doing that,” Nichols-Barrer said in an interview.
A study by the Center for Assessment for the Massachusetts Business Alliance for Education, done in February 2015, before PARCC had been administered for the first time, concluded that its progress thus far suggested that a college-ready score on the test would probably serve as a good proxy for college readiness. Scott Marion, one of the authors of that report, faulted the Mathematica study for looking only at PARCC scores’ relation to grades and remediation, and not whether the content of the test includes the right mix of skills and knowledge for postsecondary work.
Study Shortchanges Value of Other Skills?
Researcher David Conley, whose work focuses on the range of skills necessary for college success, such as self-advocacy and time management, said the Mathematica study makes the error of “trying to get too much juice out of one lemon.” Tests that measure only academic skills offer only a partial picture of what makes students successful, he said.
“What the scores don’t tell you is why students with higher scores do better,” Conley wrote in an email. “The faulty assumption we’re left with is that accumulating more math (or English/language arts) knowledge will always result in more college success.”
A more complete picture of students’ college-readiness skills would include a wide range of skills, such as having high but realistic aspirations that involve going on to college, and being able to complete assignments on time, focus in class, seek help when needed, and set and achieve goals, Conley said.
A pair of studies done by two other organizations in February examined another question about the PARCC and Smarter Balanced tests: How well they reflect the content of the Common Core State Standards. They concluded that those two tests reflect the standards—for which they were written—better than the ACT Aspire or the MCAS do.
PARCC spokeswoman Heather Reams said the consortium plans to conduct a longitudinal study over the next two years that will examine “associations between students’ performance on PARCC and outcomes in entry-level college courses.”
Luci Willits, the deputy executive director of Smarter Balanced, said she doesn’t know of a state that has done a study similar to Mathematica’s with the Smarter Balanced test. The consortium is not planning to conduct a predictive validity study across all 14 states that use its test, she said, but several states are planning their own such studies. The first cohort of students who took Smarter Balanced are now seniors in high school, so results of any study that follows them into their freshman year in college wouldn’t be available until at least 2017, she said.
- Testing Group Wrestles with College-Readiness Meaning
- PARCC Approves Test Performance-Level Descriptions by Grade, Subject
A version of this news article first appeared in the High School & Beyond blog.