Only a few years ago, the ambitious initiative to use shared assessments to gauge learning based on the new common-core standards had enlisted 45 states and the District of Columbia. Today, the testing landscape looks much more fragmented: only 27 of them still plan to use those tests in 2014-15, with the rest opting for other assessments or remaining undecided, an Education Week analysis shows.
For a variety of reasons, including the length and cost of the tests and political heat over the federal money supplied for their development, many states changed their minds about using the end-of-year accountability tests in English/language arts and math from the two state consortia that are leading the drive.
As of mid-May, 17 states planned to use the tests produced by the Smarter Balanced Assessment Consortium; just nine states plus the District of Columbia planned to use tests made by the Partnership for Assessment of Readiness for College and Careers, or PARCC.
Adding a level of complexity, seven states decided to split their assessment regimes, using one test designer for high school and another for lower grades.
Taken together, the shifts have produced a far less unified testing environment than existed even a year ago, when all but a half-dozen states still planned to use one of two assessments by Smarter Balanced or PARCC.
In sharp contrast, Education Week’s analysis shows that states’ current plans for 2014-15 encompass the use of at least 19 different tests.
A look at the membership of the two consortia suggests a more unified picture: Thirty-seven states and the District of Columbia still count themselves as members. But Education Week found that 11 of those, despite their membership, say they will use other tests. Six of the 11 have chosen assessments—largely sticking with their own state tests and adjusting for the new Common Core State Standards—while five are still weighing their options.
For this article, Education Week solicited assessment plans from all 50 states and the District of Columbia, and received responses from all but one state, Indiana. Drawing on those responses, along with Indiana’s publicly reported plans and accounts of political maneuvering in the states, we produced a state-by-state profile for 2014-15.
States are grouped into four categories according to whether they plan to use PARCC, plan to use Smarter Balanced, have chosen another test, or haven’t yet decided.
For opponents of the common-core initiative, the increasing fragmentation is a step in the right direction.
Terry Stoops, the director of research and education policy studies at the John Locke Foundation, a think tank in Raleigh, N.C., said that his own state, a member of Smarter Balanced, illustrates the mismatch between the vision of a common test and the reality.
State leaders in North Carolina balked when the new tests were projected to cost twice as much as the current ones, Mr. Stoops said. Another set of worries, he said, centered on whether districts, especially in rural areas, were technologically ready to switch to large-scale online testing, which both consortia’s tests demand.
Those problems, combined with skepticism about tests built with federal government money, produced a law that gave the legislature a power previously reserved for the state board of education: the authority to approve the state’s assessments.
“This just goes to show that the idea of the tests doesn’t correspond to the implementation of them,” Mr. Stoops said.
For those who had hoped that PARCC and Smarter Balanced would design better, more instructionally valuable assessments, it’s worrisome to see states backing away from that enterprise.
James W. Pellegrino, a distinguished professor of education at the University of Illinois-Chicago who serves on the technical-advisory committees of both consortia, said the trend “moves us back closer to where we were under No Child Left Behind.” That federal law, though it sought to spur student achievement, left each state free to set its definition of “proficient” as low as it liked, he noted.
With many different tests measuring students’ learning, Mr. Pellegrino said, the country loses the ability to reach a shared, rigorous definition of mastery or college readiness.
In analyzing states’ plans, Education Week found that nearly every state fell easily into the PARCC, Smarter Balanced, “other test,” or “undecided” category because its education department’s response to our inquiry was not contradicted by positions taken by other key officials in the state, such as the governor, education commissioner, state board leaders or lawmakers.
Arkansas, for instance, stayed firmly in the PARCC category because it plans to use that group’s tests in grades 3-11. Ditto for West Virginia, which is sticking with Smarter Balanced in those grades.
Iowa, Kansas, and Kentucky, meanwhile, are three states that are squarely in the “other” category, since they’re using their own new or existing tests.
Uncertainty about which test to use for high school, or the selection of another vendor for that level, didn’t automatically put a state into the “undecided” or “other” category in the Education Week analysis. The six states that are using one test for grades 3-8 and another for high school were categorized according to their choice for grades 3-8, since more accountability tests are typically administered in those grades than in high school.
Accordingly, we placed Oklahoma in the “other” column because it has chosen Measured Progress, a Dover, N.H.-based test developer, to build assessments for grades 3-8, but is undecided at the high school level.
Rhode Island, on the other hand, stayed in the PARCC column because it is using that consortium’s tests for grades 3-8, but retaining the New England Common Assessment Program, or NECAP, for high school.
Three other states—Missouri, Nevada, and Wisconsin—are using Smarter Balanced tests in grades 3-8 and other tests for high school.
Sizing Up States’ Plans
In considering whether to classify a state’s assessment plans as “undecided,” we took into account three key dynamics: the progress of legislation that ruled out the use of consortium tests; a major dispute about the use of consortium tests among a state’s top officials, such as its governor, commissioner of education, and state board president; and the issuance of a request for proposals for state assessments.
Some states that fell into the “undecided” category have leadership that strongly supports the common core, but political turmoil has led them to delay or waver on a decision to use consortium tests.
New York, for instance, belongs to PARCC, but anti-common-core sentiment there has led the state to hang back on a commitment to use PARCC assessments in 2014-15.
“The Board of Regents has yet to decide on the utility of these exams for New York state students,” state education department spokesman Tom Dunn wrote in an email to Education Week.
Massachusetts, whose commissioner of education is the chairman of the PARCC governing board, is letting districts decide in 2014-15 whether to use PARCC or the state’s current test, the Massachusetts Comprehensive Assessment System, or MCAS (though 10th grade students must still take the MCAS next year to graduate).
Categorizing Louisiana offered the trickiest terrain. State education department officials told us their plan was to use PARCC tests in 2014-15, and state Superintendent John White and a majority of the board of education support PARCC. The state still belongs to that consortium.
But most other indicators in the state tilt heavily against the use of those tests. Gov. Bobby Jindal, a Republican, appears strongly inclined to dump PARCC, and might well have the upper hand in doing so. The state House of Representatives has also passed a budget amendment barring use of consortium tests unless the state solicits other bids.
While winds could shift in PARCC’s favor in Louisiana, for now Education Week deems its plans “undecided.”
Michigan was in the Smarter Balanced category until early May, when both houses of its legislature approved language calling for a new version of its own test next year.
A version of this article appeared in the May 21, 2014 edition of Education Week as State Plans For Testing Fragmented