For four years, schools in nearly every state have been working to put the Common Core State Standards into practice in classrooms, but few have put them to the test—literally. This year, that changes.
The 2014-15 academic year is when nearly every state must have assessments in place to reflect the common core, or other “college- and career-ready” standards they have adopted. And unlike last year, when many states were allowed to cut back on their regular tests because they were field-testing new assessments, this year’s achievement results will be a cornerstone of states’ public accountability reporting.
The specter of college- and career-ready assessments has loomed large in education leaders’ minds for several years, since it comes with a volatile mix of novelty and risk.
Schools will be held responsible for how well they’ve imparted the new standards, even as skills such as reading complex text and demonstrating mathematical reasoning are new to many students, and as teachers are still figuring out how best to teach them. States face big drops in proficiency rates if the new tests are, as expected, tougher than the previous ones.
Even as educators steel themselves for those results, questions swirl about how well the tests will measure the standards they’re based on, and the skills educators value most.
Two dynamics further complicate the question of assessment.
Some states have moved to choose new tests after backing out of shared assessments or reversing their common-core adoptions.
And national uneasiness about the time and money spent on standardized testing, and about the decisions based on it, is increasing.
“We’re hoping that districts in Illinois, and everywhere, wake up to the problem with all the state testing, because this year it’s coming home to them as it never has before. It’s craziness,” said Cassie Creswell, who is helping organize opposition to testing in Chicago, where her 3rd grader attends public school.
Alignment Concerns
With next spring’s testing season already on the horizon, measurement experts worry that many states risk giving assessments that don’t fully reflect their academic standards. Allowing only months for a new or tweaked test—or using an existing test for new standards—erodes the likelihood of good alignment, they warn.
“When you’ve developed a test with one goal in mind, and that target is changed, you’ll have a misalignment between assessment and instruction, and that’s not good for anybody,” said Stephen G. Sireci, the director of the Center for Educational Assessment at the University of Massachusetts Amherst.
State education leaders are all too aware of that uncomfortable truth as they transition to new tests. Iowa adopted the common standards—which cover English/language arts and mathematics—but chose to keep using its own state test this year.
Brad Buck, the director of K-12 education in Iowa, said that a recent study showed “weak to limited alignment” between the Iowa assessments and the common core, a claim the test’s maker disputes.
“We recognize we’re in that middle ground between the assessments and the new standards, and it’s not an easy place to be,” Mr. Buck said.
Some vendors have tracked the common-core initiative, and the work of the two major state assessment consortia developing aligned assessments, from the start, so they might be able to offer tests that capture the standards relatively faithfully, said Gregory J. Cizek, a professor of educational measurement and evaluation at the University of North Carolina at Chapel Hill. But that is no sure thing, he said.
“You could buy an off-the-shelf test, and it could be terrible because it’s not aligned [to the standards], or it might be aligned. We just don’t know yet,” he said. “At the same time, you can’t measure the common-core standards without asking kids to construct arguments and cite evidence. And you can’t do that with just multiple-choice questions.”
A state would be well advised to demand research that shows the testmaker has performed an in-depth study detailing the degree of alignment between the state’s standards and the test it’s considering buying, Mr. Cizek said.
“A vendor should have—and the state should demand—evidence of alignment,” he said. “When a vendor says, ‘Here is a test that measures the common core,’ a state’s first question should be, ‘What is your evidence?’ ”
It takes many months of development to design a test that uses performance tasks, evidence-based essays, and multistep math problems to probe students’ skills in a nuanced way, experts say. States that buy new tests or quickly revise current ones, they warn, are likely to end up with assessments similar to the multiple-choice exams that have been so heavily criticized in the wake of the federal No Child Left Behind Act and its emphasis on test-based accountability.
“The closer a state is to scrapping the consortia tests and doing something quickly, the more likely those tests are to be closer to that end of the continuum,” said Derek Briggs, a professor of research and evaluation methodology at the University of Colorado-Boulder.
The two main groups of states that used federal money to design common-core tests, the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers, or PARCC, have worked for nearly four years on their tests. But measurement experts say the question of how well those tests are aligned with the common core won’t be settled until research on that issue is completed. Both groups plan to complete such studies in the coming months.
PARCC spokesman David Connerty-Marin noted that not all of the alignment work has been left until the end of test development.
“There’s a significant difference between this test and tests states typically purchase,” because the PARCC test was “built from scratch” for the common core, he said in an email. In addition, he said, the consortium enlisted teachers and other educators to review items for alignment as they were written.
As a result, “there should be a lot less mystery” in determining PARCC’s alignment to the common core than there would be when states try to determine alignment of their own tests, or an off-the-shelf test, to those standards, he said.
Joe Willhoft, the executive director of Smarter Balanced, said that once its alignment studies are complete, the consortium will have time to revise any parts of the test that are not well enough aligned before administering it in the spring. Even so, alignment is best thought of as ongoing work, he said.
“Without a doubt, there are some aspects of this alignment work that will be tighter than others,” Mr. Willhoft said. “It’s never something that’s all done, it’s perfect, and we don’t have any more work to do. Each year, we expect to be refining it.”
The Testing Timetable
The obligation to administer college-readiness tests has its roots in two key places, both linked to the U.S. Department of Education: the work by PARCC and Smarter Balanced, and the waivers the department offered to excuse states from key requirements of the No Child Left Behind law. States that belong to one of the consortia or have an NCLB waiver had to commit to having tests of rigorous standards in place by 2014-15.
The consortia work dates back to 2010, when the Education Department awarded $360 million in grants to the two groups to make tests tied to the common standards that emerged that year from a project led by the National Governors Association and the Council of Chief State School Officers. Wielding voting power in either assessment consortium—which nearly every member state chose to do—required states to administer those tests in 2014-15.
“This is the year that the consortia are expected to deliver on the things they promised,” said Mr. Sireci. “This is the year we are going to see what the quality of those assessments might be, and what kinds of results we’ll get back.”
But in the past year, the standards and testing landscape has seen big shifts. Three states—Indiana, Oklahoma, and South Carolina—reversed their common-core adoptions. They have had to hurry to craft new standards—some of which draw heavily on the common core—and revise or buy tests. States that kept the common standards but opted not to use PARCC or Smarter Balanced, such as Florida, Iowa, and Kansas, are revising their current assessments, or engaging vendors to design new ones, as the clock ticks down to spring testing time.
In 2009, every state was using its own test. Within two years, all but five had agreed to use shared exams being produced by the consortia. But 11 more have jettisoned those tests amid mounting worries about time, cost, and political flammability—the tests were increasingly seen in some quarters as federal overreach into local school affairs.
It’s not only those 16 states that plan to go their own way with tests next spring, however. An Education Week analysis showed that 24 states—some still members of PARCC or Smarter Balanced—have chosen to use another test, or are still deciding what to use.
Consortium rules required “governing” states to use the resulting shared tests, and likely prolonged many states’ commitments to do so. But there were no consequences attached to a change of heart unless the state in question had also secured a No Child Left Behind waiver.
First offered by the Education Department in 2011, those waivers have been awarded to all but seven states (and withdrawn from two, Washington state and Oklahoma). In exchange for leaving behind some of the more onerous requirements of the federal law, waiver states must give “high-quality assessments” of “college- and career-ready standards” in English/language arts and math in 2014-15. States could meet that requirement by promising to use Smarter Balanced or PARCC tests.
Nearly every state that made such a promise in its waiver application and then changed its mind found a letter in its mailbox from the federal department asking for detailed testing plans in lieu of the consortia exams.
No state has yet lost or been denied a waiver because its testing plans didn’t pass federal muster (although one state, Oklahoma, did lose its waiver after reversing its common-core adoption). Typically, when a waiver state has decided against consortium tests, the federal department has sent a letter of inquiry, asking the state to detail its plans to “administer annual, statewide, high-quality assessments aligned with college- and career-ready standards.”
According to the department, 13 states have received such letters; several others didn’t because they had included satisfactory testing plans in their applications. It’s clear from the department’s decisions that it is willing to grant flexibility to states that have chosen non-consortium tests.
Securing Approval
Alabama, Kentucky, and Virginia, for instance, never got those inquiry letters because their plans to use non-consortium tests were detailed in their original waiver applications. Kentucky is using Pearson-designed common-core tests in grades 3-8 and a suite of ACT tests in high school. Virginia is using its own Standards of Learning tests. Alabama chose the ACT Aspire suite of tests, which it used in 2014 and plans to use again in 2015.
Michigan’s legislature barred that state from using the Smarter Balanced exams this school year, but it still got a waiver extension after explaining to the Education Department that it will modify its current test. Georgia got an inquiry letter when it dropped out of PARCC; the state submitted an 85-page description of its testing plan, and its extension was approved in July.
Tennessee received a letter from the federal department after its legislature approved a law requiring the state to use its current test, the TCAP, in 2014-15. The state submitted details of its plans to align that test with the common core, and it is awaiting a federal decision.
States are feeling intense pressure as they seek approval of their testing plans from the Education Department. Officials from several states contacted for this article refused to discuss the matter, citing political controversy over testing and the fear that their waiver extensions might not be approved.
“It’s crazier than ever,” one state official said. “No one around here will be willing to talk to you about this.”
States are also mindful of another source of testing pressure coming down the pike: new federal guidelines for peer review of their standards and tests. Those criteria, due out soon, will shape how experts chosen by the Education Department decide whether states’ standards and assessments meet the requirements of federal law.