State Tests Due Fresh Scrutiny as Peer Review Relaunches
Process provides Ed. Dept. powerful tool for oversight
The U.S. Department of Education has relaunched one of the most powerful tools it wields over states' academic standards and assessments: the "peer review" process that had been suspended for three years.
Every state will have to submit to scrutiny, since they all have adopted new standards or tests in the five years since the Common Core State Standards swept the nation.
Under the new guidance unveiled last week, states will have to undergo review in January, March, or May. States will not have to submit their academic standards or tests for review, but they will have to submit evidence—typically boxes and boxes of it—that the standards are rigorous and that the tests are valid, reliable, of high quality, and aligned to those academic expectations.
Compliance with those requirements of the Elementary and Secondary Education Act will be analyzed by panels of experts chosen by the Education Department. The guidance outlines six "critical elements" of review and gives examples of the types of evidence states may submit to prove their standards and tests are up to snuff.
To show that their standards are rigorous, for instance, states can submit various kinds of information, including endorsements by their state university systems that the standards reflect college-ready expectations or documentation that content-matter experts were involved in crafting the standards.
The guidance has been reorganized and expanded from the last version, issued in 2007, to reflect 2014 updates to the testing industry's bible, the Standards for Educational and Psychological Testing. Sections requiring states to submit evidence that their assessment systems include adequate security measures and properly protect student-data privacy, for instance, have new prominence and detail.
Another update in the approach to federal review is an emphasis on ensuring that states' tests seek to measure students' "higher-order thinking skills." The kinds of evidence that states might submit to back that up could include test blueprints, or samples of test-question specifications that lay out the questions' "cognitive complexity."
There are six "critical elements" of states' standards and testing systems that panels of experts will review as part of the U.S. Department of Education's newly relaunched peer-review process. States will have to submit detailed information showing that they have:
• "Coherent and rigorous" academic standards and annual assessments given to all students statewide
• Used sound procedures in the design and development of state tests aligned to academic standards, and in test administration and security
• Tests that meet a range of technical specifications for validity
• Tests that meet additional technical requirements, including for fairness, reliability, and accessibility
• Assessment systems that include all students
• Used sound practices in setting and reporting cut scores on tests
Participation data, too, has a new prominence in the revised guidance. States were always required to report the numbers of students who took state-mandated assessments to show that the tests were being given to "all students," as required by law.
But the new version sets off the participation-data requirement by itself, with a grid to illustrate that states must supply the number of students enrolled, and the number and percent of students tested, in each grade, 3-8, and at the high school grade level chosen for testing. Those data take on new resonance, since rising antipathy to testing sparked a massive opt-out movement in some places in 2014-15.
The peer-review process was suspended in December 2012 to enable the department to revise its guidelines in light of many states' new standards and tests and the changes affecting assessment because of technology. States have been impatiently awaiting the new guidance, which was originally slated to come out in summer 2014 but was repeatedly delayed.
One big change in the guidance was sparked by states' use of common assessments for the common core: states that share a test can work together on their peer-review submissions.
That could create an easier road for states that are using tests from either the Partnership for Assessment of Readiness for College and Careers or the Smarter Balanced Assessment Consortium, the two federally funded consortia, especially since states have a relatively tight timeline to get ready for peer review, said Scott Marion, the associate director of the Center for Assessment, which provides technical support to states on assessment.
"I think the six-month timeline for submission is too fast for states," Marion said in an email. "Look at all the analyses required for fairness, comparability, validity, etc. A six-month timeline implies that the state has all necessary analyses conducted, and they just need to package things up. This is a tremendous advantage for the consortia, which is fine with me, but [the department] needs to acknowledge this."
But Wes Bruce, Indiana's former assessment chief, had a different take. "I think there's probably a little bit of truth to the fact that this encourages consortia or collaboration writ large," he said in an interview.
"I don't think it favors any set of consortia," he added. "PARCC and Smarter Balanced aren't more advantaged than if Alaska and Hawaii got together and decided to build" their own test.
Bruce also thinks the time frame is manageable, especially for state assessment officials who have experience with peer review.
Louisiana's former assessment director, Scott Norton, who now oversees standards, accountability, and assessment at the Council of Chief State School Officers, predicted that "states will step up and meet the challenge, but it's a big amount of work for the assessment staff to take on." However, the key new elements make sense, he said, including the new focus on data privacy and test security and the changes for states that are using computer-adaptive tests.
The CCSSO will be supporting states as they transition to the new peer-review process by gathering reactions from state testing directors and hosting a one-day meeting for them in November, with experts on call to help field questions, Norton said. It will also work with the Education Department to resolve problems with the process as states move forward, he said.
Some states that used a new test in 2014-15 and plan to switch again in 2015-16 had worried privately that they would have to undergo the time-intensive peer-review process for a test they had already left behind.
Ann Whalen, a special adviser to U.S. Secretary of Education Arne Duncan, said that states would not have to do that. The department would rather have them focus on ensuring that their 2015-16 tests "are high-quality and will sail through peer review," said Whalen, who has been delegated the powers of the assistant secretary for elementary and secondary education.
Vol. 35, Issue 07, Pages 14, 16