Early Reports Suggest Few Field-Testing Snags
Exams aligned with the common core are seen as harder than state assessments
Field-testing of two multistate online assessments is going more smoothly than many educators had expected, despite technological glitches in the coast-to-coast experiment. And even though the exams, aligned with the Common Core State Standards, are still in the tryout phase, they are proving tougher than the ones students are used to taking.
Those are the two major themes that emerged from an Education Week reporting project that examined the experiences so far of a collection of school districts across nearly half the states with the trial run of tests designed by the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers, or PARCC.
The field-testing, designed to find problems with the assessments before their designs become final, began in late March and runs through early June. Participation varies widely; a few states are giving the exams to nearly all students, while in most places, students in only some classrooms, at some grade levels, are involved.
Although potential technological problems with the move to large-scale online testing have topped educators' list of concerns, district and school leaders reached for this article, with only one exception, reported relatively minor problems.
"We had our reservations going in, but it's not the nightmare that some people predicted it would be. So far, there have been no big problems," said Paul Richter, the director of assessment in Nevada's Washoe County district, which is field-testing the Smarter Balanced exams.
The reporting conducted in late April by Education Week is not based on a nationally representative sample, but rather reflects a snapshot of experiences drawn from teachers, principals, and district assessment and curriculum officials in 29 school districts in 24 of the 36 states participating in the field-testing.
It shows key themes that are likely to shape the tests' final designs and inform education leaders as they make decisions about which tests they'll use to gauge learning under the common core. The new standards, in English/language arts and mathematics, were unveiled in 2010 and are being put into practice in all but five states.
Not surprisingly, familiarity with online testing tended to make the field-testing run more smoothly.
The Coeur d'Alene district in Idaho had a solid infrastructure in place because its current state exam, the Idaho Standards Achievement Test, or ISAT, is computer-based, said Mike Nelson, the district's director of curriculum and assessment, who reported that the Smarter Balanced field-testing went well.
Even so, there were hitches. Computer audio functions for the ISAT conflicted with the audio needed for the field tests, so the computers had to be restarted, Mr. Nelson said.
Delaware also already gives its tests online, so students are accustomed to that approach. But some struggled as they learned to use the Smarter Balanced test's new tools, such as highlighting and on-screen calculators, said Travis Moorman, the director of teaching and learning for the Milford school system.
From Paper to Computer
Districts less familiar with online testing had a steeper learning curve.
Vermont, which has been giving the paper-and-pencil New England Common Assessment Program, or NECAP, plans to move to Smarter Balanced. The Blue Mountain Union School, a K-12 building of 445 students in rural Wells River, 70 miles south of the Canadian border, got a rough-and-tumble taste of the difference.
Principal Emilie Knisley said that when members of her staff tried to log on to the assessment, their screens repeatedly displayed messages that the site was down for maintenance. Students who managed to start the test often found their sessions abruptly terminated. As a result, 40 percent or more of her students had to repeat the test another day, Ms. Knisley said. Those issues did improve, though, as the days passed, she said.
Administrators in many districts reported that many students, particularly those from homes without computers, were not prepared for the level of technological savvy the tests demanded.
"Our students don't necessarily have a lot of experience navigating different frames or highlighting things on the computer, so this has been an eye-opening experience for our teachers, to see what we need to be preparing students for," said Deborah Warr, the principal of Knollwood Heights Elementary School in Rapid City, S.D., where nine in 10 students qualify for subsidized meals. "Starting in kindergarten, we need to do a lot more work just navigating the computer."
Those kinds of challenges raised big questions for David Estrop, the superintendent of the Springfield, Ohio, system, which is field-testing the PARCC exams. He said his students struggled less with the tests' content than with the technological features, such as clicking and dragging items on the computer screen. He worries about next year, when the tests will be given in final form.
"Are we testing [students'] knowledge and skill of the content, or their knowledge and skill in using the device?" he said.
Many students apparently enjoyed the online experience.
It "took a little time to get used to taking the test on computers, [but] students liked being able to scroll up and down during the test and see the materials side by side," said Kay Dugan, the assistant superintendent for learning in Bensenville School District 2, a K-8 district in Illinois.
Officials in the Reeths-Puffer district in Muskegon, Mich., surveyed their high school students after the field-testing, and they reported enjoying its accessibility tools, such as highlighting and striking out text, said Terri Portice, the director of teaching and learning.
Many districts prepared heavily for the field-testing, readying students, teachers, and technology alike, and reported that those efforts paid off.
In the Millville, N.J., district, which serves a large share of students living in poverty, students used PARCC's practice tests, and students as young as kindergartners have been working on their typing skills, said Harry Drew, the principal of R.D. Wood Elementary School.
The Springfield City Schools in Ohio spent several days training teachers in test-administration procedures, Mr. Estrop said. That effort, combined with a $3 million investment in wired and wireless technology, minimized problems with the tests, he said.
The Pulaski County Special School District in Little Rock, Ark., credited a bandwidth upgrade and lots of upfront technology work with an uneventful PARCC field-testing experience. Chief Technology Officer Will Reid said his staff loaded computers with Java and other software updates and tested every computer's functionality ahead of time.
Districts that didn't think ahead about software updates reported problems. Computers at Cibola High School in Albuquerque, N.M., are set to update automatically during the school day, said Ryan Kettler, the assistant principal for 11th grade, so "that would come up and boot the kids off."
Mr. Reid, from Pulaski County, said the field-testing "went much smoother than we anticipated from a technological perspective." But the trial run involved a very small number of the district's students, and he thinks it will be "an extreme challenge" when the test goes districtwide next year.
"It takes such a coordinated effort between IT, teachers, administrators, and students for this to go off successfully," he said.
The most common problems at call centers staffed by PARCC and Smarter Balanced involved difficulty logging on to the system, especially when administrators had forgotten their passwords, officials of the testing consortia said.
The consortia also got feedback that some of the instructions for administrators, including instructions they read aloud to students, were confusing or too lengthy, so those will be revised, spokeswomen from both groups said.
It's too soon, they said, for feedback on how well the test items themselves worked. An analysis that will focus on specific questions, or types of questions, that caused problems will begin when field-testing concludes, the spokeswomen said.
Computer-system capacity was an issue for some schools and districts. Connecticut, one of a few states that are involving nearly all their students in field-testing, saw more than its share of slowdowns and access issues because so many students sat for the test right at the beginning of the testing window, said Smarter Balanced spokeswoman Jacqueline King.
"They took the lion's share of Day One growing pains," but those problems eased as the days went by, she said.
That wasn't exactly the experience in New Jersey's Millville district, where R.D. Wood Elementary School saw problems worsen as more schools around the state joined the field-testing. Principal Drew said that "could cause problems down the road" when the real PARCC tests come online.
Based on field-testing experiences, computer hardware looms as a potential problem in operational testing as well.
The schools in central Wisconsin's Weyauwega-Fremont district rotated 400 students through the desktops in their computer labs, but officials there said they see adding another computer lab as crucial for a successful administration of the real test next year.
Educators have worried that the new tests will be harder for students, and that concern appeared to be well founded. Most teachers and administrators reported that their students did find the tests more difficult than their states' current tests.

"The quote I heard over and over again was, 'Wow, that was hard,' " said Scott Moran, the director of secondary school improvement for the Denison district in western Iowa, which administered the Smarter Balanced exam to 3rd and 5th graders.
In Delaware, the Milford district surveyed students and teachers, who reported that the Smarter Balanced assessments were "a complete and total shift in thinking, teaching, and assessment," said Mr. Moorman. The longer, more complicated performance tasks, however—the piece teachers were most worried about—were what drew the warmest comments from students.
"They enjoyed those the most because they were taking academics and applying them to current situations, things they were connected with," Mr. Moorman said.
The performance tasks proved tough for many test-takers in Michigan's Reeths-Puffer district. Their performance on those items showed district leaders that more classroom work is needed on "that [kind of] deep-level, multiple-step, multiple-day project," Ms. Portice said. Some high school students, surveyed after the field tests, said they missed the state's current test "because it was easier," she said.
Students in various places complained of having to read long text passages for the English/language arts portion of the tests, and educators reported that many students took a long time to finish.
It took some students 90 minutes to complete the four required essays, said Kandi Martin, the curriculum director in Wisconsin's Weyauwega-Fremont district.
Vicky Lynch, who supervises accountability and testing for the Bossier Parish schools in Benton, La., said teachers and students reported that the test questions were difficult. Some of that difficulty might have been attributable to the online testing experience, or to unfamiliarity with PARCC-type items, she said, but some was due to the rigor of the questions.
"They had expected the different format," she said, "but even some of my high-performing kids would call the teacher over and say, 'This is confusing.' They definitely thought the questions were difficult."
Children in Vermont's Blue Mountain Union School told Principal Knisley that they wrestled with the math portion of the test because it made them blend skills together to solve a problem, instead of regurgitating facts.
"Our math curriculum is 20 years old; you learn multiplication and you practice it," she said.
How well the online accommodations would work was another area of worry for school personnel. Midway through field-testing, they reported a variety of good—and not so good—experiences with them.
Officials in Iowa's Denison district found that the text-to-speech function on the Smarter Balanced test didn't always work smoothly, said Mr. Moran.
In Ohio, educators in the Springfield system couldn't figure out how to make that tool work, and discovered that there were accommodations tools they didn't use because they didn't know about them, said Crystal Aker, the district's coordinator of testing, accountability, and research.
School leaders in Colorado's Adams 12 Five Star Schools were disappointed to find they couldn't test all students together, said David Bahna, the director of assessment and accountability. The district had to create specific testing sessions for students with special needs, he said.
Other districts, meanwhile, found that those accommodations could be loaded onto computers ahead of time so students who needed them did not have to be tested separately.
"To have your test customized for you is huge," said Ms. Portice of the Reeths-Puffer district in Michigan.
Officials in the Hartford, Conn., system noted that it was easy to overlook that important step, so administrators had to be mindful to preload those accommodations, said Michelle Puhlick, the executive director of curriculum and instruction.
Rob Watson, the superintendent of the Bozeman, Mont., schools, said he had worried that his students might have trouble manipulating various tools on the computer screen; that turned out not to be an issue at all. He theorized that time spent on practice tests, and on showing children how to navigate back and forth between windows, paid off.
The accommodations have "tremendous" promise for students with disabilities, said Mr. Richter of Nevada's Washoe County system. He singled out the read-aloud tool, which allows all children to hear questions read in the exact same way, minimizing variation from reader to reader.
A number of district leaders said that a hefty dose of staff training was important in preparing for the field-testing. The Milford schools in Delaware, for instance, got a grant to train staff members with online modules outside the school day. But many administrators said a particular challenge was finding the time for such sessions.
"Training, training, and training helped to alleviate the panic and frustration," said Razak Garoui, the executive director of accountability, research, and assessment for the Kent district in Washington state. "The first day, [the] second day, people are panicking, but as soon as they start the test, everything goes smoothly."
There have been few, if any, reports so far of what would be a significant hurdle in common-core testing—that the exams don't faithfully reflect the standards on which they're based.
"Teachers felt the [PARCC] test was aligned to what the common-core standards are," said Ms. Dugan, of Illinois' Bensenville District 2, "and they felt good about that."
Vol. 33, Issue 30, Pages 1,20-21