Test of Wills
When huge numbers of students failed rigorous new tests in Virginia, alarms sounded throughout the state. Though many educators described January's results from the new statewide exams as only a starting point, others blasted both the tests and the passing scores set for them. "Something is wrong here, and I would have to say that it is not our schools, not our hard-working principals and teachers, and certainly not our students," said Daniel Domenech, superintendent of the 154,000-student Fairfax County schools, the largest district in the state and one of the best.
Such public outcry has been heard throughout the country in recent years, as an increasing number of states introduce tough new standards and tests for students and schools. Almost by definition, such first-year test results are going to be bleak, the inevitable result of refocusing large numbers of schools and classrooms on new, sometimes controversial goals. And whether states prepare parents, community leaders, and educators for such massive change often determines whether the public continues supporting the schools.
"You can't go from telling schools and kids they're average or above to telling them they're all failures and not expect the public to be, at the very least, confused," says Andy Plattner, chairman of A-Plus Communications, an education consulting firm based in Arlington, Virginia.
Several states that have experienced post-test turmoil in the past few years have managed to retain parent and educator support. Most attribute that success to aggressive public-relations strategies.
Rule No. 1, experts say, is to start early and let the public know what to expect before the test results are reported.
"The time to respond to this crisis isn't the day the test scores come out," says William Porter, executive director of the Washington State Partnership for Learning, a Seattle-based business coalition. Washington business leaders and the state superintendent met with editors at newspapers throughout the summer before the results were released, says Porter.
Massachusetts education officials acted preemptively, as well. They held press briefings across the state to help reporters understand the new tests and how the scores would be reported, and they invited some 3,500 local educators--superintendents, principals, and teachers--to special workshops.
Then, two months before the results were released, more than 1.4 million copies of a brochure explaining the new standards and tests were distributed to schools statewide. Mass Insight, the nonprofit group that produced the brochures, also formed alliances of business, political, and education leaders to reinforce the message that expectations and achievement needed to be raised. The goal was to send a simple, coordinated message: First-year results were just a starting point from which the state could move forward.
Experts say the second rule for managing the fallout from new exams is to make the testing program as open and transparent as possible. Delaware, for example, hosted a "Take the Test" day this past December. The state's two major newspapers carried an insert that included sample test items, and stores and restaurants set up tables where people could try the questions. In 1997, customers at McDonald's restaurants in Washington state could answer three sample questions printed on tray liners that came with every meal.
Though Massachusetts was criticized last year for shrouding its teacher-competency test in secrecy, it adopted an open-door policy for its student tests. "Nothing should be mysterious," says Alan Safran, chief of staff for the Massachusetts Department of Education. "That erodes public confidence." The state has taken the unusual step of releasing all its student test questions on the World Wide Web after the exams are given, along with examples of work that met the standards.
Of course, such a move isn't cheap. "We have additional costs because we have to develop new questions every year, but we feel that it's money well spent," says Jeff Nellhaus, state director of standards in Massachusetts. "Parents and teachers both have a much better understanding of what the questions are."
But not everyone agrees with that strategy because it makes year-to-year comparisons harder. "It undermines the statistical merits to release an entire test every year," argues John Tanner, director of assessment and analysis for the Delaware education department. "We release up to a third of the items off of each administration."
Some observers believe that it's a mistake to shoot for the moon right off, that states should phase in high standards by raising the bar over time and avoid the shock of awful first-year results.
"If you don't give the public a transition that they see as fair, they'll essentially just revolt," Andy Plattner warns. This strategy seems to have worked for Texas. Although the state has been criticized for setting the bar too low, its testing program has won widespread public support.
Many states, though, have rejected the incremental approach, opting instead to set high passing scores based on what they want students to learn in the future rather than what most students know now. When Virginia followed such a strategy, the percentage of students who passed on the first go-around ranged from a low of 32.8 percent in 5th grade history to a high of 71.1 percent in 8th grade science. The average passing rate across all the tests was 41 percent.
But the student pass rates in Virginia have received far less attention than the rates for individual schools, which will lose their state accreditation beginning in 2007 unless 70 percent of their students pass four key subject tests. This year, only 39 of the state's 1,800 schools, or 2.2 percent, met the performance goals. Another 128 would have met them with a better performance in just one subject area. "Now everyone is focused on the 98 percent failure rate," says state school board President Kirk Schroder. "But people don't understand that that figure represents a goal that is set for the year 2007."
Several Virginia superintendents believe the test results can be improved. "Our students are capable of learning the material that's addressed in these standards," says superintendent Deanna Gordon of the 14,000-student Roanoke system, where only one of 39 schools met the standards. "We have tried to insist to parents that their initial response should not be to attack the tests but to agree to work with us to try to help students pass."
But others contend that the state should rethink the process. "The state insisted on testing first, training teachers second, and purchasing new books and teaching materials third, which is the exact opposite of what you need to do," says Frank Barham, executive director of the Virginia School Boards Association. "I don't think it's a reflection of what our kids know or don't know, as much as the state getting the process backward."
State education officials already have made several changes in the testing program, agreeing, for example, not to report the first-year results on students' high school transcripts. But whether those steps will be enough to retain public confidence in the new standards and exams remains to be seen. Stanley Rabinowitz, a testing expert at WestEd, a federally funded research center in San Francisco, questions whether any system could survive a failure rate as high as Virginia's.
"[But] you can't have it both ways," he adds. "You can't have the highest standards in the country and think that they're going to be painless."
Vol. 10, Issue 6, Pages 10-11