Aspiring teachers in Massachusetts became the butt of jokes when more than half failed a new series of tests. But many are wondering if the tests themselves measure up.
When Sandy Nager arrived at the Cambridge Rindge and Latin School this fall to take the Massachusetts teacher tests, the place was swarming with television cameras. Reporters buttonholed test-takers as they entered the building. Critics of the tests distributed fliers. A 58-year-old Fulbright scholar gave interviews before heading inside to retake the reading exam, which she had failed earlier.
Nager, 27, squeezed into a high school desk with a small writing surface and squinted against the sun pouring through the windows. She was determined to fight off her nerves and do her best. Above all, she wanted to avoid the humiliation that has befallen so many others since April, when 59 percent of the prospective teachers who took the exam failed. In order to get a license in the state, teacher-candidates must pass a communication and literacy exam, which evaluates reading and writing ability, and a subject test in their field.
The poor performance of aspiring Massachusetts teachers quickly became a national joke--fodder for Jay Leno, scolding newspaper columnists, and critics of public schools. What is less well-known is the story of the tests' tumultuous birth, a tale in which the friction between politics and public policy ignited a firestorm that has burned the Bay State's entire education community.
Perhaps it wasn't surprising that a teacher test would become a big story in Massachusetts, a pioneer in American public education and the home of dozens of distinguished colleges and universities. The test results cast the commonwealth's 62 teacher-preparation programs in a harsh light, exposing alarming failure rates at some.
What is more remarkable is the staying power of the test controversy. So far, it has claimed one interim education commissioner, provided a high-profile issue for a gubernatorial election, influenced state teacher policy, drawn threats of a class action, and sparked a backlash among academics who believe the tests themselves and their initial administration were flawed.
Candidates for licenses were told that their scores on the first two rounds of testing wouldn't count, only to learn otherwise days before the tests were given. There was no study guide until this fall. And the state school board changed its mind in setting the qualifying scores, creating public doubt and confusion.
Not everyone, however, believes that the results are unfortunate. John R. Silber, the chairman of the Massachusetts school board and the chancellor of Boston University, says the teacher test has had the "salutary effect" of exposing the "derisory" standards that pervade education schools.
The reading and writing test, he asserts, "was an examination that a high school graduate ought to be able to pass. The idea that a college graduate can't pass it means that the college degree is fraudulent."
Others fear, though, that the negative publicity and public ridicule of aspiring teachers--Speaker of the House Thomas M. Finneran, a Democrat, called those who flunked the tests "idiots"--have damaged the teaching profession.
"I'd think twice if I were a teacher-candidate in a school of education and I wanted to come to Massachusetts and take that test," says Robert V. Antonucci, who served as education commissioner until March, a month before the exam was first administered. "And that's wrong."
The Massachusetts Educator Certification Tests were a long time in coming.
In 1985, the state enacted a law requiring testing for teacher licensing--a measure that was promptly ignored. Teacher tests were required again in 1993, in the commonwealth's Education Reform Act, a $3.25 billion plan that set rigorous academic standards and required new tests for students. (The results of the first round of the Massachusetts Comprehensive Assessment System came out last month.)
But it was only after the current board was appointed by the governor in 1996 that the state got serious. National Evaluation Systems of Amherst, Mass., won a contract in October of last year to custom-design teacher tests. The result: a four-hour communication and literacy exam, plus a four-hour test in the subject the candidate wants to teach. There are more than 30 tests in all.
The tests would make Massachusetts the 44th state to screen prospective teachers. Because such exams made their debut at a time of intense national concern over teacher quality, the results helped to focus public attention on teachers' scores. They also turned up the pressure on education schools nationwide to see that their students pass the tests.
Many states use exams designed by the Educational Testing Service of Princeton, N.J. But California, Colorado, Illinois, Michigan, New York, Oklahoma, and Texas, like Massachusetts, administer tests developed by National Evaluation Systems.
The company had six months to create the tests for Massachusetts before the first administration in April. In part, the test developers used a library of questions given to teachers in other states.
"We were on a fast track," Antonucci recalls. "We were really pushing this."
The former commissioner, who resigned Feb. 28 to become the president and chief executive officer of ICS Learning Systems in Scranton, Pa., says National Evaluation Systems officials told him they wanted to use the April and July rounds of testing to validate the tests. Under that scenario, prospective teachers would receive scores that districts could check before hiring them, but the state would wait to set its standards for licensure until the kinks were worked out. Such a process isn't unusual when new tests are introduced.
In January, the Massachusetts Department of Education informed candidates that merely taking--not passing--the first two rounds of the exam would satisfy the testing requirement and allow them to become licensed. While Antonucci was concerned about the arrangement, he deferred to the company's expertise, he says.
But Antonucci's interim successor, Frank W. Haydu III, an investor and former member of the state school board, worried that letting candidates satisfy the requirement merely by taking the test would send the wrong message.
"I said, 'Listen, you can't just sign your name or take the test and show you're illiterate and have us allow you to be in the classroom,' " Haydu recalls.
So on March 25, just before the April 4 testing date, the education department notified registered test-takers that their scores would count.
The about-face, coupled with the lack of a study guide, left many candidates feeling unprepared. At Salem State College, just 41.8 percent of 95 prospective teachers passed the entire exam in July. Clarke Fowler, an assistant professor of education there, complains: "They did their best to see that my students would do their worst."
Robert Schaeffer, the public education director at the Cambridge, Mass.-based Center for Fair & Open Testing, or FairTest, says legal precedent has established that candidates must have adequate notice to prepare for high-stakes tests, such as a licensing exam that determines whether they can teach. FairTest is putting together a class action to challenge the test.
But Silber and Edwin Delattre, the dean of Boston University's school of education and a member of the state board, argue that education schools and their students knew for years that the tests were coming and would "count." Silber calls the information indicating otherwise in the January question-and-answer booklet "treason by the clerks" who tried to bypass the board by putting out false statements.
"There were people who felt as if the policy had been hither and yon," Delattre says, "when in fact, it had never been hither and yon. By the lights of the board, it had been straightforward since November 1996."
To some critics, the content of the test is as troubling as the administration of it. Particularly odd is the fact that the communication and literacy test includes dictation. Stranger still was the passage used on the April test: a selection from the Federalist Papers. Prospective teachers had to listen to the 18th-century prose and write what they heard--a test of their spelling, punctuation, and capitalization skills.
The Massachusetts teacher exam is the only one Schaeffer has seen in some 13 years of advocacy on testing that included dictation. He ridicules the selection as "brilliant and fresh in the 1780s" and accuses Silber and Delattre of violating testing practices by suggesting the Federalist Papers be used.
But both board members say they gave the testing company a number of alternatives, which Delattre says included portions of Life and Death in Shanghai, The Life of Helen Keller, and Letter From Birmingham Jail.
Furthermore, Silber says, Massachusetts students are required to read the Federalist Papers to graduate from high school.
The board chairman ridiculed candidates' spelling on the dictation portion of the exam in a July opinion piece in The New York Times, further infuriating critics of the tests.
"How could educated people fail to copy what they heard?" Silber wrote. "It wasn't easy, but scores of applicants managed, recording broken sentences and curious new spellings such as 'improbally,' 'corupt,' 'integraty,' 'bouth' (meaning both), 'bodyes,' and 'relif.' "
The dictation portion of the test was the most visible flashpoint of the broader controversy over whether the test actually measured what teachers knew. Critics of the testing program--and they are legion in Massachusetts' higher education community--argue that the exam was not properly validated.
The tests were vetted by more than 5,000 educators to make sure they matched the commonwealth's objectives, says Dominic Slowey, a spokesman for the testing company. National Evaluation Systems "doesn't give invalid tests," he says. The company plans to publish a full technical report next month.
The state education department also "stands behind the tests' reliability and validity," says Alan Safran, the agency's chief of staff. The department has held briefings for teacher-trainers to share test questions and answers, he notes, although some professors and deans still complain that they haven't gotten enough feedback on their students' performance for it to be useful.
Walter Haney, a testing expert at Boston College's respected Center for the Study of Testing, says professional standards call for the test developer to produce a technical manual documenting the characteristics of a test before it's given.
Review by thousands of educators, while helpful, isn't the same thing, "especially when it's being made operational in a high-stakes context," he says. "In my view, [the Massachusetts exam] was developed so quickly and without any reasonable pilot-testing that you've got to wonder about the technical quality of the examination."
To Silber, the complaints from higher education--"loaded with pretension and not competence"--miss the mark.
"Where did we decide, in order to validate exams in the English language of whether you can spell, write with correct grammar, and summarize an article with decent competence, that we have to go through an elaborate process of validation?" he asks.
As for the subject tests, Silber believes the mathematics and science tests are solid. But in broader areas like history, improvements can be made, he says.
Rep. Harold M. Lane Jr., the House chairman of the joint committee on education, the arts, and humanities, agrees. Lane, a Democrat who spent 20 years as a principal and has been heavily involved in the testing battle, would like to see an independent evaluation of the exam and its validity.
While the reading and writing tests seem reasonable, he says, the subject-matter tests "may be too broad."
Vol. 18, Issue 15, Pages 30-35. Published in print: November 25, 1998, as "Test Questions."