Critics' Study Finds Flaws in Teacher Exams
Massachusetts' controversial teacher-licensing tests are unreliable and of such poor validity that they should be discontinued, a study released last week by three critics concludes.
The critics--a testing expert, a professor of early childhood education, and an education writer and consultant--call themselves the Ad Hoc Committee to Test the Teacher Test. They organized last July after massive failure rates on the first few administrations of the exam prompted politicians to denounce the state's education schools and question the quality of their teachers. ("Test Questions," Dec. 9, 1998.)
Committee members produced the 78-page paper on their own time and called a press conference last week at the State House in Boston to announce the findings.
"We did this as scholars and as concerned citizens because we thought what was being done was wrong," said Walter Haney, a senior research associate at the Center for the Study of Testing, Evaluation, and Educational Policy at Boston College.
The report, "Less Truth Than Error?," examines the communications and literacy portions of the Massachusetts Educator Certification Tests, which all prospective teachers must take to earn a license. Teacher-candidates also take exams in the subjects they plan to teach.
The tests were given for the first time last April and have since been administered three more times. On the first round, 59 percent of the test-takers failed at least one portion of the exams, prompting a fierce round of criticism.
The Massachusetts licensing tests have not been externally reviewed, the report says, and neither the state education department nor the testing company has produced technical documentation on the tests' consistency and meaningfulness.
In the absence of such information, the authors were concerned that important decisions were being made on the basis of a "hastily developed" test, they write.
The test's critics initially hoped to compare teacher-candidates' scores on the Massachusetts tests with their scores on other postcollegiate examinations, such as Praxis. But their efforts to gather such data didn't produce a large enough sample. Instead, the study analyzes the scores of a group of prospective teachers who took the test twice because they failed at least one part.
Reasoning that adults' basic skills in reading and writing wouldn't change much in three months, the authors of the study compared the scores of 219 candidates who retook the exams in July. The study found large fluctuations in scores for those people, which its authors say make the tests "highly unreliable."
The tests' margin of error was two to three times as wide as that found on well-developed exams, the authors say. That margin yielded "huge fluctuations" in scores for people taking the tests more than once, including some score changes that the authors deemed "truly bizarre."
"A person who received a score of 72 on the writing test could have scored an 89 or a 55 simply because of the unreliability of the test," the report maintains.
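The arithmetic behind that claim can be sketched briefly. If the report's example of a 72 that "could have scored an 89 or a 55" is read as a roughly 95 percent confidence band of plus or minus 17 points, the implied standard error of measurement is about 8.7 points. (Treating the band as a 95 percent interval is an assumption for illustration here, not the report's stated method.)

```python
# Illustrative sketch only: how a score band follows from a
# standard error of measurement (SEM). The +/-17-point band around
# a score of 72 comes from the report's example; reading it as a
# 95% confidence interval is an assumption made for this sketch.

def score_band(observed, sem, z=1.96):
    """Return the (low, high) band around an observed score."""
    half_width = z * sem
    return observed - half_width, observed + half_width

# Working backward from the example: a +/-17 band at the 95% level
# implies an SEM of about 17 / 1.96, or roughly 8.7 points.
implied_sem = 17 / 1.96
low, high = score_band(72, implied_sem)
print(round(low), round(high))  # prints: 55 89
```

On that reading, any reported score should be understood as the center of a band some 34 points wide, which is the sense in which the authors call single administrations of the test unreliable.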
The authors interviewed 15 test-takers. Their responses, while representing a small, self-selected sample, suggest that problems with the test may have stemmed from the lack of a study guide, confusion over whether the results would count, poor administration of the test, fatigue from its eight-hour length, and questionable content, the report says.
Commissioner of Education David P. Driscoll said the critics had taken "the wrong focus" by attacking the test rather than the poor scores. "The story is the lack of skills among many of those candidates," Mr. Driscoll said. "This is about people not being able to meet a standard."
A technical report is under way, the commissioner noted, and an external-review committee will conduct a thorough review of that report.
Vol. 18, Issue 23, Page 3