High school exit exams do not significantly affect dropout rates, according to a study released last week by the Manhattan Institute for Policy Research.
The study, released April 28, also found that neither reducing class sizes nor increasing spending on education increases graduation rates. Research findings on exit exams and dropout rates have been mixed to date, said Jay P. Greene, a senior fellow at the Manhattan Institute and a co-author of the study.
“One of the major concerns that people have is that exit exams will drive low graduation rates even lower,” said Mr. Greene. “That’s a plausible concern.”
But Mr. Greene and co-author Marcus A. Winters, a research associate at the think tank, found through their calculations that while some students may be denied diplomas because they fail exit exams, other students are motivated by the exams to work harder, and some schools respond by serving their students better, which raises graduation rates. The groups offset each other, keeping graduation rates steady, Mr. Greene said.
Exit exams and graduation rates have been the focus of much attention, as some states struggle to implement such high-stakes tests amid criticism. Some parents and students have complained that students who complete required coursework but can’t pass the tests are being unfairly derailed from future plans.
Mr. Greene said many people have assumed the tests were causing more students to drop out of school without getting diplomas, but he said some of those in favor of exit exams see that as an acceptable price for achieving student accountability. Twenty-four states and the District of Columbia either currently have high school exit exams or have adopted measures to implement them in the near future.
The report compares the graduation rates of states with exit exams against the rates of those without them. The authors examined whether there was a change in the graduation rate at the point when a state adopted its exam, and they compared the patterns they found among the states.
Keith Gayler, the associate director of the Washington-based Center on Education Policy, said after reading the report that he doubted that the Manhattan Institute’s study would settle the debate over the impact of exit exams. The report by the New York City-based institute did not take into account other variables that could influence dropout rates, he said.
“There’s only so many data points they can control for,” Mr. Gayler said. The report, he added, “points out again the difficulty in answering the question now … and just means that the debate is still up in the air.”
Last year, the Center on Education Policy released a report saying that exit exams appear to encourage school districts to cover more course content and align that content with state standards. The 2003 report also found that a “moderate amount of evidence” showed that such exams increased dropout rates.
But Michael Cohen, the president of Achieve, a Washington-based nonprofit organization focused on standards-based education, and a former assistant U.S. secretary of education under President Clinton, said the Manhattan Institute study “makes a pretty good case, given the data available and the care they take to analyze the dropout rates.”
The new report, he said, calls attention to an important point: Exit exams are generally not very rigorous and are “not making unreasonable demands on students in terms of the knowledge and skills that they measure.”
“These are tests that all kids should be able to pass,” Mr. Cohen said. Achieve is working on a detailed analysis of exit exams, which should be available in June, he said.
The report by the Manhattan Institute also factored in data concerning class size and per-pupil spending and found that such factors did not affect graduation rates, a finding that Mr. Greene said he found disappointing.
These factors don’t “seem to have an effect on graduation rates for good or bad,” he said. “It would be nice to find something that worked.”
Mr. Greene said he hopes the study will be helpful to states that are either considering modifications to their tests or delaying implementation of them.
“We think this information could be useful in those policy debates,” he said.
SOURCE: Manhattan Institute for Policy Research; Education Week research