Report Says Texas Tests Aren't Tough Enough
The tests that support Texas' widely acclaimed accountability system started out weak and have been getting worse over time, a group of out-of-state researchers concludes in a report released last week.
By failing to test students at the appropriate grade level in mathematics and developing reading tests that have gotten progressively easier over the past four years, the Texas Assessment of Academic Skills sets low expectations and masks the real academic achievement of Texas students, argues the report, which was commissioned by the Tax Research Association of Houston and Harris County.
"We want our kids to have the full scope of opportunity that is not advanced when you have too much public relations in your accountability system," said George Scott, the president of the tax association, a non-profit group that has been critical of the state education agency in the past. "We want the truth."
But officials from the Texas Education Agency are defending the 4-year-old TAAS, saying Texas students are showing marked academic improvement thanks to hard work, not easy exams. Parallel gains on recent national exams only reinforce the success the state has shown on its own assessments, said Debbie Graves Ratcliffe, a spokeswoman for the agency.
"No one said our tests were too easy when our students weren't doing particularly well on them," Ms. Ratcliffe said. "Now that we're seeing dramatic and huge gains, suddenly the critics are saying it's not hard enough."
The tax association, which is financed by Houston-area companies, decided to look into the quality of the assessment program when it noticed that the improved student-performance rates on the TAAS were not reflected in other exams, including the state-administered, end-of-course algebra tests, Mr. Scott said.
"The system has driven improvement, there's no doubt about that," he said. "All we're trying to do is raise legitimate concerns."
In an analysis of the state's 4th, 8th, and 10th grade reading tests given last spring, Sandra Stotsky, a research associate with Harvard University's graduate school of education, concluded that the reading passages are shorter, the vocabulary is easier, and the sentences are less complex than they were on the original exam, given in 1995.
This year's 4th grade reading exam, for example, had a total of 1,746 words, with an average of 291 words in each of six reading selections. In 1995, by contrast, the test contained 2,824 words, with an average of 403 words in each of seven selections, Ms. Stotsky found.
Employing a "readability" formula to determine the grade level targeted for each reading selection, Ms. Stotsky also found that the 1998 test had more selections geared to lower-level readers than the 1995 test. The 1996 and 1997 tests were also noticeably simpler than the 1995 test, she asserted.
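The report does not specify which readability formula Ms. Stotsky employed. As an illustration only, the widely used Flesch-Kincaid grade-level formula estimates a passage's target grade from average sentence length and average syllables per word; the sketch below (with a crude vowel-group syllable counter, an assumption, not her method) shows how such a formula maps shorter sentences and simpler vocabulary to lower grade levels.

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count vowel groups, discount a trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Applied to successive years' reading selections, a formula like this would score passages with shorter sentences and fewer polysyllabic words at lower grade levels, which is the kind of year-over-year comparison the analysis describes.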
"In the baseline year, they were testing the appropriate grade levels," said Ms. Stotsky, who co-chaired the committee that developed Massachusetts' rigorous new English standards. "But it just deteriorated year after year. These are not high-level expectations."
State education officials dispute Ms. Stotsky's findings, and say that the assessment system includes safeguards that ensure that the tests don't get easier over time. State officials link all new test questions to questions on previous exams to equate the level of difficulty, Ms. Ratcliffe said. "There is a process in place to make sure the hurdle the students go over remains the same year after year," she said.
The Texas Federation of Teachers also faulted the findings. "The TAAS is not becoming easier. It's becoming more difficult and complicated for students," said John Cole, the executive director of the Texas affiliate of the American Federation of Teachers. "These researchers clearly hadn't a clue about the truth."
While the math portion of the TAAS exams has maintained the same level of difficulty over time, the tests are consistently too easy, testing students well below what's appropriate for their grade levels, contends David Klein, a professor of mathematics at California State University-Northridge. As one of three California-based professors who analyzed the Texas math exam, Mr. Klein said he was shocked by the relative simplicity of many of the test questions.
"My daughter is in 3rd grade and I gave some of the [10th grade] questions to her, and she just did them," said Mr. Klein, who is known for advocating a back-to-basics approach to math instruction in California.
Still, he said, the TAAS exam is useful in guaranteeing that students have a minimum level of math proficiency. "It sets the bar low and makes sure everyone gets over it," Mr. Klein argued. "But the public deserves to know that this test isn't saying much."
In taking issue with the report's claims, Texas officials point to their students' attainment on the National Assessment of Educational Progress as evidence that the state assessments are adequately rigorous.
Earlier this month, the Washington-based National Education Goals Panel formally recognized both Texas and North Carolina as model states for raising student achievement on the assessment, which is given to a sampling of students nationwide. The two states posted the largest average gains in the country on seven key NAEP tests between 1990 and 1997. The panel cited Texas' standards-based assessment program as a factor that contributed to the state's improved performance on the national exams.
Still, Texas officials say they recognize that they must continue to fine-tune their assessments. The state is currently revising the TAAS tests to align them with the new curriculum standards the state school board adopted last year. In addition, Commissioner of Education Mike Moses and other state officials were scheduled to meet this week to discuss plans for continued improvements on the tests.
"We want to keep working and growing. But we absolutely stand by our test as it is," said Ann Smisko, the state's associate commissioner for curriculum, assessment, and technology.
Vol. 18, Issue 12, Page 13