DIBELS Involved in ‘Reading First’ Controversies
Although teachers in the Moriarty, N.M., public schools report positive experiences with the Dynamic Indicators of Basic Early Literacy Skills, or DIBELS, the assessments have generated considerable controversy nationally.
The assessment tool, developed by researchers at the University of Oregon, is now approved for use under the federal Reading First program in 45 states to monitor student progress on reading fluency and other measures.
But a contentious hearing before the U.S. House Education and Labor Committee probed allegations that the widespread use of DIBELS may stem, in part, from inappropriate promotion of the tests by federal officials as part of the rollout of the $1 billion-a-year Reading First program. ("House Panel Grills Witnesses on Reading First," April 25, 2007.)
A report by the U.S. Department of Education’s inspector general, released in March, suggested that a federal contractor did not appropriately screen consultants, some of whom had financial ties to DIBELS, for conflicts of interest. An earlier IG report concluded that the Education Department appeared to promote DIBELS over other assessments during workshops designed to help state officials complete the rigorous Reading First grant application.
The University of Oregon researchers who developed DIBELS served as advisers on the design of three department-sponsored Reading Leadership Academies in winter 2002 and the resource materials that were handed out at them. They also presented sessions at the events and later were consultants on implementing Reading First.
In addition, the inspector general found some evidence that officials in several states may have been directed to adopt DIBELS rather than the assessments they’d initially selected for use in Reading First schools. The inspector general cited conflicts of interest involving several federal consultants with financial ties to DIBELS who were sent to advise states that ended up including those products in their grant proposals.
Federal officials and consultants have said that they acted properly in their decisions involving DIBELS. At the April 20 hearing, however, House Democrats charged that the developers of DIBELS profited from the advice they gave to federal and state officials for Reading First.
Meanwhile, critics also charge that DIBELS’ ability to measure students’ reading skills is being oversold.
“First of all, it’s a very narrow instrument,” said Samuel J. Meisels, the president of the Erikson Institute for Advanced Study in Child Development, in Chicago. “It measures all kinds of fluency: initial-sound fluency, nonsense-word fluency, oral-reading fluency, word-use fluency, and then combinations of those fluencies, which they call comprehension.
“Essentially, what we’re talking about is speed,” he continued, “and, in the case of the nonsense words, reading out of context,” which he said is especially a concern for beginning readers who do not come from literacy-rich environments.
At least one study questions whether the assessment tool is a good measure of whether students understand what they read. The study, by researchers at Michigan State University, focused on the use of DIBELS among 3rd graders in one small school district in the Midwest.
It found that DIBELS Oral Reading Fluency scores did predict performance on the TerraNova, a standardized achievement test, although students’ performance on DIBELS accounted for less than 20 percent of the variability in those scores. The study also found that students scored poorly on their ability to retell stories they had read, suggesting the tests may be sending a message that reading rapidly is more important than reading for comprehension.
The authors, G. Michael Pressley, Katherine Hilden, and Rebecca Shankland, suggest a need for more studies of DIBELS by scholars not associated with the test, including research on how well it predicts performance on measures reflecting the full range of reading skills.
Though studies have found that DIBELS predicts students’ scores on state reading tests, Mr. Meisels argued that the studies he’s seen involve too few students to generalize from the results.
So why do teachers like DIBELS? “There’s always a great deal that’s gained by a one-on-one assessment,” Mr. Meisels said. “You do get teachers looking at kids and listening to them and using analytical skills. All that’s really great.”
Vol. 26, Issue 35, Page 31