New Test Measures Students’ Reading ‘Power’

By Susan Walton — May 25, 1983 12 min read

In 1982, the Boston Public Schools began administering a reading test that yields no complex subscores of “skills” and makes no mention of the grade level at which the students are reading.

That first year, officials gave the test to only a small number of students. Earlier this year, all 3rd through 12th graders in the system took the test. After the scores are analyzed, they will be given to the students’ current teachers, and later to their teachers the following year. Both are expected to use the scores to help devise strategies to improve students’ reading comprehension.

The test is called the Degrees of Reading Power (drp). Sponsored by the College Board, it is a relative newcomer to a field that is frequently criticized for tests that tell teachers little about the progress of individual students or how well students understand what they read. The latter issue has received more attention lately in light of national surveys showing that students’ abilities to interpret material lag far behind their “basic skills.”

Boston was the first school system to adopt the drp, and although College Board officials can provide no statistics, they say use of the program is growing. New York State, Connecticut, and, most recently, Washington State, have begun to use it on a statewide basis.

Differs Significantly

The drp differs significantly from the other reading tests on the market in several ways, program officials say. Students are given short passages of increasing complexity with a word left out. Each of the four words provided as possible answers would make sense grammatically and within the meaning of the individual sentence. But only one word will make the sentence mesh with the meaning of the rest of the paragraph.

It is this format that allows those scoring the tests to tell whether the student understands the passage. Once the student receives his or her score, the teacher can match it to the scores of books and other materials whose “readability” has been calculated on the same scale. Both the student’s level and the book’s score are given in “degrees of reading power.” The scale goes from 15 to 100 degrees.
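The matching step described above can be sketched as a simple lookup, assuming a hypothetical catalogue of rated titles (the titles and ratings below are invented for illustration, not actual College Board data):

```python
# Both students and materials are scored on the same 15-100
# "degrees of reading power" scale, so matching is a comparison
# on one axis. The catalogue entries here are hypothetical.

CATALOGUE = [
    ("Title A", 48),
    ("Title B", 55),
    ("Title C", 62),
    ("Title D", 71),
]

def readable_titles(student_drp, catalogue=CATALOGUE):
    """Return titles whose difficulty rating does not exceed
    the student's score on the shared scale."""
    return [title for title, rating in catalogue if rating <= student_drp]

print(readable_titles(60))  # a 60-degree reader can handle Title A and Title B
```

In practice a teacher would also weigh a student's interests and the quality of the text, a point program officials make later in the article; the single-number comparison is only the starting point.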

The three basic components of the program are the tests, the analysis of materials by the College Board, and a recently developed inservice effort, in which board consultants work with local teachers. The board issues an annual catalogue that gives the drp rating of books and other materials; about 2,000 textbooks have been analyzed to date. The board charges $375 to analyze a book for a school district or a publisher.

“It was designed to answer fundamentally one question: ‘What can you read and how well can you read it?”’ said Stephen H. Ivens, the program’s executive director at the College Board. “It also has norms, but that’s not its purpose. We wanted to measure directly the most difficult text a student can read with comprehension.”

“The key thing is that nobody has ever been able to develop a test where you could report results on a text-difficulty scale,” Mr. Ivens said. “That was the technical breakthrough.”

“It doesn’t give you the detail in diagnostics like some other tests do,” said Oliver W. Lancaster, Boston’s deputy superintendent for curriculum and instruction. “But if you put it together in the composite, it tells you whether you understand what you’ve read. At first, it tells you whether you understand the test. But once you understand, there’s not a lot of magical stuff. Can you or can’t you read? It’s about that simple.”

Gaining Cautious Acceptance

The drp seems to be gaining cautious acceptance in the reading-research community, as well. “I think it probably does represent an incremental improvement in the measurement of reading comprehension,” said Richard C. Anderson, director of the Center for the Study of Reading at the University of Illinois at Urbana-Champaign, who noted that he has not studied the test in depth.

“I can say I am guardedly enthusiastic,” said Mr. Anderson, who is also the president of the American Educational Research Association.

“The problem is,” he continued, “the nation has become so obsessed with testing. We’re placing so much weight on these tests that no test is really up to what we expect from it.”

The program has been criticized on some counts, including the issue of whether anyone besides the College Board can provide a sufficiently precise and consistent readability measure of any given book. But the key concern about the drp seems to be the possibility that teachers and publishers will treat it as an end rather than a means.

Readability Formulas

Researchers cite readability formulas--often accused of “driving” the development of basal readers--as an example of what can happen when an educational innovation becomes too strong a force in shaping the curriculum.

“We’re already seeing evidence that districts are preparing drills on the degrees of reading power,” Mr. Anderson said. “I heard from a colleague that a company is preparing materials to prepare for the drp test. I have to worry about whether that’s educationally desirable.”

Mr. Anderson also expressed concern about the possibility that textbook authors might write their books to suit a given drp level. “That’s going to be a terrible thing if that happens,” he said. “We’re just starting to make some headway in getting rid of some of these simplistic methods. I think it’s almost sinful if people start putting in criteria like that.”

“There’s clearly potential for educational abuse,” Mr. Ivens said. “Clearly, publishers could try to write texts to our formulas. The more serious abuse will come from teachers who think that once they understand drp, they can use it without interjecting their own judgment of the motivations and interests of the students and the quality of the texts.

“I worry that teachers will not exercise the care they need to in evaluating the text,” he continued. “They can’t use the drp as a substitute for all the other things they’re supposed to do.”

The key concepts in the College Board’s reading initiative evolved from the work of several researchers. John Bormuth of the University of Chicago developed the formula on which the drp is based in 1969. Much of the subsequent research was done by Burt Koslin of Touchstone Applied Science Associates in New York, under a contract with the state.

“The state was interested and anxious to find an effectiveness test in reading,” said Winsor Lott, director of the division of educational testing for the state education department, describing the history of the program.

State officials knew of the drp when they started the state’s competency-testing program in the mid-1970’s, Mr. Lott said, but waited until it was further refined. Then, in the late 70’s, the Board of Regents became “rather dissatisfied” with the competency standards and requested more rigorous tests, he said. State officials began phasing in the drp in 1979, and students now take the test in the 3rd, 6th, 8th, 11th, and 12th grades.

The process of refining the test was, New York officials point out, aided by funding from the Carnegie Corporation of New York, which since 1976 has provided about $2 million to the Board of Regents’ research fund.

One of Several Efforts

The test was one of several efforts to develop alternatives to the standardized reading tests commonly used, according to Frederic Mosher, a program officer for Carnegie. It is also the most successful of these efforts, Mr. Mosher said.

The foundation’s interest in the reading program grew out of its broader interest in elementary and secondary education, in particular efforts to help schools “do better by kids they do less well by,” Mr. Mosher said. As part of that program, Carnegie officials looked at “outcomes”--the standardized-test scores.

“As we looked closely, we saw there was a problem in the kinds of outcomes that were being measured,” Mr. Mosher said. “We focused on problems of standardized tests and possibility of finding better ways.”

At that point, Carnegie encountered the work of Mr. Koslin. “I was very well impressed by what he’d produced, and it fit the line we’d begun to pursue in looking at what was wrong with conventional tests,” Mr. Mosher said.

In 1980, with Carnegie continuing to fund research through the Board of Regents, New York gave the program to the College Board. The drp has yet to break even financially and is subsidized by the board, officials say.

Studies Not Yet Complete

The final round of Carnegie-sponsored studies, which is examining the extent to which the use of the drp improves reading comprehension, is not yet complete. But interviews with some of the school officials using the program suggest that it is regarded as a useful tool. School districts use the program in different ways; some test all students, while others may focus on students who need special help with reading, or those in a particular grade.

In Boston, Mr. Lancaster said, the district plans to use the scores to help students improve their reading comprehension. “For youngsters in urban settings, the area where we need to build the most strength is the area of comprehension--understanding and responding to inferences,” he said. “The drp exam does speak directly to that, because you are given passages of increasing difficulty that you have to really understand in order to respond. It’s not based on facts, or on what you’re taught. We feel it’s a much more realistic test. By developing a series of scores [over the years], we feel we’ll get a better picture of what their needs are and what we’ll need to do to improve their reading.”

Teachers and administrators, he said, seem generally to like the program. He noted, however, that inservice was an important part of making it work.

A former administrator in the New York education department, Mr. Lancaster said that Boston probably avoided some of the confusion experienced by New York by making sure staff members understood the program.

In the Clark County, Nev., system, which includes the Las Vegas city schools, district officials are experimenting on “a very small scale,” according to Jean Serum, who coordinates the program for the district’s research and development department.

Next year, they will increase the number of students involved to 6,000, double this year’s total. “We’ve been very happy with the information we get through the test,” Ms. Serum said. “It’s been an excellent tool in placing students.” The district spent about $10,000 on materials for this year’s 3,000 students.

New York officials are also enthusiastic. “I think it’s a tremendous testing instrument,” Mr. Lott said. “I think it’s one of the soundest measures of reading comprehension I’ve seen and far superior to other reading tests.”

“The other side is the instructional side,” he continued, “and the importance of matching materials to level. That will have a really beneficial effect on the teaching of reading in this state, and is already having it.”

Broad Instruction for Teachers

The major drawback, Mr. Lott noted, is that the program requires a lot of training. “It requires really broad instruction for teachers to acquaint them with the entire drp program and how it’s used.” The other drawback, he said, is that the drp requires extensive readability analysis of materials to be fully functional and effective. “Eventually, we’ll reach that state but it’s a big thing,” he said.

Focus on Secondary Students

The District of Columbia is starting out on a small scale, restricting the program to 9th graders in eight schools, according to Helen Turner, supervising director of reading for the district. The district plans to focus on secondary students, she said, since those students tend to need the most work on comprehension.

“We found that many students have trouble at the secondary level,” she said. “There are many higher-level skills that aren’t taught.” Ms. Turner praised the College Board’s choice of inservice speakers, noting that their fees would otherwise exceed the district’s price range.

But although school officials there seem to agree that the drp is a “viable” program, Ms. Turner said, they plan to evaluate the results carefully before deciding whether to continue it next year. “The community is still interested in getting a grade level,” she said. “And we don’t want to just add another test on top of what we have.”

The question about the drp that seems most likely to remain unanswered is whether school districts and publishers can adequately do the readability analysis of textbooks and other materials.

Measuring Performance

The College Board argues--strenuously--that they cannot. “If all we were doing was just analyzing texts for readability, how close the numbers come to each other would be less important,” Mr. Ivens said. “But when you’re measuring students’ test performance, the extent to which someone else makes errors in the calculation of the score jeopardizes the validity of the test. The whole scaling of those test results to the text readability scale is based on the development that we did of the readability scale.”

Idiosyncratic bits of texts, Mr. Ivens said, must be treated consistently. “Fundamentally, how you do all these things doesn’t make any difference if you do it the same way every time it happens. That’s one thing we do, is make sure the formula is applied consistently across texts,” he said.

“We did publish the Bormuth formula,” he noted. “Everyone has access to it, and they can try to do what we do. The real question is whether they can end up with what we end up with.”
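Applying a published formula “the same way every time” is largely a matter of standardizing how its input variables are counted. A minimal sketch of that counting step follows; the tokenization rules and the familiar-word list here are invented placeholders, and a real application would use Bormuth’s published coefficients and word list:

```python
import re

# Stand-in familiar-word list; Bormuth's formula relies on a
# published list, not this toy set.
FAMILIAR_WORDS = {"the", "a", "dog", "ran", "to", "store", "and", "back"}

def text_variables(text):
    """Count the inputs a Bormuth-style readability formula needs:
    mean letters per word, mean words per sentence, and the share
    of words on a familiar-word list. The consistency point in the
    article is that these counting rules must be applied identically
    to every text, or two analysts will score the same book differently."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    letters = sum(len(w) for w in words)
    familiar = sum(1 for w in words if w.lower() in FAMILIAR_WORDS)
    return {
        "letters_per_word": letters / len(words),
        "words_per_sentence": len(words) / len(sentences),
        "familiar_share": familiar / len(words),
    }

v = text_variables("The dog ran to the store. The dog ran back.")
```

Even in this toy version, choices such as how to split sentences or whether to count apostrophes change the numbers, which is why the College Board argues that consistent application matters as much as the formula itself.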

Others, however, respond that the College Board is overstating its case.

‘Nothing But Market Puffery’

“They’re regarded as a group in testing with some savvy, but to say that no one else can do it would be nothing but market puffery,” Mr. Anderson said. “They’re using a fancy new statistical model. It’s probably an improvement on the classical psychometric model, but it’s not magic. There are certainly other individuals, places who could do it.”

“I don’t think that there’s anything that they do that is so sophisticated that it can’t be taught to those of us out here,” said Roger Farr, a reading researcher at Indiana University. Mr. Farr is currently working on a study of the drp.

In Boston, school officials have opted to bypass the College Board’s analysis in favor of one conducted by a principal in the district who uses a computer model.

Mr. Lancaster pointed out that the principal was given 15 books to analyze that, unbeknownst to him, already had College Board ratings. For 11 of the books, he came up with identical scores; four were one point off.

A version of this article appeared in the May 25, 1983 edition of Education Week as New Test Measures Students’ Reading ‘Power’

