E.T.S. Revises National Assessments To Make Them 'More Useful'
Over the next two months, some 30,000 13-year-old students from across the nation will be evaluated on their proficiency in reading and writing by the National Assessment of Educational Progress.
But the sample group and the nature of the assessment will be different this year from previous years. Since the Educational Testing Service last February won the $19-million, five-year federal contract to administer the assessment, it has introduced a number of changes into the program, many designed to make its results more useful to local educators evaluating the effectiveness of their academic programs.
The new NAEP design responds to "new social and environmental changes" and takes into account current national concerns that "focus on performance standards, school-effectiveness questions, and broad human-resource issues," according to A New Design for a New Era, a booklet on the assessment produced by the ETS.
Estimates of Performance
The report notes that the assessment will be changed to permit estimates of performance and patterns for 4th, 8th, and 11th graders as well as for 9-, 13-, and 17-year-olds. Previously, the assessment reported performance in terms of age only, not by grade. This policy stemmed from the late 1960's, when the nationwide evaluation was established, according to the testing service, in response to concerns that a national assessment tied to schools would encourage comparisons between school districts and states, possibly leading to greater federal involvement in curriculum and governance.
Now, the concern that the assessment would result in a national curriculum and be a threat to the "local control" of states and districts has abated, ETS officials say. Their NAEP study notes that the inclusion of grade-level sampling will permit local educators "to link national assessment results to school practices, state and local assessments, and educational policies, most of which are typically tied to grade level." (ETS officials note that it is important to maintain the age distinction because many students are over the average age for their grade level, and reporting by grade level only would tend to distort results.)
Limited English Proficiency
Moreover, the assessment will now gather information about students whose proficiency in English is limited and students who are functionally handicapped in order to develop assessments for them. Beginning with the next round of assessments in 1985, NAEP will also separate Hispanic students by their major cultural subgroups--Puerto Rican, Cuban, and Mexican-American.
Samples of adults and out-of-school 17-year-olds, beginning in 1985, will be "reintroduced into the assessment by cost-effective means that also link the exercise performance levels of these groups to labor-force participation data and employment trends," the report notes.
"Originally, the sampling was designed basically to gather data that would indicate differences between regions of the country (the North, South, Northwest, and Southwest); between the sexes; between ethnic groups (black, white, and Hispanic); between large metropolitan areas and smaller areas; and also in terms of at least two levels of socioeconomic status based on the education of parents," said Samuel Messick, one of the authors of the report.
"The prime emphasis was on single exercises that committees of curriculum experts determined would measure skills important for 9-, 13-, and 17-year-old students to know about reading, science, and mathematics," he said.
"The assessment noted only the percent of students able to complete a specific exercise in the belief that the exercise clearly captured an important educational objective," Mr. Messick added. "But testmakers have learned that few exercises carry any one objective."
The new assessment will present a series of exercises that can be analyzed by computer to measure a number of learning factors in various combinations. A technique called "balanced incomplete block spiraling" will bring to the assessment a way to examine "all possible pairs of items" and "the interrelationship between all questions," according to Mr. Messick.
For example, the test will allow evaluators to determine how closely reading performance is related to writing skills and, ultimately, to solving mathematical problems, according to Mr. Messick.
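The article describes the design only in broad strokes, but the pairing idea behind a balanced incomplete block layout can be illustrated with a minimal sketch. The seven item blocks, the booklet layout, and the labels below are hypothetical examples, not NAEP's actual design: no single booklet contains every block, yet every pair of blocks shares at least one booklet, so relationships across blocks can still be estimated from the combined sample.

```python
from itertools import combinations
from collections import Counter

# Hypothetical example: 7 item blocks (A..G) assembled into 7 booklets of
# 3 blocks each -- a (7, 3, 1) balanced incomplete block design (the
# classic Fano-plane layout). Each student takes one booklet; booklets are
# then "spiraled" (handed out in rotation) across students in a session.
booklets = [
    ("A", "B", "C"),
    ("A", "D", "E"),
    ("A", "F", "G"),
    ("B", "D", "F"),
    ("B", "E", "G"),
    ("C", "D", "G"),
    ("C", "E", "F"),
]

# Count how often each pair of blocks appears together in some booklet.
pair_counts = Counter(
    pair for booklet in booklets for pair in combinations(sorted(booklet), 2)
)

all_pairs = list(combinations("ABCDEFG", 2))  # 21 possible block pairs

# The "balanced" property: every pair of blocks co-occurs exactly once,
# even though each booklet holds only 3 of the 7 blocks.
assert all(pair_counts[p] == 1 for p in all_pairs)
print(f"{len(booklets)} booklets cover all {len(all_pairs)} block pairs once each")
```

The payoff of such a layout is the one Mr. Messick describes: because every pair of blocks is answered by some common group of students, evaluators can relate performance on any one set of items to performance on any other without requiring any student to sit through the full item pool.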
In the report, the ETS researchers say that NAEP should provide data to answer questions about school effectiveness as well as about whether students are learning skills adequate to meet society's needs or their own career goals.
Tool To Identify Problems
The study says that if NAEP is conceived "not merely as a social indicator, but as a tool to identify problems and suggest areas of ... research concerning educational progress," it should attempt to provide data that address a wide range of policy issues.
The ets report suggests that the assessment should try to determine: whether students in programs requiring minimum competencies or graduation-test requirements perform better academically than other students; how pupil-teacher ratios appear to relate to achievement; how preschool and kindergarten experiences affect student achievement; and how particular curricular approaches relate to student achievement in various fields.
In these and other areas, "timely analyses of the achievement data in relation to relevant background and program variables should suggest provisional interpretations and promising leads that merit further research or special NAEP probe studies," the report says.
Instead of assessing one or two subject areas each year, NAEP will now assess four subject areas every other year, according to the report. This arrangement will cost less and give evaluators the power to examine relationships across subject areas.
"What is lost is collecting new data every year," Mr. Messick said. "But we feel that we will more than compensate with improved ability to examine trends."
The NAEP report is available for $6.75 from the National Assessment of Educational Progress, Box 2923, Princeton, N.J. 08541.
Vol. 03, Issue 07