The National Assessment Governing Board is exploring whether to use the National Assessment of Educational Progress to measure students' workplace skills.
At its quarterly meeting last month, the board heard a report that weighed the feasibility of using "the nation's report card" to measure the kinds of skills called for in the 1991 report by the U.S. Secretary of Labor's Commission on Achieving Necessary Skills (SCANS).
The results were mixed. According to the study, NAEP provides considerable information about students' basic academic and thinking skills--much of what the SCANS report refers to as the foundation of workplace know-how.
It is less useful, however, for measuring personal qualities such as honesty and responsibility and for gauging how well students can work productively with others, manage resources, acquire information, and understand and master complex systems.
The board took no action on the report, which was prepared by Ina Mullis, the executive director of the Educational Testing Service's Center for the Assessment of Educational Progress. The E.T.S. administers NAEP under a federal contract.
Students' grade levels mean little in terms of mathematics achievement, a study by another E.T.S. researcher suggests.
Looking at the 1990 and 1992 NAEP tests in mathematics, the researcher, Paul Barton, charted the performance of 4th, 8th, and 12th graders on the same proficiency scale. Until 1984, the results for each of those grade levels were reported on separate scales.
He found that 4th graders who scored toward the top on the tests--specifically, those at the 75th percentile--were about even with 8th graders who scored toward the lower end of the scale. The scores of those 4th graders also paralleled those of 12th graders who ranked near the bottom.
"In some sense, then, the term 'grade level' is meaningless in the United States, for it tells little about what students know and can do,'' Mr. Barton writes in the summer issue of the company's newsletter, E.T.S. Policy Notes.
In another article in the same issue of the newsletter, Mr. Barton also examines what trends emerged when the performances of certain groups of students on the NAEP math test were tracked over time. In 1982, for example, 9-year-old girls scored higher on average than their male counterparts. By the time they turned 17 in 1990, however, their performance had fallen below that of the boys.
However, that gap at the older ages appears to be narrowing with successive waves of female students, Mr. Barton notes.
Meanwhile, minority students' scores rose at about the same rate as those of white students from 1982 to 1990. But because white pupils started out, at age 9, with higher scores than their minority counterparts, the performance gap between the groups remained wide at age 17.
"Schools are succeeding at raising proficiency in mathematics about the same amount'' for whites, blacks, and Hispanics, Mr. Barton writes. "What they have not been able to do is make up for differences that 9-year-olds bring with them to school.''
Looking ahead to future NAEP exams, the U.S. Education Department has put together a draft plan for the upcoming reauthorization of the testing program that would allow local test scores to be published.
Under the proposal, the federal government would still be barred from publishing local results, Emerson J. Elliott, the commissioner of the National Center for Education Statistics, told the assessment governing board last month. States and localities, though, would have the option of publishing them.
Other proposed changes in the draft plan, which has not yet been approved by the federal Office of Management and Budget, include: allowing NAEP to be administered every year rather than biennially; making the pilot state-assessment program a formal part of NAEP; expanding the governing board to add more technical experts and members of the public; and allowing the Education Department to require states to share more of NAEP costs.
Mr. Elliott said the Clinton Administration plans to present a bill for the 1995 reauthorization in early fall.
The Baltimore city schools and C.T.B. Macmillan/McGraw-Hill have launched a $750,000 effort to develop a "new wave" of alternative assessments for teachers to use in the classroom.
The new tests are aimed at students in grades 2 through 5. They will consist of "formative" assessments, which take place as students learn, and of "summative" assessments to be given at the end of a year or a curriculum unit.
Nicholas Maruhnich, the director of the company's alternative-assessment program, said the new assessments would "look, feel, and behave" a lot like the end-of-the-year assessments that make up Maryland's statewide alternative-assessment program, which calls for students to undertake "hands-on," collaborative tasks to show what they know.
The state program has prompted many teachers to use similar teaching approaches in their classrooms, said Mr. Maruhnich. But, he added, "not all teachers are doing this good stuff."
"Now we can be sure all teachers will be asking these kinds of questions,'' he said.--D.V.
Vol. 13, Issue 01