
NAEP Assigns No Grades on Science Exam

By Millicent Lawton — May 07, 1997

The nation’s science report card came out last week, but it lacks an important feature--grades.

U.S. Department of Education officials issued the results from the 1996 National Assessment of Educational Progress in science without the now-customary explanation of how well students performed against set levels of basic, proficient, and advanced achievement. That omission leaves the students comparable only with one another.

The executive committee of the National Assessment Governing Board, the independent citizens’ panel that sets policy for NAEP and defines the achievement standards, decided April 15 to delay releasing the part of the report that makes a judgment about student performance.

The move drew jeers from some observers, who said it left the report bereft of helpful information. But it earned the support of others, who said the delay reflected the care such an important task demands.

Michael J. Guerra, a governing-board member, said in a prepared statement that the board simply wasn’t ready to sign off on describing what students should know and be able to do in science.

“The board has not quite finished the job of weighing all the factors involved and making a final decision on what the levels should be,” said Mr. Guerra, who is also the executive director of secondary schools at the National Catholic Educational Association. NAGB plans to release the data on how students did relative to achievement levels in the fall, said Roy Truby, the governing board’s executive director.

The National Center for Education Statistics forged ahead without the performance standards because “we’ve made a commitment to bring out the data on our assessment on an earlier schedule than has heretofore been the case,” said Arnold A. Goldstein, a policy analyst at the NCES, an arm of the Education Department. “We wanted to release what was available at this time in order to get the information to the public.”

The national assessment is the only ongoing, nationally representative measure of U.S. students’ achievement in several core subject areas. Mandated by Congress, it has been given since 1969 and is a project of the statistics center.

A total of 22,616 public and private school students in the 4th, 8th, and 12th grades took the 1996 science exam. Forty-three states and the District of Columbia participated.

New Structure

Because the science test, last given in 1990, is a newly revised one based on a new content framework, the stakes are high for the governing board to be confident of the achievement levels it expects of students and to be able to describe those levels in a way compatible with its legal charge. According to statute, the standards must be “reasonable, valid, and informative to the public.”

This year, for the first time, the exam included hands-on science tasks or experiments. Students were asked to spend 80 percent of the testing time on open-ended questions--such as those related to hands-on work--in which they write their own answers, rather than on multiple-choice questions.

That change complicated the board’s task of deciding what made student work “good enough” at each grade level, said those familiar with the process, in part because some of the answers could earn partial credit.

In addition to weighing outside advice from teachers and others, the board had to consider how its standards for U.S. science achievement stack up against the ones established as part of the Third International Mathematics and Science Study. The 41-country results of how 8th graders did on the science part of TIMSS were issued last fall. (“U.S. Students About Average in Global Study,” Nov. 27, 1996.) But the board has not seen the 4th grade results on TIMSS; that report is slated for release next month.

“We felt it was better to get it right than get it quick,” Mr. Guerra said.

Results Disappoint

What the report, “NAEP 1996 Science: Report Card for the Nation and the States,” relays is students’ ability to tackle certain types of questions.

Ninety percent of 4th graders could identify items that conduct electricity, but fewer than half understood what causes windows to rattle during a thunderstorm. Fewer than one-fourth of 8th graders could identify areas that have warm summers and cold winters. About 90 percent of seniors could discern, given some data, which planet has the longest year, but only slightly more than 10 percent could understand the role of trees in the water cycle.

By the end of high school, students seem to have some grasp of basic facts and principles, can read graphs, and can carry out directions to do simple experiments, NAGB member Michael T. Nettles said in a statement.

“But when NAEP asks students to go beyond that--to apply scientific knowledge to a new situation, to design an experiment themselves, or to explain their reasoning clearly--the results are disappointing,” said Mr. Nettles, who is a University of Michigan professor and the executive director of the Frederick D. Patterson Research Institute of the College Fund/UNCF.

Senta A. Raizen, the director of the National Center for Improving Science Education, based in Washington, said she was encouraged that 8th grade boys did not significantly outperform girls, as has been the case on previous NAEP science assessments. As earlier exams have found, the gender gap is not apparent in the 4th grade on this latest test, but it persists between boys and girls in the 12th grade.

Ms. Raizen also said it was not surprising that students didn’t do as well on the kinds of open-ended queries and hands-on tasks advocated by the voluntary national science standards, because “many of the reform goals haven’t, as yet, been incorporated.”

But without the achievement-level information, said Chester E. Finn Jr., a former NAGB member, the report “won’t be very useful.”

“Imagine looking at a thermometer after it has come out of your mouth,” said Mr. Finn, who is the John M. Olin senior fellow at the Hudson Institute, “and seeing the column of mercury but seeing no scale, no little lines, no numbers.”

The report, minus the criterion for determining how well students did, recalls NAEP before the board introduced achievement levels in 1990. But, Mr. Finn argued, the national assessment then “was not very interesting to legislatures or governors or anybody else because it never could answer the question, ‘How good is good enough?’”
