Assessment

NAEP Assigns No Grades on Science Exam

By Millicent Lawton — May 07, 1997 5 min read

The nation’s science report card came out last week, but it lacks an important feature--grades.

U.S. Department of Education officials issued the results from the 1996 National Assessment of Educational Progress in science without the now-customary explanation of how well students performed against set levels of basic, proficient, and advanced achievement. That omission leaves the students comparable only with one another.

The executive committee of the National Assessment Governing Board, the independent citizens’ panel that sets policy for NAEP and defines the achievement standards, decided April 15 to delay releasing the part of the report that makes a judgment about student performance.

The move drew jeers from some observers, who said it left the report bereft of helpful information. But it earned the support of others, who underscored the gravity of taking care on such an important task.

Michael J. Guerra, a governing-board member, said in a prepared statement that the board simply wasn’t ready to sign off on describing what students should know and be able to do in science.

“The board has not quite finished the job of weighing all the factors involved and making a final decision on what the levels should be,” said Mr. Guerra, who is also the executive director of secondary schools at the National Catholic Educational Association. NAGB plans to release the data on how students did relative to achievement levels in the fall, said Roy Truby, the governing board’s executive director.

The National Center for Education Statistics forged ahead without the performance standards because “we’ve made a commitment to bring out the data on our assessment on an earlier schedule than has heretofore been the case,” said Arnold A. Goldstein, a policy analyst at the NCES, an arm of the Education Department. “We wanted to release what was available at this time in order to get the information to the public.”

The national assessment is the only ongoing, nationally representative measure of U.S. students’ achievement in several core subject areas. Mandated by Congress, it has been given since 1969 and is a project of the statistics center.

A total of 22,616 public and private school students in the 4th, 8th, and 12th grades took the 1996 science exam. Forty-three states and the District of Columbia participated.

New Structure

Because the science test, last given in 1990, is a newly revised one based on a new content framework, the stakes are high for the governing board to be confident of the levels of achievement it expects from students and to be able to describe those levels in a way compatible with its legal charge. According to statute, the standards must be “reasonable, valid, and informative to the public.”

This year, for the first time, the exam included hands-on science tasks or experiments. The exam asked students to spend 80 percent of the testing time on open-ended questions--such as those related to hands-on work--in which they write in their own answers, rather than on multiple-choice queries.

That change complicated the board’s task of deciding what made student work “good enough” at each grade level, said those familiar with the process, in part because some of the answers could earn partial credit.

In addition to weighing outside advice from teachers and others, the board had to consider how its standards for U.S. science achievement stack up against the ones established as part of the Third International Mathematics and Science Study. The 41-country results of how 8th graders did on the science part of TIMSS were issued last fall. (“U.S. Students About Average in Global Study,” Nov. 27, 1996.) But the board has not seen the 4th grade results on TIMSS; that report is slated for release next month.

“We felt it was better to get it right than get it quick,” Mr. Guerra said.

Results Disappoint

What the report, “NAEP 1996 Science: Report Card for the Nation and the States,” relays is students’ ability to tackle certain types of questions.

Ninety percent of 4th graders could identify items that conduct electricity, but fewer than half understood what causes windows to rattle during a thunderstorm. Fewer than one-fourth of 8th graders could identify areas that have warm summers and cold winters. About 90 percent of seniors could discern, given some data, which planet has the longest year, but only slightly more than 10 percent could understand the role of trees in the water cycle.

By the end of high school, students seem to have some grasp of basic facts and principles, can read graphs, and can carry out directions to do simple experiments, NAGB member Michael T. Nettles said in a statement.

“But when NAEP asks students to go beyond that--to apply scientific knowledge to a new situation, to design an experiment themselves, or to explain their reasoning clearly--the results are disappointing,” said Mr. Nettles, who is a University of Michigan professor and the executive director of the Frederick D. Patterson Research Institute of the College Fund/UNCF.

Senta A. Raizen, the director of the National Center for Improving Science Education, based in Washington, said she was encouraged that 8th grade boys did not significantly outperform girls, as has been the case on previous NAEP science assessments. As earlier exams have found, the gender gap is not apparent in the 4th grade on this latest test, but it persists in the 12th grade.

Ms. Raizen also said it was not surprising that students didn’t do as well on the kinds of open-ended queries and hands-on tasks advocated by the voluntary national science standards, because “many of the reform goals haven’t, as yet, been incorporated.”

But without the achievement-level information, said Chester E. Finn Jr., a former NAGB member, the report “won’t be very useful.”

“Imagine looking at a thermometer after it has come out of your mouth,” said Mr. Finn, who is the John M. Olin senior fellow at the Hudson Institute, “and seeing the column of mercury but seeing no scale, no little lines, no numbers.”

The report, minus the criterion for determining how well students did, recalls NAEP before the board introduced achievement levels in 1990. But, Mr. Finn argued, the national assessment then “was not very interesting to legislatures or governors or anybody else because it never could answer the question, ‘How good is good enough?’”
