Assessment

NAEP Assigns No Grades on Science Exam

By Millicent Lawton — May 07, 1997

The nation’s science report card came out last week, but it lacks an important feature--grades.

U.S. Department of Education officials issued the results from the 1996 National Assessment of Educational Progress in science without the now-customary explanation of how well students performed against set levels of basic, proficient, and advanced achievement. That omission leaves the students comparable only with one another.

The executive committee of the National Assessment Governing Board, the independent citizens’ panel that sets policy for NAEP and defines the achievement standards, decided April 15 to delay releasing the part of the report that makes a judgment about student performance.

The move drew jeers from some observers, who said it left the report bereft of helpful information. But it earned the support of others, who underscored the gravity of taking care on such an important task.

Michael J. Guerra, a governing-board member, said in a prepared statement that the board simply wasn’t ready to sign off on describing what students should know and be able to do in science.

“The board has not quite finished the job of weighing all the factors involved and making a final decision on what the levels should be,” said Mr. Guerra, who is also the executive director of secondary schools at the National Catholic Educational Association. NAGB plans to release the data on how students did relative to achievement levels in the fall, said Roy Truby, the governing board’s executive director.

The National Center for Education Statistics forged ahead without the performance standards because “we’ve made a commitment to bring out the data on our assessment on an earlier schedule than has heretofore been the case,” said Arnold A. Goldstein, a policy analyst at the NCES, an arm of the Education Department. “We wanted to release what was available at this time in order to get the information to the public.”

The national assessment is the only ongoing, nationally representative measure of U.S. students’ achievement in several core subject areas. Mandated by Congress, it has been given since 1969 and is a project of the statistics center.

A total of 22,616 public and private school students in the 4th, 8th, and 12th grades took the 1996 science exam. Forty-three states and the District of Columbia participated.

New Structure

Because the science test, last given in 1990, is a newly revised one based on a new content framework, the stakes are high for the governing board to be confident of the levels of achievement it expects from students and to be able to describe those levels in a way compatible with its legal charge. According to statute, the standards must be “reasonable, valid, and informative to the public.”

This year, for the first time, the exam included hands-on science tasks or experiments. Students were asked to spend 80 percent of the testing time on open-ended questions--such as those tied to the hands-on work--in which they write their own answers rather than choose from multiple-choice options.

That change complicated the board’s task of deciding what made student work “good enough” at each grade level, said those familiar with the process, in part because some of the answers could earn partial credit.

In addition to weighing outside advice from teachers and others, the board had to consider how its standards for U.S. science achievement stack up against the ones established as part of the Third International Mathematics and Science Study. The 41-country results of how 8th graders did on the science part of TIMSS were issued last fall. (“U.S. Students About Average in Global Study,” Nov. 27, 1996.) But the board has not seen the 4th grade results on TIMSS; that report is slated for release next month.

“We felt it was better to get it right than get it quick,” Mr. Guerra said.

Results Disappoint

What the report, “NAEP 1996 Science: Report Card for the Nation and the States,” relays is students’ ability to tackle certain types of questions.

Ninety percent of 4th graders could identify items that conduct electricity, but fewer than half understood what causes windows to rattle during a thunderstorm. Fewer than one-fourth of 8th graders could identify areas that have warm summers and cold winters. About 90 percent of seniors could discern, given some data, which planet has the longest year, but only slightly more than 10 percent could understand the role of trees in the water cycle.

By the end of high school, students seem to have some grasp of basic facts and principles, can read graphs, and can carry out directions to do simple experiments, NAGB member Michael T. Nettles said in a statement.

“But when NAEP asks students to go beyond that--to apply scientific knowledge to a new situation, to design an experiment themselves, or to explain their reasoning clearly--the results are disappointing,” said Mr. Nettles, who is a University of Michigan professor and the executive director of the Frederick D. Patterson Research Institute of the College Fund/UNCF.

Senta A. Raizen, the director of the National Center for Improving Science Education, based in Washington, said she was encouraged that 8th grade boys did not significantly outperform girls, as has been the case on previous NAEP science assessments. As on earlier exams, the gender gap was not apparent in the 4th grade on this latest test, but it persisted between 12th grade boys and girls.

Ms. Raizen also said it was not surprising that students didn’t do as well on the kinds of open-ended queries and hands-on tasks advocated by the voluntary national science standards, because “many of the reform goals haven’t, as yet, been incorporated.”

But without the achievement-level information, said Chester E. Finn Jr., a former NAGB member, the report “won’t be very useful.”

“Imagine looking at a thermometer after it has come out of your mouth,” said Mr. Finn, who is the John M. Olin senior fellow at the Hudson Institute, “and seeing the column of mercury but seeing no scale, no little lines, no numbers.”

The report, minus the criterion for determining how well students did, recalls NAEP before the board introduced achievement levels in 1990. But, Mr. Finn argued, the national assessment then “was not very interesting to legislatures or governors or anybody else because it never could answer the question, ‘How good is good enough?’”
