Published Online: July 16, 2012
Published in Print: July 18, 2012, as NAEP Shows Science-Standards Mastery Long Way Off
Includes correction(s): August 27, 2012

Mastery of Science Standards Long Way Off, NAEP Suggests

As state and education leaders work on finishing a set of voluntary national standards aimed at improving the quality of science education, today's students have failed to demonstrate the deeper understanding of core science concepts those new standards prioritize.

The results from the first National Assessment of Educational Progress of both hands-on and interactive computer-based science tasks, administered in 2009 and recently released, found that students in 4th, 8th, and 12th grades performed poorly when asked to execute higher-level problem-solving and critical-thinking skills in real and simulated laboratory settings.

Additionally, results of the 2011 NAEP in science, released last month, found fewer than a third of 8th graders performing at "proficient" levels.

While those outcomes weren't entirely surprising, there is still concern that improving science education may prove more challenging than simply encouraging states to adopt uniform national science standards that call for students to deepen their knowledge of science concepts, understand how those concepts connect, and apply them to the real world. Practical challenges to implementing such changes may be exacerbated by district budget cuts and pressure to plow through material and regularly assess students.

"The results are one of the strongest arguments we have for why the Next Generation Science Standards will be important to implement, state by state. Yet great standards won't mean anything if great curriculum isn't developed," said Susan Singer, a professor of natural sciences at Carleton College, in Minnesota.

"We'll never get at the nature of science if we don't engage in effective lab learning, integrated into the flow of instruction," Ms. Singer said.

Thinking Required

Both the recent hands-on and computer-based NAEP science tests asked students to predict what might happen in a particular scenario, make observations about what occurred, and explain the findings of the experiments or investigations they launched. Those questions examined how well students could conduct and reason through "real life" science situations and grasp the scientific concepts behind what occurred in their investigations, according to the report from the National Center for Education Statistics, the U.S. Department of Education division that administers NAEP.

"Increasingly, graduates are called on to do things in today's world that require more than rote memory and how to follow instructions," Alan J. Friedman, a member of the National Assessment Governing Board, which sets policy for NAEP, said during a conference call about the tests. "There was no way to memorize for this test and no amount of rote drill and practice that could prepare students for it; these tests test what students can do in more complex environments and the richness of what students can do with real stuff."

About 2,000 students at each grade level were given each test and asked to complete two 40-minute hands-on tasks or three interactive computer tasks, 20 to 40 minutes in length. In an 8th grade interactive computer task, for example, students could have been asked to plan a new, simulated recreation area for a town using part of an existing wildlife area, evaluate the impact different locations for the space could have on wildlife, and determine which space would be best to build on.

On average, the students were able to accurately report what was happening in scenarios with limited data, but struggled to manipulate multiple variables and make decisions while running an experiment, according to the findings. In addition, the proportion of students able to draw the right conclusions in experiments was much higher than the proportion able to provide explanations or justifications for their answers based on the findings.

Seventy-one percent of 4th graders could accurately select how volume changes when ice melts, for example, but only 15 percent could explain why that happened using evidence from the experiment.

Professional Development

The findings were fairly consistent across grade levels, other than 12th graders scoring some 15 percentage points lower than the younger students on the interactive computer tasks.


The timing of the results comes as a draft of the new national science standards, which are being developed by a cadre of 26 states and a team of writers led by Achieve, a Washington-based nonprofit group, is in circulation for public comment. Focused on scientific and engineering practices, crosscutting concepts that span science disciplines, and core subject matter in science, engineering, and technology, the standards are expected to be final by early next year. They would require adopting states to invest in improving their science curricula, assessment, and teaching.

While the proposed standards are promising, a lack of time, funding, materials, and teacher training presents challenges, said Gerry Wheeler, the executive director of the National Science Teachers Association.

"Our science classrooms look like classrooms of 40 years ago," Mr. Wheeler said. "The science teacher needs help bringing science into their children's high-tech world."

Part of such professional development should include how to incorporate hands-on, interactive learning and labs into classroom lessons, said Arthur Eisenkraft, a professor of science education and the director of the Center of Science and Math in Context at the University of Massachusetts Boston. Too often, labs are tacked on as a supplementary piece of instruction, rather than blended into learning, said Mr. Eisenkraft, who helped develop the NAEP framework.

As a result, students are too often focused on "what's happening" and "what does it mean" types of questions rather than on "how do we know," he said.

Others, however, maintain that there could be too much emphasis on hands-on and interactive science at the expense of improving students' core knowledge.

"We have to make sure that any standards—whether national or not—include essential science content," said Kathleen Porter-Magee, a senior director at the Washington-based Thomas B. Fordham Institute. "Process skills are important to science, but they must be thoughtfully paired with deep understanding of critical content."

'Fused Knowledge'

While the results of the recent hands-on and interactive computer tasks have some educators worried, others are less concerned.

Nancy Butler Songer, a professor of science education and learning technologies at the University of Michigan, in Ann Arbor, and a longtime researcher on improving science education, said she finds it promising that NAEP officials and national organizations like Achieve are continuing to recognize the need to change science education and build "fused knowledge," or content knowledge plus science practices.

Those current efforts are part of the necessary "pieces coming together" to improve science education, she said, which include professional development to help teachers teach science better, curriculum and standards to guide teaching, and tests to measure how well students are understanding the concepts.

"We've maintained a misconception in what it meant to know science," Ms. Songer said. "While it's taken awhile to uproot this idea, what we know now is that you can't get to a deeper level of understanding in science without working in science in a sophisticated way. You have to use models or gather and apply evidence from experiments to that concept in order to really know science."

Vol. 31, Issue 36, Pages 8-9



Correction: 
An earlier version of this story inadvertently interchanged phrases, changing the meaning of a quote from Arthur Eisenkraft, the director of the Center of Science and Math in Context at the University of Massachusetts.
