
Tests Reveal Varied Facets of U.S. Students’ Competitiveness

By Sarah D. Sparks — August 18, 2011

As states ponder the evolution of assessments for accountability under the No Child Left Behind Act, experts are weighing in on what national and international tests can tell us about what American kids are learning.

A study released yesterday by Harvard’s Program on Education Policy and Governance and Education Next compares U.S. students who performed at or above the proficient level in math on the National Assessment of Educational Progress (generally dubbed the “Nation’s Report Card”) with 15-year-olds tested through the Program for International Student Assessment, administered by the Organization for Economic Co-operation and Development. Researchers developed a crosswalk between the two tests using a sample from the graduating class of 2011, which took the 2007 NAEP as 8th graders and the 2009 PISA as 15-year-olds.

Researchers led by Paul E. Peterson, director of Harvard University’s Program on Education Policy and Governance, found that America’s math gap is not limited to particular student groups. For example, 42 percent of non-Hispanic white students scored proficient in math, well above the 15 percent of Hispanic students and 11 percent of African-American students who did so. Yet internationally, the percentage of math-proficient white American students trailed that of all students in 16 countries, including Japan, Germany, and Canada. In Korea and Finland, the proficiency gap between white American students and all students in those countries exceeded 25 percentage points.

“The U.S. ranks number 1 in self-esteem when it comes to math and number 32 in performance,” Peterson said yesterday during a briefing on the study. “I think the U.S. has not figured out how to motivate students to learn mathematics.”

Marshall “Mike” Smith of the Carnegie Foundation for the Advancement of Teaching said he thinks policymakers should be concerned not just by differences in America’s rankings on international tests, but by what those differences mean for what is taught in American schools.

“Different tests require different generalization and transfer of knowledge,” Smith said during a discussion at the Knowledge Alliance’s annual Big Ideas conference in Queenstown, Md., last week. Students’ performance on state content assessments tracks most closely with state standards and curricula, so improvements show up more quickly on those tests. NAEP is broader and requires more extended responses to problems, but it still hews closely to the subject curricula.

“There’s almost no transfer of knowledge; you’re doing what you’ve done before and what you’ve been prepared to do,” he said.

By contrast, Smith said, PISA requires students to transfer their knowledge from one subject area to another and use it in new ways.

For example, the Harvard study highlighted two sample math questions at the proficient level:

From NAEP (8th grade):
“Three tennis balls are to be stacked one on top of another in a cylindrical can. The radius of each tennis ball is 3 centimeters. To the nearest whole centimeter, what should be the minimum height of the can? Explain why you chose the height that you did. Your explanation should include a diagram.” This is followed by five choices.

From PISA (15-year-olds):
“Mark (from Sydney, Australia) and Hans (from Berlin, Germany) often communicate with each other using ‘chat’ on the Internet. They have to log on to the Internet at the same time to be able to use chat. To find a suitable time to chat, Mark looked up a chart of world times and found the following: [In clock form] Greenwich, 12 Midnight; Berlin, 1 a.m.; Sydney, 10 a.m. At 7 p.m. in Sydney, what time is it in Berlin?”

In a briefing on the Harvard study, Peterson also noted that the NAEP test “is more of a pencil-and-paper test in mathematics ... whereas the PISA test is more taking real-world questions and trying to come up with answers to them.”

That should be what really keeps educators up at night, Smith said. “PISA asks you to do something different from what you’re being asked to do on these other tests: different kinds of items, a different way of structuring the items, and a different way of thinking about them. It’s not a good sign for American students not to be able to transfer their knowledge from one setting to another setting.”

A version of this news article first appeared in the Inside School Research blog.