
Data Reanalysis Finds Test-Score Edge for Private Schools

By Mary Ann Zehr — August 08, 2006

Harvard University researchers publicized findings last week that call into question the methodology of recent studies concluding that students at public schools did as well as or better than their private school peers on some standardized tests when scores were adjusted for certain student characteristics.

Paul E. Peterson, a professor at Harvard’s John F. Kennedy School of Government, found that when he and graduate student Elena Llaudet reanalyzed data from the National Assessment of Educational Progress using different variables to adjust for student characteristics, students at private schools came out on top of those in public schools in almost all areas.

That conclusion was nearly the opposite of the findings of a study recently released by the U.S. Department of Education, as well as those of an earlier study by two University of Illinois professors. (“Public Schools Fare Well Against Private Schools in Study,” July 26, 2006.)

“On the Public-Private School Achievement Debate” is available from the Program on Education Policy and Governance at Harvard University.

In all three studies, researchers adjusted for characteristics such as race and socioeconomic status, but they based the adjustment on different information that had been reported to NAEP.

Mr. Peterson said that none of the three studies can conclude with any confidence that one group of schools does better than the other, because the NAEP data provide only a snapshot of how students did on tests at one point in time, rather than what they learned over a period of time.

“We aren’t offering this study as definitive evidence,” Mr. Peterson said. “We’re offering it as strong evidence that the methods used by the other two studies are defective.”

Henry Braun, the senior author of the report from the National Center for Education Statistics (NCES) and a senior educational researcher for the Educational Testing Service in Princeton, N.J., acknowledged that Mr. Peterson has raised some important issues regarding the variables used in the NCES study. But he said the variables used by Mr. Peterson are equally problematic.

“Because of the variables he’s using, it may be that he is underadjusting for disadvantage in the public school sector,” he said.

Christopher Lubienski and Sarah Theule Lubienski, a husband-and-wife research team at the University of Illinois at Urbana-Champaign who published a study in January that Mr. Peterson is revisiting, argued in an interview last week that the variables chosen by Mr. Peterson are flawed and inferior to the ones they used.

Different Classifications?

The Harvard team relied largely on information about student characteristics reported by the students themselves, rather than on information reported by public and private school administrators. Mr. Peterson contends that in comparisons of public and private schools, data reported by administrators on the basis of their schools’ participation in federal programs, such as the federal subsidized lunch program, are not reliable, because the two kinds of schools participate in those programs to very different degrees and classify their students in different ways.

Ms. Lubienski acknowledged that classification differences between public and private schools pose a problem. But she argued that the Harvard team is “throwing the baby out with the bath water” by excluding data such as whether students are identified as having limited proficiency in English or as having individualized education programs when controlling for student background.

She said some of the variables Mr. Peterson accounts for also have flaws. He controls for the education level of students’ parents, for example, which Ms. Lubienski sees as a problem because some 4th graders who reported that information to NAEP likely don’t know their parents’ education levels.

Mr. Braun added that Mr. Peterson’s use of parental education to adjust for socioeconomic level is flawed because he didn’t account for such nuances as whether both parents or only one has a college education.

The federal study, released July 14 by the NCES, found that when data are adjusted for student characteristics, 4th and 8th grade public school students perform as well as or better than private school students in reading and math, with the exception of 8th grade reading, where children in private schools do better than their public school peers.

Those results were similar to the findings of the Lubienskis, though the Illinois researchers looked only at NAEP math scores.

A version of this article appeared in the August 09, 2006 edition of Education Week as Data Reanalysis Finds Test-Score Edge for Private Schools
