Assessment

Data Reanalysis Finds Test-Score Edge for Private Schools

By Mary Ann Zehr — August 08, 2006

Harvard University researchers last week publicized findings that call into question the methodology of recent studies concluding that public school students did as well as or better than their private school peers on some standardized tests when scores were adjusted for certain student characteristics.

Paul E. Peterson, a professor at Harvard’s John F. Kennedy School of Government, found that when he and graduate student Elena Llaudet reanalyzed data from the National Assessment of Educational Progress (NAEP) using different variables to adjust for student characteristics, students at private schools came out on top of those in public schools in almost all areas.

That conclusion was nearly the opposite of the findings of a study recently released by the U.S. Department of Education, as well as of an earlier study by two University of Illinois professors. (“Public Schools Fare Well Against Private Schools in Study,” July 26, 2006.)

The report, “On the Public-Private School Achievement Debate,” is available from the Program on Education Policy and Governance at Harvard University.

In all three studies, researchers adjusted for characteristics such as race and socioeconomic status, but they based the adjustment on different information that had been reported to NAEP.

Mr. Peterson said that none of the three studies can conclude with any confidence that one group of schools does better than the other, because the NAEP data provide only a snapshot of how students did on tests at one point in time, rather than what they learned over a period of time.

“We aren’t offering this study as definitive evidence,” Mr. Peterson said. “We’re offering it as strong evidence that the methods used by the other two studies are defective.”

Henry Braun, the senior author of the report from the National Center for Education Statistics (NCES) and a senior educational researcher for the Educational Testing Service in Princeton, N.J., acknowledged that Mr. Peterson has raised some important issues regarding the variables used in the NCES study. But he said the variables used by Mr. Peterson are equally problematic.

“Because of the variables he’s using, it may be that he is underadjusting for disadvantage in the public school sector,” he said.

Christopher Lubienski and Sarah Theule Lubienski, a husband-and-wife research team at the University of Illinois at Urbana-Champaign who published a study in January that Mr. Peterson is revisiting, argued in an interview last week that the variables chosen by Mr. Peterson are flawed and inferior to the ones they used.

Different Classifications?

The Harvard team relied largely on information about student characteristics reported by the students themselves, rather than information reported by public and private school administrators. Mr. Peterson contends that in comparisons of public and private schools, data reported by administrators based on their schools’ participation in federal programs, such as the subsidized school lunch program, are not reliable, because the two kinds of schools have very different involvement in those programs and classify their students in different ways.

Ms. Lubienski acknowledged that classification differences between public and private schools pose a problem. But she argued that the Harvard team is “throwing the baby out with the bath water” by excluding data, such as whether students are identified as having limited proficiency in English or have individualized education programs, when controlling for student background.

She said some of the variables Mr. Peterson accounts for also have flaws. He controls for the education level of students’ parents, for example, which Ms. Lubienski sees as a problem because some 4th graders who reported that information to NAEP likely don’t know their parents’ education levels.

Mr. Braun added that Mr. Peterson’s use of parental education to adjust for socioeconomic level is flawed because he didn’t account for such nuances as whether both parents or only one has a college education.

The federal study, released July 14 by NCES, found that when data are adjusted for student characteristics, 4th and 8th grade public school students perform as well as or better than private school students in reading and math, with the exception of 8th grade reading, where children in private schools do better than their public school peers.

Those results were similar to the Lubienskis’ findings, though their study looked only at NAEP math scores.

A version of this article appeared in the August 09, 2006 edition of Education Week as Data Reanalysis Finds Test-Score Edge for Private Schools
