Accountability

A New Accountability Player: The Local Newspaper

By Lynn Olson — June 17, 1998

This spring, the Detroit Free Press announced that it would no longer rank schools and districts based simply on scores from statewide tests.

The newspaper reached its verdict after conducting a six-month computer analysis of results from the Michigan Educational Assessment Program. It found that poverty and other factors outside a school’s control were so strongly linked to test scores that straight-up comparisons were “inevitably flawed” and “mostly meaningless.”

“I think we realized, with some embarrassment, that we never had any business ranking districts based on MEAP scores,” said Tracy Van Moorlehem, the paper’s K-12 education reporter. “It’s just not fair, nor really particularly accurate.”

Instead, the Free Press vowed that from now on it will produce a more nuanced picture of how well Michigan’s schools are doing given the challenges they face. With that commitment, the Motor City daily joins a growing number of newspapers devoting substantial time and resources to special reports on education that go far beyond the mere reporting of test scores.

Many, like the Free Press, are using sophisticated computer techniques to delve into educational data. The Charlotte Observer in North Carolina, the Arkansas Democrat-Gazette in Little Rock, The Seattle Times, and The Philadelphia Inquirer, to name a few, now produce regional report cards on schools.

The newspapers’ reports often surpass the documents produced by states and districts in their level of detail, sophistication, and accessibility, and most are available on the World Wide Web.

A Cautious Welcome

For educators, who are already pressured on several fronts by demands for greater accountability, these reports create both new challenges and new opportunities.

Many teachers and administrators say they welcome the potential for a deeper, more complete picture of education. But they also worry that some of the analyses may be as misleading or incomplete as the raw rankings they replaced.

“I congratulate any newspaper that attempts to put some of this stuff into context,” said Linda Leddick, the director of research, evaluation, and assessment for the Detroit public schools. “For too long, they’ve been running test results as if they’re scores in a horse race, and that does not help the public understand.”

But, she cautioned, “these kinds of articles can be particularly dangerous if newspapers are just going to start throwing test data into hoppers and doing projections and not being careful with them.”

‘Basic Public Service’

The news media’s penchant for producing report cards stems, in part, from the public’s hunger for information about schools.

“Many people today shop for school districts as intensively as they shop for homes,” said Neill Borowski, the director of computer-assisted reporting and analysis for The Philadelphia Inquirer, which published its first report card last September.

The public’s appetite for information comes as newspapers have greatly expanded their ability to sift through large amounts of data, thanks to the advent of personal computers, Web sites, and computer-savvy reporters and editors like Mr. Borowski.

Last year, the Education Writers Association, a Washington-based professional organization of reporters, held a session on report cards at its annual conference.

The National Institute for Computer-Assisted Reporting, which trains journalists in how to analyze databases, regularly focuses on school reports as part of its training sessions. “Almost every major regional paper now is making some attempt to look at school reports on their own,” said Sarah Cohen, the training director for the group, based in Columbia, Mo. “I think they consider it a basic public service.”

Since 1996, The Seattle Times has published an annual report on schools in its region. This year, the 256-page book includes statistics on more than 530 public and private schools. Readers can also access the information on the Web.

In a few seconds, parents can determine which high schools push their students to take algebra, or which schools assign at least three hours of homework a night.

‘Didn’t Exist’

The Charlotte Observer’s report card includes snapshots of about 500 public schools in its area, including such “top-20 lists” as which schools have the most 3rd graders reading at or above grade level and which have improved the most on the state’s U.S. history test.

Last month, The Los Angeles Times published a special series on California’s 8,000 public schools that combined information from dozens of databases. Among its findings: Dropout rates are down, and students from all racial and ethnic groups are taking more college-preparatory courses than in the past. And the newspaper found more than 1,000 schools that failed to move a single student out of bilingual education last year.

Eva Baker, the co-director of the Center for Research on Evaluation, Standards, and Student Testing at the University of California, Los Angeles, which worked with the Times on the series, said such reports meet a public need.

“People are pretty cynical,” she said, especially about government agencies. “I think they believe that the newspapers are not likely to whitewash something.”

Some newspapers are going beyond test scores and databases to conduct surveys and poll their readers. The Seattle Times and The Philadelphia Inquirer, for example, send questionnaires to school districts, either to collect information that the state does not collect or to publish it in a more timely fashion.

Newspapers also combine databases to make comparisons across districts or schools that often aren’t available elsewhere. The Los Angeles Times merged data sets from the state education department and the University of California and California State University systems.

“What we needed to do was to put together something that didn’t exist,” said Richard O’Reilly, the newspaper’s director of computer analysis. The resulting combined database “pulled together all of the data from the three sources into a single record per school per year.”
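
In database terms, that kind of consolidation is a join on shared keys. Below is a minimal sketch in Python of how extracts from several agencies might be merged into one record per school per year; the file names and column names are hypothetical, not the Times’ actual data.

    import pandas as pd

    # Hypothetical extracts from three agencies; file and column names are illustrative only.
    test_scores = pd.read_csv("state_test_scores.csv")   # school_id, year, avg_score, ...
    uc_data = pd.read_csv("uc_admissions.csv")           # school_id, year, uc_admits, ...
    csu_data = pd.read_csv("csu_admissions.csv")         # school_id, year, csu_admits, ...

    # Join on the shared keys so each school-year ends up as a single record.
    combined = (test_scores
                .merge(uc_data, on=["school_id", "year"], how="left")
                .merge(csu_data, on=["school_id", "year"], how="left"))

    combined.to_csv("combined_school_records.csv", index=False)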

Leveling the Field

To measure the effects of poverty and other nonschool factors on achievement, the Free Press and other newspapers use such sophisticated statistical techniques as multiple regression analysis.

Such methods can determine to what extent variations in test scores are related to differences in such factors as family income, student mobility, or limited English proficiency. The findings are used to create projections of likely test results for a school or district based on its student population.

Schools or districts whose actual test scores are much better than predicted are judged to be particularly effective at serving their students.
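
The arithmetic behind that judgment is a regression fit followed by a comparison of actual and predicted scores. Below is a minimal sketch of the general approach in Python, assuming a hypothetical school-level table with illustrative column names; it is not any newspaper’s actual model.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical school-level data; file and column names are illustrative only.
    df = pd.read_csv("school_data.csv")  # school, test_score, pct_low_income, mobility_rate, pct_limited_english

    # Regress test scores on nonschool factors such as poverty, mobility, and limited English proficiency.
    X = sm.add_constant(df[["pct_low_income", "mobility_rate", "pct_limited_english"]])
    model = sm.OLS(df["test_score"], X).fit()

    # Compare each school's actual score with the score its student population would predict.
    df["predicted"] = model.predict(X)
    df["residual"] = df["test_score"] - df["predicted"]

    # Schools scoring far above their prediction are the "beating the odds" candidates.
    print(df.sort_values("residual", ascending=False)
            .head(10)[["school", "test_score", "predicted", "residual"]])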

Based on its study, the Free Press concluded that the Detroit public schools were beating the odds, while some wealthier suburbs could be doing more. Similarly, the Omaha World-Herald last year identified 10 elementary schools that had done far better over the preceding five years than their demographics would have predicted.

In 1996, Texas Monthly magazine rated 3,172 elementary schools in the state based on a combination of test scores and the percentage of children in the free or reduced-price lunch program.

The rating system, which is different from that used by the state, has been criticized by Texas officials as creating confusion. But Gregory Curtis, the editor of the magazine, disagrees.

“We stand behind it,” he said. “It’s simple. It’s straightforward.” And, he added, such information coming from a statewide publication “has a much greater impact than a report from the bureaucracy of the schools.”

‘A Huge Impact’

Many educators praise the attempts by newspapers to put test scores in a larger context.

“I was very pleased that somebody was going to look at something besides raw data and consider the other factors that we deal with every day in schools,” said Jim Anderson, a principal at Floyd Elementary School in Midland, Mich., about the Free Press analysis. The newspaper concluded that his district, Bullock Creek, did about as predicted based on its demographics.

But newspapers face many of the same problems as education researchers: Are they controlling for the right variables? Are they using the most appropriate statistical techniques? Are they reaching premature conclusions, or inferring causal relationships where none exist?

When the World-Herald published its analysis, John Langan, the Omaha school board president, told the newspaper that he feared its identification of some schools as “underperforming” would hurt students, neighborhoods, and teacher recruitment.

The Free Press also has been criticized by some researchers for what they say is an overly simplistic analysis of the nonschool factors that affect test scores. Shawn M. Quilter, an assistant professor of education at Eastern Michigan University, said the newspaper’s report “has a huge impact on administrators and teachers.” Though it is just an analysis, he added, educators “take it as authority and fact.”

Making It Public

Journalists counter that too many states and districts shy away from painful comparisons of schools and districts. Or they conduct such analyses and fail to publish the results.

For years, the World-Herald fought with Omaha school officials over the district’s refusal to provide test scores in a form the paper could analyze. Once the newspaper obtained the data, “we learned that the district does the precise type of analysis that we were doing to watch its schools ... but never made that public,” said Carol Napolitano, a staff writer who handles computer-assisted reporting for the paper.

The Sun newspaper in Baltimore publishes a report card that profiles elementary schools in its metropolitan area. While the information is on the public record, said Mike Himowitz, the paper’s electronic-news editor, “the state has always seen fit to publish this data in a way that doesn’t make it easy to compare one school with another, which is why our reports are so popular.”

Other journalists praise the cooperation of state education officials. “The state people who maintain all the data were actually thrilled that somebody was interested in using it,” said Bill Ristow, the education editor for The Seattle Times.

A ‘Tricky Area’

But educators and journalists alike warn that newspapers embarking on computer analyses must invest the time and money to get it right. Many of the journalists interviewed for this story had spent six months to a year on such projects and had hired expert consultants to look over their shoulders or conduct some of their analyses. The Philadelphia Inquirer has 68 people working on its report cards for New Jersey and Pennsylvania this year.

Laurence T. Ogle, a statistician with the National Center for Education Statistics, an arm of the U.S. Department of Education, says the trend toward such reports is a good one. “But you also have to be a little cautious and make sure people really know what they’re doing when they do statistical analyses.”

Heather Newman, the specialist in computer-assisted reporting for the Free Press, agreed. “I think that this is a real tricky area for newspapers to get into,” she said. “For people who haven’t had an adequate education in the proper use of statistics, it’s really easy to come up with a bunch of numbers and then to make some meaning out of them.”

Numbers Not All

Even under the best circumstances, Ms. Napolitano of the Omaha World-Herald cautioned, “I don’t think the numbers can be the central story.”

One of the strengths of newspapers is their ability to supplement data with more traditional reporting. For its report, the World-Herald sent five reporters into a dozen schools for two weeks to visit classrooms and to interview parents, students, and educators. “The more we learned about test scores and learned their limitations, the more we felt the need to get into classrooms,” said Mike Reilly, the newspaper’s projects editor.

The Philadelphia Inquirer incorporates “points of pride” into its report cards, in which districts identify things they are doing that stand out. “As much as we love the numbers and love the data analysis,” Mr. Borowski said, “there’s a lot of things you can’t capture with the numbers.”

A version of this article appeared in the June 17, 1998 edition of Education Week as A New Accountability Player: The Local Newspaper
