College & Workforce Readiness

States Struggle To Ensure Data Make the Grade

By Jeff Archer — January 15, 1997 9 min read

To many New Jersey residents, the news might seem reason to celebrate.

A report from the National Education Goals Panel last November showed the state’s high school completion rate had increased from 90 percent in 1990 to 92 percent in 1994--an apparent glimmer of hope picked up by some New Jersey newspapers.

But the news left Philip Burch, a research professor at Rutgers University in New Brunswick, scratching his head.

“I’m more than dubious about the figure,” he said.

According to a report from the New Jersey education department, the statewide graduation rate actually dropped from 85.8 percent in the 1992-93 school year to 83.5 percent in 1994-95.

So which figure is right?

As someone who’s spent years scrutinizing New Jersey’s graduation and dropout rates, Mr. Burch knows that both numbers can be right but that they’re measures of different things.

The confusion caused by the two figures in New Jersey underscores the struggle faced in nearly every state by experts trying to find comparable data on such crucial indicators as dropout and graduation rates and student test scores.

And people trying to evaluate a state’s progress often are left with apples and oranges, making any meaningful comparison difficult.

The goals panel’s completion rate, for example, draws on U.S. Bureau of the Census surveys of the number of 18- to 24-year-olds who said they had completed either high school or some equivalent. The New Jersey Department of Education figure, published in an annual report called “Vital Education Statistics,” represents the ratio of graduating students to that class’ statewide freshman enrollment.

Statisticians call the first measure a “status” rate because it asks young adults about their current educational status.

Mr. Burch is critical of the approach because the Census Bureau survey doesn’t distinguish between those who graduated with their high school class and those who dropped out and later got a General Educational Development certificate or completed an adult-education program. He and others also suspect that some survey respondents may falsely say they are high school graduates to avoid embarrassment.
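The difference between the two measures is easy to see in arithmetic terms. The sketch below uses hypothetical numbers (not New Jersey's actual enrollment figures) to show why a "status" completion rate, which counts GED recipients and adult-education completers, can exceed an on-time graduation rate computed against freshman enrollment:

```python
# Illustrative sketch of the two competing measures (hypothetical numbers).

def status_completion_rate(completers_18_to_24, surveyed_18_to_24):
    """'Status' rate: share of surveyed 18- to 24-year-olds who say they
    completed high school or an equivalent (diploma, GED, adult ed)."""
    return completers_18_to_24 / surveyed_18_to_24

def cohort_graduation_rate(graduates, freshman_enrollment):
    """State-report style rate: the graduating class as a share of that
    class's statewide freshman enrollment four years earlier."""
    return graduates / freshman_enrollment

# GED recipients and late completers count in the status rate but not in
# the on-time graduation rate, so the two figures can diverge widely.
print(round(status_completion_rate(92_000, 100_000), 3))   # 0.92
print(round(cohort_graduation_rate(83_500, 100_000), 3))   # 0.835
```

Both numbers can be "right" at once; they simply answer different questions about the same population.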

Mr. Burch said he believes the state education department’s approach is the better measure, but it’s not the one the goals panel uses in its annual progress reports.

To compare states’ graduation rates fairly, the federal panel must ensure that the kind of rate used for all 50 states is the same.

But states collect and report their information in more than a dozen ways. So the goals panel uses the best comparable information it has: the “status” graduation rate reported by the Census Bureau, which surveys populations in each state the same way.

“There probably is a piece that is missing,” said Leslie Lawrence, an education associate with the goals panel. “But now there isn’t data out there that are comparable.”

‘Marginal Information’

The pressure to improve available data has built significantly as the public has demanded that its schools be held more accountable and as school reformers have sought to evaluate the success of improvement efforts.

Some education groups have responded. This month, the Council of Chief State School Officers convened a task force to help draft the kind of comparable graduation rate the National Education Goals Panel doesn’t yet have.

But many observers of education reform contend that the pressure hasn’t yielded results fast enough and that much of the information available remains incomparable or inaccurate, or isn’t yet collected.

“What we have is very marginal information, and part of the problem is that for a long time the schools didn’t want to be held accountable,” said Frank Newman, the president of the Denver-based Education Commission of the States.

“If you don’t present accurate information to the public, they’re going to lose confidence in their institutions,” said Pascal D. Forgione Jr., the commissioner of the National Center for Education Statistics. The U.S. Department of Education’s NCES is charged with collecting information on the nation’s educational progress.

Setting Standards

Though efforts are under way to improve the reporting of both dropout and completion rates, researchers say it’s unlikely that all 50 states will be on board by 2000, the deadline many states have given themselves for showing significant progress in their students’ performance.

With all of these data, the problem is one of definition.

In the 1991-92 school year, the NCES set a standard definition of a dropout: someone who had been enrolled at any time during the previous year but was not enrolled on Oct. 1 of the current year. That year, about 14 states provided dropout information that fit the definition.

This way to measure dropout rates is called an “event” rate, because it examines just one year, although it considers all high school grade levels. An event rate would not tell you how many students dropped out over the four years of high school.

The statistics center found that many states did not enforce an Oct. 1 cutoff date. Others varied in whether they counted students who drop out over the summer as having quit school in the current or previous year. Many didn’t account for students who dropped out and then returned.
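The NCES definition amounts to a simple membership test applied to each student record. This minimal sketch, using hypothetical records with illustrative field names, shows the calculation and why the Oct. 1 cutoff and graduate exclusion matter:

```python
# A minimal sketch of the NCES "event" dropout definition, applied to
# hypothetical student records (the field names are illustrative).

def event_dropout_rate(students):
    """A dropout: enrolled at any time during the previous school year,
    not enrolled on Oct. 1 of the current year, and not a graduate."""
    base = [s for s in students if s["enrolled_prev_year"]]
    dropouts = [s for s in base
                if not s["enrolled_oct_1"] and not s["graduated"]]
    return len(dropouts) / len(base)

roster = [
    {"enrolled_prev_year": True, "enrolled_oct_1": True,  "graduated": False},
    {"enrolled_prev_year": True, "enrolled_oct_1": False, "graduated": True},   # graduated: not a dropout
    {"enrolled_prev_year": True, "enrolled_oct_1": False, "graduated": False},  # dropout
    {"enrolled_prev_year": True, "enrolled_oct_1": True,  "graduated": False},
]
print(event_dropout_rate(roster))  # 0.25
```

A state that ignores the Oct. 1 cutoff, or that counts summer leavers in a different year, is effectively running a different test and producing a number that cannot be compared with this one.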

NCES officials say the number of states that now provide the center with dropout rates based on the common definition has grown to 25, plus the District of Columbia.

The standardized definition of high school completion being designed by the Council of Chief State School Officers will not depend on the status rate provided by the Census Bureau.

“I think [the status rate] doesn’t really tell you the information we want,” said Barbara Clements, who directs the chiefs’ NCES-funded effort. “It doesn’t tell us what our expectations are for schools, and for most parents, that is that kids graduate on time.” She convened the task force of educators, researchers, and policymakers to “come up with a rate we can all live with,” and hopes to have a recommendation by next month.

NAEP and State Assessments

Dropout statistics are not the only area where meaningful information is hard to come by.

Open a 1996 Louisiana education department study, “Louisiana Progress Profiles: State Report,” and look at the section on student performance. As the foreword to the annual report explains, the study is meant to give “policymakers, educators, parents, and the general public a clear and concise overview of public education in Louisiana.”

There you’ll find results for students’ performance on statewide criterion-referenced tests.

The results show that at least 90 percent of the state’s 3rd and 5th graders passed both the math and language arts sections; more than 80 percent of 7th graders passed both parts.

Now look at a similar report published by the Kentucky education department. It shows that just 28 percent of the state’s 4th graders scored at the proficient level or above on its statewide reading assessment.

So which state does a better job educating its students?

The answer is anything but clear, according to Mark D. Musick, the president of the Southern Regional Education Board, an Atlanta-based interstate cooperative effort that works to improve education in member states.

Comparisons are impossible based on these reports because Louisiana and Kentucky use their own state tests, and each has its own idea of how good is good enough to pass.

Mr. Musick recently compared more than a dozen state assessment results with results on the only national ongoing assessment of how well students perform in the core academic subjects: the National Assessment of Educational Progress.

The so-called nation’s report card is a congressionally mandated test managed by the NCES. It seeks to find out how many students have achieved at least basic, proficient, or advanced levels of performance, based on expert judgments about what students in grades 4, 8, and 12 should know and be able to do.

When Mr. Musick compared each state’s NAEP results against its own assessment results, he found wide variations.

For example, in 1994, the last year the NAEP reading test was given to 4th graders, just 15 percent of Louisiana’s students scored at a proficient level, meaning they had mastered challenging content for their grade. This result was among the lowest percentages for the 39 states that took part in the NAEP reading assessment that year.

The same year, 26 percent of Kentucky’s 4th graders achieved at least a proficient score on NAEP--a higher proportion than Louisiana’s, even though the states’ own reports imply the opposite: that Louisiana’s students on average outperform Kentucky’s.

Louisiana has begun designing new assessments with the hope of adopting higher-level standards, according to Louann Bierlein, the governor’s education adviser. The state plans to begin using new tests in spring 1999.

‘Reference Points’

Mr. Musick found other wide variations in comparing several other states.

While no state he looked at had 40 percent or more scoring at the proficient level on either the NAEP reading or writing test, the states were reporting that anywhere between 11 percent and 88 percent of their students reached proficient levels on the states’ own exams.

“It appeared that NAEP was a relatively high standard and that several states were setting relatively low standards,” said Mr. Musick, who also is a member of the National Assessment Governing Board, which oversees the test.

“Overall, I would argue that the kind of things we need are more legitimate reference points,” he said. “States need to be paying more attention to what each other are doing and to the national assessments.”

What concerns him more than the fact that state and NAEP results can be so far apart is the fact that states seem to vary so much in what they consider proficient.

“What does it mean to say that in Kentucky 29 percent of its students are meeting its proficiency standard and Louisiana says 80 percent?” Mr. Musick said. “Do we believe those differences? My answer is no.”

Though the current debate over the reliability of the U.S. Consumer Price Index shows that even long-established economic measures are open to question, education experts said they would love to have the array of data available to their counterparts in economics.

As the chief education number-cruncher at the federal level, Mr. Forgione often envies the economists who generally can depend on the abundant data churned out on the nation’s economic health.

“We are playing catch-up with the education indicators,” he said. “When you realize how much work goes into just getting the CPI, we don’t have anything like that that goes in collecting education data.”

Several key pieces of education-related information are not even collected in most states, although many experts have said they would be good indicators of progress. These include the number of violent incidents in schools, the percentage of students who need remedial help in college, and the portion who have daily access to new technology.
