Statistics Agency Needs Overhaul, Panelists Assert

Abolition of E.D. Center Raised as Last Option

The Education Department's statistics-gathering agency is so badly flawed that it should either be substantially overhauled or abolished, a new report by a panel of the National Research Council has recommended.

"It is past time for those in positions of responsibility to face up to the risks and dangers d perpetuating the myriad and continuing problems" of the Center for Statistics, formerly known as the National Center for Education Statistics, wrote the panel of 12 nationally prominent statisticians and educators in a report released here last week.

They said that although they "continue to believe strongly that the center still has a future," wide-ranging actions must be taken to improve its image and the quality of the data it produces.

Unless such steps are taken, the panelists concluded, "serious consideration should be given to the more drastic alternatives of abolishing the center and finding other means to obtain and disseminate education data."

Among the steps they suggested are the establishment of an office of statistical standards and methodology within the center; the setting of, and rigorous adherence to, deadlines for key surveys; and a "comprehensive" program to improve the quality and reliability of data obtained from state and local education agencies.

The panelists suggested that most of the recommended changes can be accomplished "without additional large infusions" of federal money, a point that was challenged by officials in the department.

The fiscal 1986 appropriations bill for education earmarked $8.75 million for the center, but that amount was reduced to $8.37 million as a result of the Gramm-Rudman-Hollings deficit-reduction law.

The Senate has passed a fiscal 1987 spending bill that would set the appropriation again at $8.75 million. The Reagan Administration has requested $12.4 million for the agency and $5.9 million for the National Assessment of Educational Progress, which was transferred to the center this year.

Redesign Project

The National Research Council, the principal operating agency of the National Academy of Sciences, was asked to review the center's operations by the agency's advisory council in late 1984 during former Secretary of Education Terrell H. Bell's final days in office.

Its report comes at a time when school-reform advocates are calling for accurate and timely data to justify and measure the effects of the multibillion-dollar infusion of money in the nation's schools over the past several years.

It also appears about 18 months after the center, under the direction of new leaders, initiated a major overhaul of its data-collection system for precollegiate education, and about a year after the reorganization of the office of educational research and improvement, the center's parent office.

The panel members acknowledged the recent changes and saluted the center's director, Emerson J. Elliott, and the department's assistant secretary for research, Chester E. Finn Jr., for their "particularly realistic" outlook with regard to the center's problems.

Decades-Old Problems

The center's chief shortcomings, the panel noted, are essentially the same as those identified by other study groups over the past 30 years: Surveys are frequently "archaic" by the time they are released, and data are often inaccurate or presented in a way that makes them of little use to policymakers and researchers.

The problems, the panel continued, stem directly from historically inadequate funding and a lack of skilled staff. And those problems, in turn, are attributable to the center's lack of prestige and visibility in the Congress and the Education Department itself.

"What the panel found is that there is not much significant difference from the concerns expressed in those earlier reports," said Daniel B. Levine; a fellow of the American Statistical Association and the panel's study director. "It is disturbing that the advice sought of the panels has not been acted upon."

The panel listed timeliness of data as one of the center's chief shortcomings. "Too often," it said, "the center has found itself embarrassed by an excessive and unacceptable time lag between collection and release of data."

The report pointed out, for example, that the vast majority of data presented in the 1985 edition of the Condition of Education are for 1983 or earlier, and in many cases the 1983 data are marked as "preliminary" or "projected."

In their report, the panel members said they were "astonished" to learn at one of their first meetings "that it was usual practice [at the center] to have received less than half of the responses [to major surveys] from the states by the cutoff date specified by the center."

"The panel was further astonished to learn that no procedures, such as an organized follow-up, existed for dealing with noncompliance" with survey requests, it continued.

Quality Control

The problems with data timeliness, the panel noted, have compounded the problems of data reliability.

The panel attributed the poor quality of the center's data largely to their being collected mainly from administrative records maintained at the local level, which record "official," rather than "real," behavior.

In addition, they said, the data are produced by diverse record-keeping systems lacking comparability in definitions and time periods. Data are also presented at such gross levels of aggregation (such as for a state as a whole) that they are all but impossible to check for accuracy, consistency, and reasonableness.

The panel also found that the center lacks written statistical and methodological standards, and that unwritten standards, if any, are not known to its staff. The panel also questioned the technical competence of the staff, noting that, as of March 1985, only 11 of the agency's permanent employees were classified as mathematical statisticians, and none was located in the office of the director.

Recommendations

The panel listed 43 separate suggestions for improving the center's operations. Among its main recommendations are that the center:

  • Get clear support from the Congress and the Secretary of Education as demonstrated by their budget actions.
  • Establish an office of statistical standards and appoint a chief statistician.
  • Establish, publish, and adhere to a set of fixed release dates for selected key education statistics.
  • Undertake with the states an analysis to ensure its programs meet requirements for usefulness, relevance, quality, and reliability.
  • Initiate a "sample-based program of data collection focused on individual classrooms and students to facilitate better understanding of the relationships between educational inputs, processes, and outcomes."
  • Develop standards to guide conduct of all phases of its work.
  • Initiate a comprehensive program to assess quality, consistency, and reliability of data from local agencies.
  • Recruit professional and technical staff in addition to evaluating current staff.

Education Department officials and Congressional staff members were generally supportive of the panel's findings, but questioned its assertion that major changes could be made without a substantial new investment of federal dollars.

"Obviously we don't agree with this, as wejust requested a substantial increase for programmatic changes at the center," said Mr. Finn, the assistant secretary for research. "We are disappointed we couldn't persuade anyone to give us more money. Data don't come for free."

Mr. Elliott, the center's director, pointed out that while he agreed with most of the panel's recommendations, many of the changes were already being made.

"For example, we recently hired a chief statistician, and sample-based evaluations are under way," he said.

While defending the center's current performance as one where "things are changing sharply," Mr. Elliott acknowledged that "the information we have now is out of date."

"What we have now doesn't help us understand new legislation," he said. "Every place I go I run into people who have to make education decisions. Sometimes they're in education, and often they are not, but they have been charged with making decisions in the momentum of education reform.

"There is enormous pressure for more data from a much broader swath of the public than ever before. All kinds of issues have been raised, and political futures of lots of people are at stake. They need to know what happened as a result of this legislation."

The panel noted, and several people familiar with its operations agreed, that the center has not been very successful in garnering support from the Congress, the Office of Management and Budget and, indeed, from the Education Department itself.

"There's plenty of blame to go around," said John F.Jennings, majority counsel of the House Subcommittee on Elementary, Secondary, and Vocational Education.

Nevertheless, Mr. Jennings placed a large share of the blame on the Education Department. He agreed that the center had been "a stepchild" of the Congress, but argued that this stemmed at least in part from the perception that the center is "buried" in the department, and that its staff has little or no control over its activities.

The members of the National Research Council panel are:

Vincent P. Barabba, chairman, executive director, market research and planning, General Motors Corporation; Anthony Bryk, Department of Education, University of Chicago; Roberto M. Fernandez, Department of Sociology, University of Arizona; Christopher Jencks, Center for Urban Affairs, Northwestern University; F. Thomas Juster, Institute for Social Research, University of Michigan; Stephen Kaagan, Commissioner of Education, State of Vermont; Graham Kalton, Institute for Social Research, University of Michigan; Alexander Law, Department of Education, State of California; Richard Light, Kennedy School of Government and the Graduate School of Education, Harvard University; Shirley Malcom, American Association for the Advancement of Science, Washington, D.C.; Andrew Porter, Institute for Research on Teaching, Michigan State University; and Wray Smith, Harris-Smith Research Inc., Arlington, Va.

Vol. 06, Issue 04, Pages 1, 17
