
Groups advocating national standards and a related system of assessments have suggested that states could develop their own examinations, linked to the standards, that produce results capable of being compared, or "calibrated.''

But because different exams are designed differently and measure different things, this solution may not be practical, the E.T.S. report suggests.

"It isn't possible to construct one-and-for-all correspondence tables to 'calibrate' whatever assessments might be built in different clusters of schools, districts, or states,'' the report concludes.

However, it argues, some less ambitious ways of comparing student performance on different exams would be possible.

Copies of the report, "Linking Educational Assessments,'' are available for $6.50 each, prepaid, from the Policy Information Center, Educational Testing Service, 04-R, Princeton, N.J. 08541.

To provide educators in the United States with a glimpse of the type of mathematics performance top Japanese students are asked to demonstrate, the Mathematical Association of America has published a set of university-entrance examination questions.

The short-answer, machine-scored questions are taken from the University Entrance Center Examination, a test required for students applying to public universities, which enroll about 30 percent of Japan's college students.

In 1990, students answered about 70 percent of the questions correctly.

Copies of "Japanese University Entrance Examination Problems in Mathematics'' are available for $7.50 each from the M.A.A., 1529 18th St., N.W., Washington, D.C. 20036.

The reading test used in New York City schools provides an inaccurate and incomplete picture of students' reading abilities, a study concludes.

Based on classroom data from 61 teachers in 18 elementary schools, the study found that children with the lowest scores on the city's Degrees of Reading Power test could in fact read books that were substantially more difficult than their test scores indicated.

The findings suggest that "no decisions about individual students or about schools should be made solely on the basis of any standardized multiple-choice test given once per year,'' said Beth J. Lief, the executive director of the Fund for New York City Public Education, which conducted the study jointly with the New York public schools. --R.R.

Vol. 12, Issue 17
