Testing Smarts: Boston Schools Pilot Urban Assessments
BOSTON--Janan Bristow and Latoya Bacchus, 4th graders at William Monroe Trotter Elementary School here, giggle as they retrieve beads that have fallen to the cafeteria floor.
Working with a calculator, the students are trying to figure out how many beads they could string to form a bracelet worth less than $8.
Then, in a separate experiment, they manipulate geometric shapes to determine their value.
On the other side of the cafeteria, meanwhile, other 4th graders are banging glasses with sticks to see the effect on pitch of filling the glasses with water. Downstairs, another group of students is writing essays about a videotape they have just seen, while a fourth group is retelling stories in their own words into a tape recorder.
All of these students observed here on a Friday late last month were taking a test--one they agreed was much more fun than the tests they were used to taking. But they also recognized that the test had a more serious purpose, and that this aspect, too, differed from their usual tests.
"It shows how smart you are,'' said Janan.
To the Boston teachers and researchers from Boston College's Center for the Study of Testing, Evaluation, and Educational Policy, who put together the new assessment, Janan's view of the test's goal is precisely the point.
Their aim is to come up with a method for urban districts to improve the way they measure student abilities. Now being piloted in seven Boston elementary and middle schools, including Trotter, the assessment is part of a project, led by Boston College and the American Federation of Teachers, that involves 12 cities across the country.
If they are successful, project officials say, they will produce assessments that more accurately tap a range of student abilities than do traditional tests. In a National Science Foundation-funded report released last fall, the Boston College center found that few questions on commercially available standardized mathematics and science tests measured the conceptual knowledge and problem-solving skills reformers are advocating. (See Education Week, Oct. 21, 1992.)
"Those tests 'dumb down' kids,'' Maryellen C. Harmon, a research associate at Boston College, said. "Now, we're changing the tests to 'smarten them up.'''
But the researchers concede that, even if they can successfully develop the new measures, urban schools pose special challenges for implementing them.
For example, they point out, the high turnover rate of administrators in inner-city schools makes it difficult to sustain innovation.
And, they note, funding problems limit teachers' ability to obtain the training needed to put the new assessments in place. In fact, teachers from at least three of the cities involved in the assessment project were forced by tight budgets to pull out of a summer institute Boston College is holding next month for the members.
But despite these problems, officials from the participating cities say the project is a worthwhile addition to local and national efforts to revamp student assessment.
"The more familiarity teachers have with this kind of assessment, the broader their exposure to the instruction we think is important,'' said Karen Scholnick, an administrative assistant in the office of accountability and assessment in the Philadelphia school district, one of the consortium members.
Began as Local Effort
Though now a national effort, the assessment project began here in Boston as a purely local concern.
It started in 1988, when leaders of the Boston Compact, a partnership involving the public schools and local businesses and universities, asked George F. Madaus, the director of the Boston College testing center, to develop a new way of measuring school performance.
The demand for a new measurement system was reinforced the following year, when the school board and the Boston Teachers' Union signed a teachers' contract that called for stricter accountability for schools in exchange for greater authority at the school site.
But in developing the new measures, the Boston officials quickly realized that a national effort was needed, according to Jeannette Hargroves, a public-policy analyst with the Federal Reserve Bank of Boston and a member of the Boston Compact's measurement committee.
"We began to see that, not only does Boston have a problem, but other districts are not any farther along,'' she said.
Teaming up with the A.F.T., which agreed to serve as a communications forum for the project, the researchers formed the Urban District Assessment Consortium and secured grants from the Boston Foundation, the Pew Charitable Trusts, and the John D. and Catherine T. MacArthur Foundation.
The consortium allows districts that are involved in creating new forms of assessments to share ideas and to tap into the expertise of researchers at Boston College and elsewhere. Because of its proximity to B.C. and a history of working relationships, the Boston school district agreed to serve as the initial test site for a new assessment system.
Robert Pearlman, the director of research for the Boston Teachers' Union, said the assessment consortium was a natural outgrowth of the union's Urban District Leadership Consortium, a group of leaders of reform-minded districts.
"The issue of how to do assessment differently is a critical component of anybody's reform efforts,'' he said.
Working with local teachers, the Boston College researchers developed an assessment that was tried out in Trotter Elementary and six other schools here this school year.
It consists of three parts in four subject areas--reading, writing, math, and science. The components, each of which takes 45 minutes to complete, are:
- Questions from the National Assessment of Educational Progress;
- Open-ended questions, both short-answer and long-answer, including some modified NAEP questions; and
- Performance tasks, such as the ones tried out last month at Trotter Elementary.
In addition to their answers, students are evaluated on whether they show evidence of interpreting their results and of working in groups, among other factors.
The student responses will be scored this summer by teams from each school, consisting of teachers, parents, and community representatives.
But the Boston College researchers caution that they have yet to determine whether the assessment will produce sufficiently valid and reliable results, a major concern in the use of alternative forms of assessment.
"I think we're at the stage now,'' said John J. Cawthorne, a senior research associate at Boston College, "where we will find out what is reasonable to reduce to numbers, and what is not.''
But Mr. Cawthorne also noted that, whatever the outcome, the assessment will produce scores only for schools, not for individual students. The assessment was administered using a "matrix sampling'' technique, similar to one used by NAEP, in which each student takes only a portion of the overall test.
In fact, the purpose of the program is to provide information on school programs, not on individual student progress, according to Ms. Hargroves.
"If you want schools to improve, it's much better to use the school as the unit of measure,'' she said.
'What Do We Do?'
Muriel Leonard, the principal of the Trotter school, said the pilot suggested that the assessment will be a boon for her inner-city students, many of whom come from low-income and troubled backgrounds.
Unlike with other tests, she noted, the pupils taking the new assessment were engaged in their work, not disruptive.
"I would think the more active, hands-on the instruction is, the more focused the children are on the task, the fewer the incidents of boredom and acting out,'' Ms. Leonard said.
But Joseph J. Pedulla, the co-director of the project, said the pilot also demonstrated that few teachers currently teach in the way the assessment demands. When asked to set up experiments or to explain their answers, he said, the students appeared at a loss.
"They don't have the experience,'' Mr. Pedulla said. "They say, 'What do we do?'''
Retraining teachers so that they are able to help their students do well on the assessment is a daunting task, particularly in urban districts that are chronically strapped for funds, said Jan Link, the director of testing, evaluation, and research for the San Francisco Unified School District, which is part of the consortium.
"Where is the massive amount of training to come from for teachers to reframe the way they teach?'' she asked.
The consortium itself provides one avenue for teachers, Ms. Link added, by offering them a network of teachers involved in developing new forms of assessment. And the project's summer institutes--the second is scheduled for next month--offer opportunities to work with academic experts.
But Mr. Cawthorne of Boston College pointed out that, because of tight budgets, teachers from Minneapolis, New York City, and Rochester, N.Y., will be unable to attend next month's institute.
"If you add one more thing on [to teachers' responsibilities],'' he said, "the rubber band's going to break.''
Not Jumping on Bandwagons
Mr. Cawthorne added that the turnover of administrators in urban districts also makes implementing the alternative assessments difficult.
With the support of top officials in Boston, he pointed out, the schools participating in the pilot received waivers of requirements to administer the district's criterion-referenced test. But such policies could change if the leaders move on, he said.
"If we get a new superintendent, will we go back to square one? I don't know,'' Mr. Cawthorne said.
Maryellen Donohue, the director of research for the Boston district, said that, because of the difficulties in putting together the new system, the assessment consortium is doing the right thing by taking its time to iron out the wrinkles.
Next year, the project plans to expand the pilot to include more schools in Boston and to add schools in other cities as well, officials said. The officials are negotiating, for example, with two schools in Philadelphia to try out the assessment there, according to Ms. Scholnick of the Philadelphia district.
But, she added, district officials will take the time to work with teachers before putting a new assessment system in place districtwide.
"We are hoping that, by not jumping on bandwagons too quickly, we'll
have a more solid program down the pike,'' Ms. Scholnick
Vol. 12, Issue 38