Teacher Board Providing Valuable Lessons in Using Portfolios
This is the fifth story in an occasional series that will examine trends in assessment.
In aiming to measure the richness of good teaching, the National Board for Professional Teaching Standards has learned some expensive lessons. One of them proves again the adage that less is more.
The portfolios that candidates for national certification compiled from their work over several weeks, it turns out, overwhelmed both teachers and the assessors who had to read them.
And despite the dozens of pages of commentary and explanation that some teachers churned out, the portfolios did not yield enough information to be worth that kind of effort.
"You only get a glimpse of candidates' practice with these huge exercises," said James R. Smith, the senior vice president of the national board.
A teacher with five classes a day, for example, had to focus on only one for the certification process in the field of early-adolescence/generalist. That package of assessments--which in January yielded the national board's first round of nationally certified teachers--included an exercise called "Teaching and Learning."
Candidates chose one class and wrote a narrative of the teaching and learning process over three weeks. They also included samples of students' work and a videotape of the class.
That exercise alone took four hours to score for each candidate, Mr. Smith said, but ended up capturing a relatively narrow slice of a teacher's job.
Despite the shortcomings, experts have praised the national board's exercises as a leap forward in using performance assessments to evaluate teachers.
"There is no question in my mind that they are dramatically better than anything else out there," said Lee Shulman, a professor of education and psychology at Stanford University who is helping the privately organized board rethink some problems with its assessments.
Part of the power of those assessments lies in the view of teaching that has emerged from the national board's standards. Teachers are expected to be reflective about their practice, for example, and oriented toward helping their students think critically.
But in asking teachers to write about their teaching, assessment developers learned that reflection, however desirable, does not come easily to teachers.
"They are not very good at it, they are not trained to do it, and they don't do it normally," Mr. Smith said. "They can do a pretty good job describing and telling you what happened, but most don't do a good job analyzing."
Other assessments now under development will try to address these problems. Instead of asking teachers to think in general terms about their practice, the new exercises would pose a series of open-ended, but focused, questions.
High school mathematics teachers, for example, might be asked to produce a 20-minute, uninterrupted videotape of themselves teaching students, in a large-group setting, to reason and think mathematically. Then they would respond to questions about what came before and after the lesson.
Another exercise might question teachers about how they would work with small groups of students on a different type of math problem.
The point would be, Mr. Smith explained, to capture samples of different kinds of teaching in different classrooms that would paint a fuller picture of a teacher's work.
"It's harvesting their practice," he said, "rather than requiring them to do things they're not used to doing."
Focus and Limits
Mari Pearlman, a member of the task force helping the board rethink its work and one of the developers of the mathematics assessments at the Educational Testing Service, said developers have learned that they need to be specific with candidates.
"There was an original sense that if you were talking to professionals, you shouldn't be directive," she said. "One of the things we've learned is you have to be directive. Teachers say, 'Just say exactly what is it I should produce here.'"
That includes things like setting page limits on written exercises, which were not included in the first assessments. As a result, teachers wrote wildly varying amounts.
More focused exercises also should prove less overwhelming for classroom teachers, most of whom have precious little time to devote to their portfolios.
Math teachers who looked at the E.T.S.'s original assessments, Ms. Pearlman said, thought the exercises were wonderful but could not imagine doing them.
"That's a bad thing to hear," she said. "You're not satisfying your customers."
Videotapes and samples of students' work have proved to be more compelling evidence about teachers than their own written commentaries, Mr. Smith said. One reason is the lack of corroboration for what teachers write about themselves.
One teacher described herself, for example, as "student centered," yet on videotape asked and answered all the questions during a discussion with her students.
Coupled with well-designed questions about how a teacher set up a classroom session, videotapes are promising tools for assessing teachers, Mr. Smith said. Other sources of evidence can then verify what the teacher says about the videotape.
Assessment developers are learning, by trial and error, how to write directions and ask questions to generate the information they need to best use this technology.
"A videotape of kids presenting a project may be very funny and very charming," Ms. Pearlman said, "but you cannot assess a teacher based on it."
Looking at students' work reveals whether teachers are giving worthwhile assignments likely to produce the deep understanding called for in the national board's standards. Assessors also can see how teachers evaluate students' work and what kind of suggestions they make for improvement.
Because the portfolios are compiled in a short time, however, they do not reveal much about what students are learning, Ms. Pearlman said. Teachers who underwent the assessments for the early-adolescence/generalist certificate, for example, chose "very modest" things to talk about when asked to write about students' progress over three weeks, she said.
Assessment developers at her laboratory, Ms. Pearlman said, received folders "so fat they were almost unmanageable" when they asked teachers to submit samples of the work of five math students at the beginning, middle, and end of a three-week unit.
"The big news was, we didn't learn anything," she said. "If students were really wonderful on the first assignment they did everything well, and if they were a turkey, they were all pretty bad."
Assessment developers also have learned that teachers are a bit suspicious about being tested. Most are conditioned to try to find the hidden goal of a question.
No Hidden Goals
But the national board is assessing teachers against standards for accomplished practice in their field. Candidates are told exactly which standards apply to a given exercise.
These attitudes became apparent when groups of candidates gathered to set the performance standards that determined which teachers would become certified, Mr. Smith said.
"Invariably, they would say, 'Oh, you really did mean what you said in the instructions,'" he recalled. "Somehow, they think there's something else there."
Interviewing teachers about their work at the assessment centers, which candidates visited after completing their portfolios, proved more difficult than expected, Ms. Pearlman said.
Early-adolescence/generalist candidates, for example, were asked questions about videotapes they made of a particular lesson. But because of the difficulty of finding, training, and paying for highly skilled interviewers, talking to teachers about their practice is not as revealing or cost-effective as other assessment methods, she said.
And while some people have urged the national board to include classroom observations and detailed feedback to candidates, financial constraints make those steps unlikely.
One idea that surfaced at Far West Laboratories, which until last month was under contract to produce science assessments, was to have candidates complete their portfolios and assessment-center exercises in two separate steps.
Candidates who scored poorly on the portfolio, for example, could skip the assessment center. That would have saved the organization money.
Steve Schneider, the co-principal investigator of the project, said the San Francisco-based laboratory had also proposed a one-day assessment-center visit, rather than two days.
Over time, said Ms. Pearlman, technological advances--such as using the Internet computer network and compressing videotapes to make viewing time shorter--also could make the assessments less expensive.
The assessments are costly in large part because they are a brand-new technology that the national board has chosen to treat cautiously, assessment experts said. National board officials estimated the cost of assessing the first group of teachers at $4,000 each, far beyond the $975 fee charged candidates.
"What you have at the moment is something very much like a space shuttle, in the sense that every system has a backup system," Mr. Shulman said. "The question now is how far to reduce a lot of that redundancy and keep the assessment safe for everybody to use."
The teachers who sit on the national board's 63-member board of directors feel confident that the assessments reflect good teaching practice, said Joyce Ojibway Jennings, a San Diego teacher and board member.
The Detroit-based organization also has received praise from teachers who took part in field tests.
"They say how good they think it was, that it did tap into what they do every day in the classroom," Ms. Jennings said. "If teachers out there don't see it as credible, it doesn't matter how much we value the test or whether people say teachers should be board certified. For me, that was even more important than the psychometrics."
The "Review Session" series is made possible by a grant from the John D. and Catherine T. MacArthur Foundation.
Vol. 14, Issue 36