Assessment Opinion

Portfolio Folly

Keeping collections of student work is fine—but unreliable scoring makes the practice anything but effective as an assessment tool.
By Peter N. Berger — March 01, 1998 4 min read

It’s no secret that public schools today lack credibility. We sail from one fad to the next on a tidal wave of jargon, trailing the wreckage of our latest “instructional innovation” behind us, and then wonder why regular people find it tough to believe we know what we’re doing.

Take portfolios, for instance. Here in Vermont they tell us we’re “leading the nation.” If so, the news isn’t good for the rest of the country.

What exactly are portfolios? They’re folders where kids keep their work. If this sounds like less than a breakthrough, that’s only because it is less than a breakthrough.

Not that there’s anything wrong with keeping portfolios. Plato probably carried one to Socrates’ classroom. But advocates today promote portfolios as “accurate measures of educational performance,” offering “credible,” “standardized” data. Unfortunately, portfolio scoring simply doesn’t work.

Portfolios are scored according to rubrics, which detail theoretically “objective” standards and criteria. In Vermont, for example, writing is judged in five “dimensions”—purpose, organization, details, voice, and grammar, usage, and mechanics—and awarded one of four scores in each category: Extensively, Frequently, Sometimes, or Rarely. These scores translate into a four-point scale. Innovators naturally insist that portfolio 4.0’s and 3.0’s have nothing to do with old-fashioned 4.0’s and 3.0’s, also known as A’s and B’s.

Suppose you’re trying to rate the details in a piece of writing. If you think the details are “explicit,” you score it Extensively. If they’re only “elaborated,” of course, then you just give it a Frequently. Rating an essay’s organization requires that you distinguish between problems that “affect unity or coherence”—these rate a Sometimes—and those that “make writing difficult to follow”—a Rarely. Scoring the voice used in writing is even easier. All you have to do is detect the difference between a “distinctive” tone, an “effective” tone, an “attempt” at an “appropriate” tone, and an “appropriate” tone that’s “not evident.” Next you can decide if the piece contains errors that “may distract the reader” or errors that “interfere with understanding.”

This is “objective”?

By the way, portfolio backers promote all this on the grounds that parents like the idea of clearly defined standards. Unfortunately, most teachers—and most regular people—have a hard time deciding with statistical reliability whether they’re being “distracted” or whether their understanding is being “interfered” with.

In order to “calibrate” their judgment with that of the “experts,” teachers score sets of writing samples called benchmarks. Inconveniently, Vermont teachers statewide traditionally disagree with experts’ benchmark scores more often than they agree with them. Even the “experts” commonly disagree with each other. Which scores are the right scores? You tell me.

In an effort to eliminate the benchmark warfare that has plagued Vermont’s scoring system from the start, officials introduced “line pieces.” These new samples were supposed to indisputably define the absolute best a kid’s composition could be before it deserved the next higher score. Unfortunately, this remedy rested on the breathtaking assumption that raters previously unable to agree on the difference between an “interference” and a “distraction” would somehow be able to agree on the difference between the very best “interference” and the very worst “distraction.” Not surprisingly, line pieces haven’t helped clear things up.

As a result, portfolio scoring remains notoriously unreliable. Even judges rating the same paper disagree on the score it deserves. The RAND Corp., the state’s former assessment consultant, found initial Vermont scoring reliability “so low that most planned uses of data had to be abandoned.” RAND described scattered improvements in succeeding years as “slight” and “trivial,” with some reliability figures actually declining and “large margins of error” consistently contaminating data. RAND’s final report cited only “limited evidence from other programs that reliable scoring of writing portfolios is practical.”

Portfolio debacles aren’t limited to Vermont. A report commissioned by Kentucky’s legislature blasted that state’s portfolio-based system as “seriously flawed,” concluding that “the public is being misinformed” about statewide results and “misled about the accomplishments of individual students.” Portfolio programs are also proving far costlier than anticipated. In the words of one RAND analyst, the “early infatuation” with portfolios was “unrealistic.”

Enthusiasts frequently exaggerate the role portfolios themselves play in learning. It’s wrong to credit portfolios with the benefits delivered by sound writing and problem-solving strategies simply because kids employ these techniques to complete the work for a portfolio. And it’s certainly wrong to misrepresent teacher support for keeping portfolios as an endorsement of portfolios as an assessment tool. You can use portfolios in your classroom without ever engaging in the pseudo-objective scoring, the voluminous record keeping, and the ghastly expense in both money and classroom time.

Portfolios were devised to provide “meaningful, useful data.” They don’t. That’s why they should be abandoned as assessment tools. The last thing our schools need is more sound and fury signifying nothing. Addiction to fashion and blindness to folly won’t build anyone’s confidence in public education.

A version of this article appeared in the March 01, 1998 edition of Teacher Magazine as Portfolio Folly
