Value-Added Measures
Federal
Opinion
John Thompson: New York Reformers Show Their True Colors
Guest post by John Thompson.
If nothing else, the controversy over publicizing New York City teachers' value-added scores has revealed the essence of test-driven school "reform." The contemporary data-driven accountability experiment was begun by idealists who sincerely sought a "civil rights movement of the 21st century" but who were clueless about the realities of urban education, and now it is in shambles. Some honest reformers, like the Washington Post's Jay Mathews, seem to ruefully acknowledge that the bubble-in mania produced "sand castles carefully constructed on the beach."
Teaching Profession
TFA's Kopp Denounces the Teacher 'Blame Game'
Teach for America founder Wendy Kopp caused some buzz (what else is new?) by writing an op-ed in the Wall Street Journal condemning the public release of teachers' value-added scores in New York City. She writes:
School & District Management
Opinion
The Problem with One-Size-Fits-All Approaches to Teacher Quality
Today's debates over teacher evaluation mostly just leave me tired. On the one side, we've got "reformers" who've accurately identified real problems, suggested sensible principles (like we should work to identify teachers who are better and worse at their jobs)... and then rushed to champion crude, inflexible policies that turn good ideas into caricatures.
Teaching Profession
When Value-Added Scores Don't Make Sense ...
In a July 2011 blog post, I pointed out that the D.C. public school system was using test scores from 100 schools still under investigation for cheating to calculate value-added scores that would eventually be incorporated into teacher evaluations. So some D.C. teachers were at risk of having students enter their classes with falsely high scores, which would make it difficult for those teachers to bring students' scores up.
School & District Management
An 'Unsatisfactory' Teacher Speaks Out
A Brooklyn special education teacher published an op-ed in The New York Times this weekend outing himself as a "bad" teacher, as defined by his district-mandated evaluation.
Teaching Profession
Value-Added Confusion in NYC
I'll admit that I myself was poised to blog about the New York Times' finding that teacher quality is widely diffused throughout the city—which Fordham's Mike Petrilli called "jaw-dropping news." Thankfully a few interviews and deadlines got in the way, and I didn't get to it before Philissa Cramer at GothamSchools roundly exposed that the whole thing is "not really news at all."
School & District Management
Opinion
Don't Like Value-Added? Cool. So Pick Your Poison
As regular readers know, much of my writing on value-added dings would-be reformers for getting waaaay ahead of themselves. They're busy trying to build whole systems around tools that are crude, limited, and relevant for only a portion of what teachers and schools do. That's why I find it troubling that "reformers" are in a headlong dash to use these primitive systems to measure everything they can, or to validate everything else (observations, student feedback, etc.).
Curriculum
Arts Educators Float an Alternative Evaluation Plan
Education officials in Tennessee seem to be making good on their promise to find alternate student-achievement measures to be incorporated into teacher evaluations for teachers in nontested subjects—though it's teachers who are doing much of the heavy lifting in getting the idea moving.
School & District Management
Opinion
Doug Harris Crunches Critics in Value-Added Smackdown
The University of Wisconsin's Doug Harris has torched a couple of would-be critics for their inane, inept, and unfair review of his book Value-Added Measures in Education (Harvard Education Press, 2011). For those who appreciate such things, his response is a classic dismemberment of the Education Review take penned by Arizona State University's Clarin Collins and Audrey Amrein-Beardsley. For everyone else, it's important because it sheds light on why it's so damn hard to sensibly discuss issues like value-added accountability. (Collins and Amrein-Beardsley also penned a re-rebuttal, which is fun primarily because it reads like a note from the kid you caught spray-painting your Prius who tells you, "It wasn't me, it wasn't spray paint, I was actually washing your car, and I was only trying to help hide that dent.")
Teaching Profession
Bright Ideas for Teacher Evaluation
Over the last few years, the teacher-evaluation debate has revolved mainly around whether—or to what extent—value-added scores should be involved. Since most researchers and educators agree an evaluation system needs multiple measures, there's also been some discourse around observations—how often they should occur and who should perform them. But for the most part, the same proposals for revamping evaluation systems have been recycled over and over.
Education Funding
Opinion
The State of Teacher Evaluation: Part 2
Yesterday, I shared some interesting facts from the National Council on Teacher Quality's (NCTQ) October 2011 report, "State of the States: Trends and Early Lessons on Teacher Evaluation and Effectiveness Policies" about the evolution of state educator evaluation systems over the past few years. In particular, we learned that between 2009 and 2011, 33 states changed their teacher evaluation policies.