Political activists want change now; those benefiting from the status quo want it to occur as slowly as is required for them to maintain their positions and benefits; politicians triangulate somewhere between what they think is right, what is possible, and what will advance their careers; academics want to do more research; those who have developed products or programs want them to be purchased and used. The media informs the public. The business of public policy analysis is to identify the costs and benefits to society and the allocation of risks and burdens to stakeholders associated with the decisions about different paths that society might take.
As a practical matter, policymaking does not divide up so neatly. Sometimes the positions taken by defenders of the status quo that slow change are the right positions – but it’s hard to tell at the time. Activists and developers use the quantitative tools developed for the social sciences; sometimes analysis bears them out, but more often the matter is no “slam dunk.” Often, the state of the art applied to emerging policy problems is itself subject to great uncertainty – and the tools themselves become an arcane political arena for those who develop and use them. Academics, policy analysts and politicians have their own biases and agendas. The media typically lacks the capacity to distinguish analysis from advocacy and generally prefers to focus on controversy between people and camps. Highly intelligent people are like everyone else, and when emotions run hot they can take policy debate straight down to the ad hominem floor. Random events can have a huge impact on the public mood, pushing policy in directions that bear little relationship to anyone’s understanding of the problem.
I was reminded of all this reading the exchange among eduwonkette, the Quick and the Ed’s Kevin Carey, eduwonkette’s guest blogger Skoolboy and a cast of interested people offering comments on the extent to which value-added analysis is ready to be integrated into formal assessments of teachers’ performance as school district employees.
Readers of edbizbuzz know that I believe it’s coming regardless of whether it’s fair, supported by evaluation, or likely to get the nation to higher levels of student performance. (See here and here.) My view is that it’s inevitable because: 1) it is data that student information systems will produce; 2) it is easier to use this data to make decisions about the part a teacher plays in student learning than the judgment calls managers use today; and 3) it has happened in so many other spheres based on central planners’ input-output conceptualization of work – and especially industry, which is the organizational structure of public education today – that there is no reason to believe it won’t happen here.
The question is not whether, but how fast it will come to dominate. My response has been to move towards teacher professionalism and adaptive management, neither of which was part of this particular discussion. Ironically, what interested me about this debate was the question of when.
Eduwonkette argued that value-added systems are not ready for prime time, and provided a number of plausible reasons why. She did not state how many of these problems needed to be solved before value-added could reasonably become part of teacher performance assessment, or to what degree they needed to be solved. Kevin Carey imputed a 95 percent confidence standard and, under repeated questioning from others, set his own at 51 percent. His argument was that academics like eduwonkette want something close to perfection before changing horses, but the state of American education is such an emergency that he would rather switch to a steed that he believes has a better than even chance of improving on the performance of his current mount.
Carey told us where he stands, and while the academics in this debate would probably accept something less than a system that outperforms today’s 95 times out of 100, he does have a point about the academy’s conservatism. At New American Schools it was very difficult to get our design teams of prominent academics to settle on “final” versions of their Comprehensive School Reform program materials. There was always one more thing, some level of imperfection that appeared overwhelming to the development staff but quite invisible to teachers, let alone to me as a representative of the investors.
If the problem of academics is excessive conservatism, the problem of change advocates is forgetting that every solution creates a new set of problems, and failing to consider potential solutions to those problems before settling on a strategy. Carey’s “damn the torpedoes, full speed ahead” approach to the role of value-added in teacher performance is hardly unique. It typified the early charter school advocates’ decision to get charter schools quickly by creating “multiple chartering authorities” rather than establishing “objective criteria for charter formation” that could be enforced in court – a position Paul Hill, Robin Lake and I argued for at the time, one embodied in the first legislative attempt at charter schools in Washington state and killed in no small part by the multiple-authority crowd. By making the decision to charter a matter of political preference and forum shopping, the movement set itself up for the never-ending parade of stories about questionable charter schools, and created the very conditions that make many state legislators reluctant to raise charter school caps.
Today charter schools are in court debating aspects of state laws that might give their charters an objective basis, but the advocates’ effort to “do something fast” – because they considered public education to be in such dire straits and saw the political opportunity to act – has deprived the objective criteria strategy of its initial potential. It’s not the stellar schools that are going to court.
Value-added teacher assessment systems are coming, and we know they are broken. To paraphrase something a colleague once told me, we also know that only users can tell us which breakdowns are important, and how they might be fixed. My own view is that value-added reviews are needed, but if they are not introduced with some sensitivity to the status quo, they are likely to be constrained by politics and of limited utility in managing school improvement over the longer term.
My response to this foreseeable problem is to use value-added analysis for all the inputs that relate to student outcomes, to put the results and methods on the decision table for all to see, and to use that information to try to narrow the range of disagreement and debate on key school improvement decisions. I would argue that this would move faster if teachers were treated as legally recognized professionals, and if principles of adaptive management were introduced to public education.
It is entirely likely that discussions around this table would begin with the blame game and ad hominem attacks, but it would not be long before the majority of people who want to advance the ball would find that approach unhelpful and unacceptable. Politics would not end, but they could be channeled. We might not all agree, but maybe we could all get along a little better too.
Marc Dean Millot is the editor of School Improvement Industry Week and K-12 Leads and Youth Service Markets Report. His firm provides independent information and advisory services to business, government and research organizations in public education.
The opinions expressed in edbizbuzz are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.