The Atlantic Monthly has a terrific article on how strike zone cameras accidentally ruined baseball. The story goes like this: by 2009, Major League Baseball had installed strike zone cameras in every major ballpark, and it started using those cameras to evaluate umpires. It turned out that umpires had been particularly unwilling to call strikes down by the knees, 18-30 inches above the plate. Prior to the cameras, human umpires, batters, and pitchers had collectively defined a common understanding of the strike zone, and this new technology rewrote it. The cameras didn’t just reduce inconsistency; they created new kinds of consistency. Umpires started expanding the strike zone downwards. This new space for pitchers to work in has contributed to declining hits, declining slugging, and, as author Derek Thompson argues, more boring games. Many factors, like the increase in faster pitchers and the crackdown on steroids, are involved, but baseball’s ratings are now 25-35% lower than in the middle of the last decade.
It was a timely article, because in the first week of my course Massive: The Future of Learning at Scale, we’d been talking about socio-technical systems: systems composed of humans, computers, and other machines. “The Simple Technology that Ruined Baseball” is another great example of how the introduction of technologies into systems can have all kinds of unintended effects. There are two particularly helpful readings that I turn to for thinking about these kinds of things. One useful set of ideas is Actor-Network Theory, where folks like Bruno Latour have argued that humans and non-humans do not merely coexist in systems; these non-humans are not just our “tools,” and they often act in systems as if they have agency independent of their designers and users. My hunch is that the MLB brass wanted to reduce inconsistencies in umpiring, but not to shift the balance of the game in favor of pitchers. The camera, however, has its own compulsions. It wants the strike zone programmed into it, even if that programmed definition ends up renegotiating tacit agreements among humans. The tool does more than the tool-wielder anticipates. (You can make your own analogical leaps to what kinds of inconsistencies might have been reduced, and how the strike zone has been changed, in educational systems shaped by measurement and testing regimes.)
Another useful insight on these matters comes from Larry Cuban’s recent book Inside the Black Box of Classroom Practice, where he argues that one of the most common mistakes of education reformers is to confuse the complicated with the complex. Education, he argues, is more akin to the flight of a butterfly than the flight of a bullet. In complicated physical systems, you can predict how new forces will affect other parts of the system. In complex systems, where feedback loops make parts of the system interact in dynamic ways, new forces often have unpredictable outcomes.
Learning systems, like nearly all socio-technical systems, are more complex than complicated. Educators often add a technology into a system because of one particular feature: it’s now required by the principal, I want to be able to show pictures, the login is easier, and so on. But when we add a technology, we don’t just get the feature we want (like a tool for reducing umpire inconsistency); we get an entire feature set with a whole range of impacts that are often hard to predict in advance. Still, it is easier to anticipate issues and respond to them if we stay alert to the possibility that introducing a technology into a learning system will likely bring more than our original bargain.