Schools Struggle to Use Data to Spark Improvement
Can educational data be used to actually help schools get better?
That's the hope behind a push to bring to K-12 schools a management philosophy known as "continuous improvement" that has flourished in fields such as health care, manufacturing, and social services.
But experts contend that the K-12 education system's current data infrastructure, built in response to the federal No Child Left Behind Act and focused primarily on holding schools accountable, looms as a significant barrier.
"New data systems really have to allow [teachers and principals] to say what they want to accomplish, and determine whether they've accomplished it," said Louis Gomez, a professor of education and information studies at the University of California, Los Angeles, and a co-author of the 2015 book Learning to Improve: How America's Schools Can Get Better at Getting Better. "Just monitoring how we're doing is not going to help us get better."
Examples of what an improvement-based data infrastructure actually looks like in practice are few and far between.
Wisconsin's 4,000-student Menomonee Falls school district is lauded for using a comprehensive approach to steadily improving everything from recess safety to the cleanliness of its classrooms, but educators and administrators there have relied on mostly informal systems for collecting and analyzing information.
The 400,000-student Chicago school district has been at the forefront of efforts to use "early-warning" data to keep high school students on track to graduation and college, but such work typically relies on existing data systems to support a narrowly targeted improvement project. And the New York-based nonprofit New Visions for Public Schools has taken the lead on developing brand new data systems designed to help schools collect and track the kinds of "process data" that continuous-improvement proponents describe as critical, but such tools remain on the fringes of the K-12 sector.
Far more typical are districts like Dallas, where budget woes and leadership churn have hampered efforts to build and use data systems in new ways.
As a field, education has "placed tremendous emphasis on data for accountability purposes," said Mark Dunetz, the president of New Visions for Public Schools. "We've done a much poorer job at providing practitioners with the tools they need to manage the types of tasks that are the day-to-day bread and butter of schools, and that enable everything else to happen."
Little Impact on Teaching
The idea of schools making better use of educational data is nothing new.
Back in 2010, the U.S. Department of Education described data-driven decision making as a "national priority," saying that states and school districts had made significant progress in building a new data infrastructure.
But in a report that year titled "Use of Education Data at the Local Level: From Accountability to Instructional Improvement," the department also concluded that such work was having little impact on actual classroom teaching.
"School staffs' perception of barriers to greater use of data include a sense of lack of time, system-usability issues, the perception that the data in the system are not useful, and district policies around curriculum coverage or pacing that prohibit modifying learning time to match student needs," the report concludes.
Fast forward eight years, and the lingo has changed, but such challenges are still endemic, said Andrew Krumm, the director of learning analytics at Digital Promise, a nonprofit that promotes the use of technology to improve schools.
Too often, Krumm and other proponents of continuous improvement say, schools get caught in a cycle in which administrators and policymakers identify a problem; develop a program or intervention to address that problem; roll it out; and try to determine whether or not it worked. The data used to measure success is typically a long-term outcome measure, such as standardized test results. If scores go up, administrators try to scale the program across other schools and districts—often with little consideration for why the program worked, or how conditions on the ground in other schools and districts might vary.
That approach amounts to "implementing fast and learning slow," Gomez and his co-authors from the Carnegie Foundation for the Advancement of Teaching argue in their 2015 book.
Instead, they argue, schools should be focused on the reverse: implementing improvement strategies more methodically, but learning much more about them as they happen.
Ideally, they say, such a process would entail identifying the problem schools want to fix; developing a theory about how to improve it; and then helping the people closest to the problem—usually teachers, principals, and other school staff—to develop measures of day-to-day progress that are aligned to that theory. Technology tools should help schools monitor three things: whether they're actually doing what they set out to do, whether it's making a difference on the measures that educators developed locally, and how such efforts impact the kinds of long-term outcome measures that are typically used now.
"The most interesting work happens when schools supplement system-wide data with finer-grained, day-in-and-day-out data," Krumm said in an interview. "We need to be collecting data that is much closer to what we're actually trying to change."
Tracking Daily Decisions
That's generally the approach that New Visions for Public Schools is trying to take with the custom educational software it's building.
The nonprofit currently provides its data tools to 342 schools in New York City, including 10 charter schools that it directly operates and several dozen district schools for which the group serves as a "comprehensive lead partner."
The focus is on closely tracking the seemingly mundane decisions that principals, assistant principals, teachers, and guidance counselors have to make each day, then getting them the information they need to make those decisions.
Graduation planning is one big example. Additional functions aim to help counselors match students to internships and other opportunities in the most efficient way possible, or to help administrators track student attendance and participation in after-school activities.
"I think we've historically underestimated the complexity of work in schools, where there is a tremendous volume of decisions that need to be made, and the outcomes are often determined by our ability to be consistent," Dunetz said. "If you don't know whether a classroom had the right textbooks, or whether the computers worked, or if kids showed up, it's very difficult to draw any conclusions about whether what you're doing is working, and why."
Notably, though, Dunetz said New Visions determined it had to build that kind of software from scratch. Very few software applications allow schools real flexibility to decide what data are most important to them, then collect and analyze that information in customized ways.
In theory, at least, the new technologies that have flooded into schools over the past decade could help. Countless digital tools are now part of students' everyday learning, and many are capable of generating reams of data on everything that a student (or teacher) does.
But much of that software still serves accountability purposes, Krumm said.

The experience of the 157,000-student Dallas school system highlights the challenges on the ground to building new data systems to support continuous improvement work.
Back in 2009, the Gates Foundation gave the Dallas district a three-year, $3.8 million grant to strengthen its "college readiness warning system," modeled after Chicago's efforts. The aim was to help teachers use data, pulled primarily from existing student-information systems, to "identify student needs, provide appropriate interventions, and ultimately increase college readiness."
The tool the district developed was "a dashboard pulling up live data" on such indicators as student attendance and access to financial-aid counseling, said Cecilia Oakeley, who started with the district in 2005 and is now an assistant superintendent for evaluation and assessment.
That system took a while to build, Oakeley said. Some schools used it more than others.
And then the grant ran out, the superintendent left, and budget cuts hit.
"The platform went by the wayside," she said. "When there are switches in administration, some work gets lost in the shuffle."
Fortunately, Oakeley said, all was not lost. The district had also built a supplemental data system that allows school staff to examine other relevant information, such as students' grades in core subject areas. District-level staff still use that system to monitor how many students are on track for graduation at the end of 9th grade. And former superintendent Michael Hinojosa, who led the original push for an early-warning system in Dallas before moving on, is now back leading the district, bringing renewed attention to efforts to get school-level staff to use the information, too.
But for now, at least, that's still a far cry from the type of continuous-improvement data systems that proponents envision.
"This is a process," said Gomez of UCLA. "People are coming to understand that all these [data systems we have now] were not created with improvement in mind. This is a first and important step."
Vol. 37, Issue 24, Pages 10-11. Published in Print: March 21, 2018, as Schools Struggle to Use Data to Get Better.