Educators have long complained that state accountability systems that use only test scores and graduation rates to rank or grade schools oversimplify school success, confuse parents, and mask achievement gaps.
Now, education leaders and lawmakers in states including California and Kentucky have said they want to toss one-size-fits-all school rankings and give the public information on school performance in more areas—similar to the way indicators are compartmentalized on the dashboard of a car.
Supporters hope that a dashboard-style approach imported from the business world, in place of A-F or pass-fail models, will reduce competition between schools and districts, decrease schools’ focus on test scores, and give principals an incentive to target a wider array of areas, from chronic absenteeism to access to the arts.
Interest in data dashboards has gained traction in recent months with the passage of the federal Every Student Succeeds Act, the successor to the No Child Left Behind law. ESSA will allow for more creativity in school turnaround approaches and require states to include an additional performance area in their accountability systems, such as discipline, social-emotional learning, or attendance. Several large urban districts, including Atlanta, Los Angeles, Philadelphia, and San Francisco, already use dashboard-style accountability systems.
“We started with the premise that [accountability systems] should be about learning and continuous improvement and not about finding fault or punishment,” said Rick Miller, the executive director of the CORE districts, a group of California school districts that have been allowed wider flexibility in crafting their own accountability system and which use a dashboard model.
“They should be a flashlight and not a hammer,” Miller said of such systems.
The California state school board voted last week to adopt, starting in the fall of 2017, an accountability system that would publicly report achievement and growth on test scores, school climate, and suspension and graduation rates, among other measures.
Earlier this month, the Council of Chief State School Officers held a webinar on dashboard accountability systems for the 11 states in its Innovation Lab Network, a kind of incubator for accountability and instructional redesign.
How schools are labeled by statewide accountability systems can affect home property values, lead to an exodus of students, and, for those schools deemed failing, mean firing staff members and handing over control to charter operators.
Dashboard systems vary in design and, if done wrong, can further confuse parents and create perverse incentives, principals and state leaders who use the approach say. They also require heavy data collection.
As they move to rework their school accountability systems under the Every Student Succeeds Act, states including California and Kentucky are looking for ways to better display data on school performance. One model is the online dashboard-style report card, which offers an interactive look at a variety of performance measures; Georgia has had such a display since 2012, allowing users to search by school and by indicator.
A dashboard can be presented to school administrators and the public in a variety of ways. Dashboards typically are organized as score cards that list under separate tabs several areas of performance.
The most contentious question is what to do with all the indicators once state departments present them to the public. There’s debate among researchers, educators, and advocates about whether to weight the factors against one another to produce a single score for a school.
“There is no scientifically valid and reliable way to reach this single composite number, or to calculate each specific weight for each metric that would go into computing this single number,” said Michael Kirst, the president of California’s board of education. “For example, how much should suspension count as part of the composite single number compared to chronic absences?”
ESSA, however, still requires that states identify their bottom 5 percent of schools, a potentially difficult task for dashboard systems, whose many factors would have to be combined through a complicated mathematical formula.
Several researchers also warn that too many indicators will make it hard for parents to compare across schools and districts, and that eliminating rankings can ease pressures on educators to change their teaching habits at schools where just a handful of students are proficient on tests.
“How exciting would the Friday-evening football game be if we didn’t keep score?” said Eric Hanushek, a fellow at the Hoover Institution at Stanford University who has studied international school accountability systems and who advocates that schools, districts, and states be compared using one measurement such as a single test.
“We’re moving to a system that makes everyone feel good about themselves,” he said. “But hiding the fact that you’re doing a bad job doesn’t make it better for those kids, it just makes it better for the adults.”
Since 2012, Georgia has evaluated its high schools using a 60-point scale based on more than 18 indicators, including access to the arts, the number of students taking career-readiness courses, and the number missing fewer than six days of school. “It makes a lot of the data available to educators and communities,” said Allison Timberlake, the director of accountability for the Georgia education department. “Schools can really engage with other stakeholders about what they’re working on, where they’re making progress, and where they want to make improvements.
“But there are plenty of challenges,” she continued. “It’s really hard to explain to stakeholders what [our system is] measuring, and it can be cumbersome to get a quick picture of how a school is performing.”
While most states began using accountability systems in the 1990s, the federal No Child Left Behind Act, passed in 2001, required all states to adopt an accountability system that labeled schools based on whether they were meeting the law’s “adequate yearly progress” yardstick, itself based on state standardized tests.
Most states today grade, rank, or categorize their schools based on a combination of test scores and graduation rates. And ESSA still requires that state systems rely heavily on those factors in weighing schools’ performance.
While schools landing at the top of a state’s accountability system are showered with awards, and schools that languish at the bottom receive millions of dollars’ worth of turnaround interventions along with sanctions, schools that land in the middle aren’t given incentives or resources to improve, say experts who have studied the dashboard approach in the business and education worlds.
They also say these sorts of systems create incentives for principals to focus on one or two factors to improve their schools’ ratings and pay less attention to factors that might count less.
What you measure is what you pay attention to, said Harvard business professor Bob Kaplan. After helping design score cards for businesses, Kaplan began consulting with districts in Georgia in the early 2000s on adopting a similar model that detailed the achievement areas board members and superintendents valued. Kaplan, who now consults with hospitals, said evaluating schools is one of the hardest tasks he has confronted.
“What they’re trying to achieve is multidimensional,” he said. “It’s hard to aggregate into a single number. You’re trying to improve students’ knowledge and their creativity. That could be across different subjects and look different based on if a child is analytical or artistically gifted.”
Stephen L. Pruitt, Kentucky’s education commissioner, said that as he toured the state to talk about ESSA, several parents and principals told him they didn’t like the fact that the state’s accountability system allowed schools to be compared with one another.
“If my success depends on me beating you, why am I going to go the extra mile to educate our kids?” Pruitt said. “If I have a high-flying district next door to a struggling district, I want to promote them working together to do better for all of our kids.”
California for years has operated with two accountability systems, one under the NCLB law and another of its own design that ranks schools mostly based on test scores.
In 2014, Democratic Gov. Jerry Brown pushed through a statewide funding formula that provides billions of state dollars based on several factors that encompass student needs, achievement gaps, and inequities, with the hope that it would lead to increased student achievement. The state school board promised to adopt an accountability system that aligns with that formula.
The debate leading up to last week’s vote on that system focused on what education factors the state values and how measurements are displayed to the public. Some argue that the system doesn’t meet federal standards.
Using the Information
Barbara Berman, the principal of E.R. Taylor Elementary School in San Francisco, said her school soared under the No Child Left Behind system and sat in the middle of the pack under California’s accountability system.
When the district adopted a dashboard accountability system, Berman learned that her school’s Latino and black students were absent twice as many days as white students. She now runs a student-absenteeism committee that makes regular phone calls to the parents of frequently absent students and rewards students’ progress with bracelets.
While Berman said the additional dashboard data has helped her staff focus on target areas and effectively close achievement gaps, she finds herself standing in front of the school passing out surveys to parents and pestering teachers to fill out reports during staff meetings. “It takes a lot of coordination,” she said.
But she finds all the data collection is worth it: “It shows to parents, ‘I value your voice. I value your input.’ ”
A version of this article appeared in the May 18, 2016 edition of Education Week as States Eye Data Dashboards as Path to Nuanced Accountability