Education leaders in North Carolina’s Charlotte-Mecklenburg school district are scrutinizing the habits and grades of elementary school students to determine who may fall off track and fail to graduate from high school a decade or more from now.
They don’t need a crystal ball to make predictions. Officials in the 141,000-student district are relying on a “risk-factor scorecard” to help them spot children who are in jeopardy of becoming dropouts and then deploy resources to help them change course.
Charlotte-Mecklenburg’s scorecard system, put in place during the 2010-11 school year, uses data analytics to examine grades, attendance, course failures, declines in grade point average, and disciplinary incidents. It can flag which students are at risk even after only the first few months of kindergarten.
District leaders, principals, and classroom teachers are using the information to make decisions about how to deploy resources all across the district.
“This information is very powerful,” says Scott Muri, the district’s chief information officer. “This helps to inform our decisionmaking process about children, budget processes, and human resources. Decisions at every level can be impacted by this.”
From the moment children enter kindergarten, school districts begin collecting information about them. And for years, many districts have tried to build data systems that organize that information and make sense of it. For some districts—and some states—those systems are finally mature enough to look into the future, by using complex data analytics to predict which indicators mean students may go off track down the line.
Some districts use such predictive analysis as an early-warning system for who is at risk of failing to graduate. Others view it through the lens of higher education to determine which students are unlikely to be college-ready by graduation.
Sixteen states now produce early-warning systems that flag students not on track to graduate from high school and relay that information to districts, and 18 others have plans to institute similar systems, according to Civic Enterprises, a Washington-based policy firm, and the Everyone Graduates Center at Johns Hopkins University, in Baltimore.
“Schools are kind of on overload when it comes to collecting data and talking about data,” says Mindee O’Cummings, a senior research analyst for the Washington-based American Institutes for Research and co-team leader for the National High School Center, which is working with several states and districts to implement the center’s free early-warning system for high school students and is also developing one for middle school students.
“But when they can really apply that knowledge to make a difference,” she says, “I see a kind of rejuvenation of energy around using data.”
‘Work Smarter, Not Harder’
Many of the predictive models start with 9th graders, but others, like the one used in the Charlotte-Mecklenburg district, start in the earliest grades, says Leora Itzhaki, an academic facilitator for the district’s Elizabeth Lane Elementary School.
1. Determine whether your state already has an “early warning system” for using data to predict future student performance. Make use of that system if one is available.
2. Consult research-based risk-factor criteria shown to accurately predict whether a student is on track to graduate. Examples include the criteria available from the free early-warning system at the National High School Center.
3. Make sure teachers and school leaders get predictive data on a regular basis, understand how to interpret the information, and then use it to come up with intervention strategies.
4. If you’re already using a predictive-data system for high school students, determine whether such a program can be expanded to middle and elementary schools.
Itzhaki says the risk-factor scorecard helps teachers take the initiative in preventing students from falling further behind. When a new student comes into the district, for example, the system will automatically let the teacher know how many times the student has transferred schools, or if the student’s family doesn’t speak English at home, or if the child is much younger or older than peers in the same grade—all deemed risk factors by the district data system.
“The scorecard puts that data out there with the click of a button and makes it really clear,” Itzhaki says. “A lot of this is common sense, but having it grouped together helps teachers work smarter, not harder. You don’t have to dig in a folder for all of these bits of information.”
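For readers curious what a scorecard like the one Itzhaki describes might look like under the hood, here is a minimal sketch. The field names, thresholds, and logic are assumptions for illustration only, not Charlotte-Mecklenburg’s actual system.

```python
# Illustrative sketch only: field names and thresholds are assumptions,
# not the district's actual scorecard criteria.

def risk_flags(student):
    """Return the list of risk factors that apply to one student record."""
    flags = []
    # Repeated school transfers are one of the district's stated risk factors.
    if student.get("school_transfers", 0) >= 2:
        flags.append("frequent school transfers")
    # A non-English home language is another.
    if student.get("home_language") not in (None, "English"):
        flags.append("non-English home language")
    # Being much older or younger than peers in the same grade is a third.
    typical_age = student["grade"] + 5  # kindergarten = grade 0, age ~5
    if abs(student["age"] - typical_age) > 1:
        flags.append("age atypical for grade")
    return flags

new_student = {"school_transfers": 3, "home_language": "Spanish",
               "grade": 4, "age": 9}
print(risk_flags(new_student))
```

The point of grouping the checks in one place, as Itzhaki notes, is that a teacher sees all of a new student’s flags at once instead of digging through a folder.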
A strong set of existing research has defined many risk factors for students. For example, researchers say looking closely at credits earned in 9th grade and course grades can accurately predict whether a student is on track to graduate. Absences during the first year of high school are also a criterion that many such systems use.
While many districts use such already-identified factors to build their predictive data systems, others add to them or craft their own.
In the 144,000-student Montgomery County, Md., school district, officials have been using predictive data analysis since 2009, when the district rolled out its “Seven Keys to College Readiness.” The “keys” the district has identified include meeting reading targets in kindergarten and 2nd grade, doing 6th grade math in 5th grade, and having a C or higher in Algebra 1 by 8th grade and an Advanced Placement exam score of 3 or higher by 12th grade.
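A milestone checklist like Montgomery County’s can be pictured as a simple lookup: which keys has a given student met, and which remain? The sketch below paraphrases a few of the keys named above; the data format and code are illustrative assumptions, not the district’s software.

```python
# Illustrative only: milestone names paraphrase the "Seven Keys" described
# in the article; the record format is an assumption.

SEVEN_KEYS_SAMPLE = [
    "reading target, kindergarten",
    "reading target, grade 2",
    "grade 6 math in grade 5",
    "C or better in Algebra 1 by grade 8",
    "AP exam score of 3+ by grade 12",
]

def missed_milestones(student_record):
    """Return the milestones a student has not yet met."""
    return [key for key in SEVEN_KEYS_SAMPLE
            if not student_record.get(key, False)]

record = {"reading target, kindergarten": True,
          "reading target, grade 2": True,
          "grade 6 math in grade 5": False}
print(missed_milestones(record))
```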
The keys were developed, says Adrian Talley, the associate superintendent for the office of shared accountability, by looking at district students who were college graduates and then searching back through their district histories for common factors.
Patte Barth, the director of the Center for Public Education at the National School Boards Association, in Alexandria, Va., who is working on predictive data analysis through a grant from the Seattle-based Bill & Melinda Gates Foundation, says school boards have reported that the information gleaned from such analysis “takes the stress out of decisionmaking.”
“The power in this data is that it makes it much easier to defend decisions and give confidence that districts will get a return on their investment,” Barth says. “It helps them identify where the needs are and to align the resources to those needs.”
That’s what happened in the 6,700-student Washington Local schools, based in Toledo, Ohio, which is adopting the National High School Center’s early-warning tool. The district collected risk-factor information from 8th graders and then gathered additional data on grades and attendance for the same students as they started 9th grade.
The system flagged students who had missed three or more days in the first 30 days of school. Teachers who worked with those students were sent a script designed to help them hit specific talking points when approaching students about concerns over their absences.
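The attendance rule Washington Local used is concrete enough to sketch in a few lines. This is a hedged illustration of the rule as described, three or more absences in the first 30 days; the record layout is an assumption.

```python
# Sketch of the flagging rule described above: three or more absences
# in the first 30 days of school. The data layout is illustrative.

def flag_early_absences(attendance, threshold=3, window=30):
    """attendance maps student id -> list of absent day numbers (1-based).

    Returns the set of student ids whose absences within the window
    meet or exceed the threshold.
    """
    return {sid for sid, days in attendance.items()
            if sum(1 for d in days if d <= window) >= threshold}

records = {"s1": [2, 11, 29], "s2": [5], "s3": [3, 8, 40]}
print(flag_early_absences(records))  # only s1 has 3 absences within 30 days
```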
“It was to let them know that we were paying attention,” says James Nino, the special education co-chair at Washington Local’s Whitmer High School, who is helping to oversee the project on high school risk factors there.
In addition, Whitmer High is starting a mentoring program focused on students the early-warning system has highlighted as being at risk, Nino says. It was helpful to know exactly which students to concentrate on, he says.
“We’re getting a better idea of who we need to service, and trying to make sure we’re not jumping into a mentoring program that is very resource-intensive for all students, when that may not be what every kid needs,” he says.
But it’s important not to stop there, says Marcy Lauck, the manager of continuous improvement for California’s 32,000-student San Jose Unified district. Her district is close to launching a particularly detailed predictor model that takes into account the more typical factors, such as test scores and attendance, but also looks at students’ physical-fitness levels, other health issues, and socioeconomic standing.
Lauck says districts working to predict how students will do must also collect data on the interventions used to move those students back on track.
“The biggest challenge is to understand which interventions are successful for which kids and to collect data on that, too,” she says. “We’re really looking at how to capture that data and quantify it to let us know if we’re being successful.”
A version of this article appeared in the February 08, 2012 edition of Digital Directions as Predicting Performance