Corrected: An earlier version of this article misstated the percentage of 4-year-olds who did not show up for the 2016-2017 school year at the CAP Tulsa Head Start program, despite being enrolled. The correct proportion is 20 percent.
What do you do when you build a preschool class—but many of the children never show up?
That’s what happened at the Head Start program overseen by the Community Action Project of Tulsa in Oklahoma, or CAP Tulsa for short. In September 2016, 135 preschoolers—fully 20 percent of the program’s Head Start population—never appeared at the start of the school year, even though their parents had enrolled them.
CAP Tulsa, as it has often done in the past, turned to data both to diagnose the problem and to devise a solution. And in doing so, it provided an example of how all of Head Start’s 1,600 grantees are now expected to infuse data into their decisionmaking and continuous-improvement processes.
CAP Tulsa offers care and educational services for newborns through preschoolers. But for 4-year-olds, there’s competition. Parents have the option of staying with Head Start or enrolling in preschools offered by the Tulsa school district or local charter schools.
Using Data to Spot Trends
To better predict the program’s enrollment, Cindy Decker, CAP Tulsa’s director of research and innovation, and her team built a statistical model. The model found some common elements among no-shows: They had an older sibling in elementary school, suggesting parents may want their younger child in a preschool at the same building for convenience; they were new to the program that year; or they were not receiving behavioral or disability supports—children with those needs tended to stick with CAP Tulsa, Decker said.
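The article does not describe CAP Tulsa’s model in detail, but the risk factors Decker’s team identified lend themselves to a simple flagging rule. The sketch below is purely illustrative: the scoring function, field names, and sample records are invented, and the real model was a statistical one, not this checklist.

```python
# A minimal, hypothetical sketch of flagging likely no-shows using the three
# factors named in the article. Not CAP Tulsa's actual model.

def no_show_risk(child):
    """Score a child 0-3: one point each for having an older sibling in
    elementary school, being new to the program, and not receiving
    behavioral or disability supports."""
    score = 0
    if child["older_sibling_in_elementary"]:
        score += 1
    if child["new_to_program"]:
        score += 1
    if not child["receives_supports"]:
        score += 1
    return score

# Invented enrollment records for illustration.
enrolled = [
    {"name": "A", "older_sibling_in_elementary": True,  "new_to_program": True,  "receives_supports": False},
    {"name": "B", "older_sibling_in_elementary": False, "new_to_program": False, "receives_supports": True},
    {"name": "C", "older_sibling_in_elementary": True,  "new_to_program": False, "receives_supports": False},
]

# Families scoring 2 or higher would get a summer follow-up call.
follow_up = [c["name"] for c in enrolled if no_show_risk(c) >= 2]
print(follow_up)  # ['A', 'C']
```

In practice a program would fit weights to historical attendance data rather than weighting each factor equally, but the workflow is the same: score enrolled families, then target outreach at the highest-risk ones.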
Armed with that information, staff members started asking parents over the summer about their plans, Decker said, paying particular attention to families who had factors more likely to make them no-shows. CAP Tulsa also connected with the district and with local charters to find out if the same children were popping up on their rolls.
A year later, the number of no-shows dropped from 135 to 99—still a lot, Decker said, but the decrease meant less churn in the first weeks of the school year.
“And we also heard that this helped with some challenging behaviors,” she added, because teachers were able to focus on instilling classroom routines rather than adjusting to new children enrolling well into October.
This is just one of many ways CAP Tulsa uses data to drive its program, Decker said. “Data helps us identify the problems that need to be fixed, and the successes we should celebrate,” she said.
Head Start programs have traditionally collected reams of information on themselves and their participants. But that information has often been collected to monitor compliance, not to drive program improvement or better child outcomes.
Grantees in the field wanted to improve their use of data, said Yasmina Vinci, the executive director of the National Head Start Association, an advocacy group representing the nation’s 1,600 Head Start grantees.
The association was among the groups that commissioned a 2016 report called “Moneyball for Head Start.” The paper drew its name from the analytical approach popularized by Billy Beane, the Oakland Athletics’ then-general manager and now a vice president with the team. Beane used statistical analysis, rather than relying solely on the intuition of baseball scouts, to put together competitive teams. Head Start programs should embrace data in the same way, and should be supported by the federal government in doing so, the paper stated.
From Compliance to Performance
Later that year, Head Start released a new set of performance standards, which had last been revised in 1975. Woven throughout the document are requirements for programs to use data in making decisions on issues such as budgeting, teacher coaching, and improving instruction.
“We’re really excited about it,” Vinci said. “The fact that quite a little bit of the energy and movement in this has come from the field, really makes it a powerful opportunity.”
The performance standards require a shift in mindset, and Head Start is providing technical support in a variety of ways, federal officials said. For example, they have focused technical assistance at the national, regional, and local levels on “practice-based coaching,” or using data to support teacher professional development.
In addition, Head Start has offered a “data boot camp” to more than 400 Head Start staff members and technical assistance providers, aimed at boosting their abilities to use data to plan and measure program impact.
Federal oversight has also zeroed in on how programs use the information they capture on students, families, and their own program efforts. For example, when monitoring review teams visit a grantee, they ask for a “data tour,” in which local officials show how they collect, analyze, use, and share information.
Many programs have already demonstrated that they’re effective at this work, federal officials said. Others still need more support, a process that federal officials said is “delicate and ongoing.”
The Riverside County, Calif., board of education is another example of a program that has embraced these requirements. The board provides Head Start services directly to children and oversees several subcontractors, known in Head Start as “delegate agencies.” In total, Riverside County serves about 3,500 children in Early Head Start and Head Start.
“When data first came on the scene with Head Start, it was something that everyone shied away from or was a little afraid of—what did they mean by this?” said Esmirna Valencia, the executive director of Riverside County’s early-childhood programs. “We knew at the time that we needed to introduce data in a way that made sense to the staff.”
Program managers started talking about how they already used data in their everyday work, without necessarily using the term “data-driven decisionmaking.”
Program leaders also hired staffers who were able to look under the hood of the data-management systems already in use, to see if they could tweak them for Riverside’s own purposes.
ChildPlus, a data system used by many Head Start programs, captures dozens of data points on children and families, said Fernando Enriquez, a coordinator with the Riverside County Head Start program. ChildPlus also allows users to generate basic reports, but the creators allowed Riverside access to the guts of the database, so it could produce its own reports.
Riverside linked the database to a visualization program called Tableau. “Now, it’s only limited by your ability to make analytics,” he said. For example, Riverside now maintains a “dynamic dashboard” of enrollment information. Managers can see at a glance which programs are full, which ones need more children to fill open spots, and how many potential students still need to have their eligibility confirmed.
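The article doesn’t show what sits behind Riverside’s dashboard, but the rollup it describes—slots filled, open spots, and pending eligibility checks per site—is a straightforward aggregation. The sketch below is hypothetical: site names, capacities, and applicant records are invented, and Riverside’s actual pipeline runs through ChildPlus and Tableau rather than plain Python.

```python
# A minimal, invented sketch of the enrollment rollup a dashboard like
# Riverside's might display: filled slots, openings, and pending eligibility.
from collections import Counter

# site -> funded slots (invented numbers)
capacity = {"Site 1": 40, "Site 2": 20}

# one record per applicant: (site, status)
records = (
    [("Site 1", "enrolled")] * 40
    + [("Site 2", "enrolled")] * 15
    + [("Site 2", "eligibility_pending")] * 3
)

enrolled = Counter(site for site, status in records if status == "enrolled")
pending = Counter(site for site, status in records if status == "eligibility_pending")

for site, slots in capacity.items():
    open_spots = slots - enrolled[site]
    flag = "FULL" if open_spots == 0 else f"{open_spots} open"
    print(f"{site}: {enrolled[site]}/{slots} ({flag}), {pending[site]} pending eligibility")
```

A visualization tool like Tableau essentially performs this grouping live against the database, which is what lets managers see at a glance which sites are full and which still have openings.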
Targeting Teacher Improvement
Another Head Start grantee, Guilford Child Development Center in North Carolina, uses data to drive teacher improvement. Guilford serves around 1,200 infants, toddlers, and preschoolers.
Federal officials use a tool called CLASS—the Classroom Assessment Scoring System—as an important part of their evaluation of Head Start programs. Programs that fall below a certain level on CLASS data and other metrics are required to recompete for federal funding.
Guilford has its own trained CLASS assessors on staff, who observe classrooms on a regular schedule. Federal officials don’t require grantees to conduct their own CLASS assessments—but seeing how Guilford compares with other programs in the state and nationally is essential for focusing professional development on the most important areas, said Robin Sink, an educational coach specialist for the program.
But Sink noted that as a coach, ease with analyzing numbers cannot replace developing a connection with the teachers she works with.
“I need to meet them and establish a base of trust,” Sink said. “The building of a relationship is more complicated than sharing the data.”
The use of data for continuous improvement is not limited to Head Start managers. Teachers are also using assessments of their students to make day-to-day decisions about how to best support children.
In Riverside, for example, Head Start teachers have been provided up-to-date access to data on their children, through a program called Learning Genie. Teachers plug in observations and assessments, and the program creates interactive reports for educators and for parents.
Boris Sanchez, a Riverside Head Start teacher, said she checks the program daily to monitor her pupils’ progress. It guides which children she might work with individually, which ones she puts together for small-group activities, and how she will focus her lesson plans.
For example, if her charges are interested in learning about butterflies but are also showing they need support learning their letters, “I’m going to merge the letters with the lesson. We merge the technical stuff with the fun stuff.”
Sanchez said the data efforts at continuous improvement fit with the work she has been accustomed to.
“We all had our checklists. I’m not afraid of data, because we were always doing it,” she said.
Coverage of continuous-improvement strategies in education is supported in part by a grant from the Bill & Melinda Gates Foundation at www.gatesfoundation.org. Education Week retains sole editorial control over the content of this coverage.
A version of this article appeared in the June 20, 2018 edition of Education Week as Head Start Programs Turn to Data for Problem-Solving