Students in Connections Academy’s full-time online charter schools are highly mobile and often enroll after the school year starts.
But nearly half reported choosing a cyber charter because they were looking for greater flexibility or were generally dissatisfied with their local school—far more than those who said they were trying to solve a specific problem such as academic struggles, physical or mental health issues, or bullying.
Those are just some of the findings in a series of studies recently released by Pearson, the global publishing and education giant that serves as a parent company to Connections Academy.
“The research underscores the importance of mobility in understanding online school students” and “reveals insights about what drives student mobility,” said Matthew Wicks, the vice president of efficacy research and reporting for Pearson Online & Blended Learning, in an emailed response to questions.
“With this knowledge, we can further improve the Connections Academy online school program to best serve student needs,” Wicks said.
All told, Connections Academy schools served more than 70,000 students across 27 states during the past school year, making it the second-largest operator of full-time online schools in the country, behind K12 Inc.
As part of its new research, Pearson also challenged a series of recent studies by third-party groups, which have consistently found that students in cyber charters tend to perform significantly worse academically than their counterparts in brick-and-mortar schools.
Like other cyber operators, Connections maintains that such studies have not adequately accounted for what it says are high rates of student mobility in virtual schools.
Using its own methodology, which sought to account for students who bounced from school to school before enrolling in a Connections Academy, Pearson found that Connections schools actually performed on par with comparable brick-and-mortar schools, and significantly better than other virtual schools in reading.
Outside research experts questioned Pearson’s approach, however, saying the company’s inability to compare the performance of individual students undercut the strength of its argument that Connections Academy virtual schools perform better academically than they are given credit for.
“You can’t make these claims of effectiveness with school-level data. Period,” said Ruth Curran Neild, the former director of the federal Institute of Education Sciences and current director of the Philadelphia Education Research Consortium.
Still, the field would benefit from a more robust way of comparing students from different types of schools while better accounting for student mobility, Neild said—an agenda that the current school-choice-friendly U.S. Department of Education may want to consider.
Who chooses Connections Academy cyber charters?
Pearson officials said they undertook the studies as part of the company’s “larger and overarching commitment to efficacy research and reporting.”
The work was independently reviewed by the research group SRI International and audited by the consulting firm PwC.
One part of the work sought to better understand the students who enroll at Connections Academy virtual schools—a key question, given ongoing debates about whether full-time online schools serve a population that is similar enough to brick-and-mortar schools to allow for apples-to-apples comparisons of student performance.
For the study, Pearson analyzed the achievement scores, attendance and enrollment patterns, demographic characteristics, and stated reasons for choosing to attend a Connections school for 77,541 students during the 2015-16 school year. Using a technique called a cluster analysis, the company created seven distinct profiles of Connections Academy students:
- Academically advanced students (8 percent of the overall Connections Academy population)
- Academically struggling students (11 percent)
- Students experiencing physical or mental health problems (11 percent)
- Newly enrolled students who had previously experienced bullying (13 percent)
- Students who had originally enrolled at Connections Academy with challenges such as those listed above, and were now returning to that online school for a second (or subsequent) year (11 percent)
- Returning students who didn’t report experiencing such problems at traditional schools, but instead enrolled at Connections Academy because they were seeking more flexibility and choice (16 percent)
- New students who were just seeking more flexibility and choice (31 percent)
The last two profiles accounted for 47 percent of Connections students, the study found. More research needs to be done to better understand the experiences and motivations of these groups, the researchers said.
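The grouping technique the study relied on, cluster analysis, can be sketched in a few lines. The example below is purely illustrative, not Pearson's actual model: it uses a minimal k-means implementation and made-up student features (prior test score and weeks enrolled late) to show how students with similar profiles get grouped together.

```python
import random
import math

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: group feature vectors into k clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # Recompute each center as the mean of its assigned points.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(x) / len(cl) for x in zip(*cl))
    return centers, clusters

# Hypothetical student features: (prior test score, weeks enrolled late).
students = [(0.9, 0), (0.85, 1), (0.3, 6), (0.35, 7), (0.5, 0), (0.55, 1)]
centers, clusters = kmeans(students, k=2)
```

A production analysis like Pearson's would use many more features (achievement, attendance, demographics, stated reasons for enrolling) and a more careful choice of the number of clusters, but the grouping logic is the same.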
Also noteworthy were the high numbers of new students who enrolled at a Connections Academy after the school year started.
More than half of new students who chose a Connections online school because of physical or mental health problems started late, and nearly two-thirds of new students who experienced bullying or academic struggles at their previous schools started late.
By comparison, Pearson found that more than 90 percent of returning students—including those who originally chose Connections Academy because of previous challenges—started the school year on time.
Connections officials said the information was helpful to both teachers and administrators, and would be used to improve the way Connections schools approach “onboarding” new students.
“Data analysis we have done previously shows that late-enrolling students tend to have lower academic performance than students enrolling on time,” Wicks said. “This can indicate other problems in the life of the student/family that could impact student learning.”
Disagreements about online student performance
While the full-time online student profiles and data released by Pearson shine new light on populations that have often been hard to track, the company’s findings regarding academic achievement are more contentious.
A number of previous studies by independent researchers have slammed cyber charters in general for poor performance. Most notably, a 2015 report from the Center for Research on Education Outcomes at Stanford University found that cyber charters in general have an “overwhelming negative impact” on students’ academic growth.
In that study, researchers matched individual students attending 158 cyber charter schools in 17 states and the District of Columbia with “virtual twins,” who were similar in terms of grade level, demographics, poverty, special-education status, and prior performance on state tests. The virtual twins attended the brick-and-mortar school where their peers most likely would have landed had they not chosen to attend a cyber charter.
In its comparative analysis of that student-level data, CREDO found that in a given year, online charter students, on average, achieved the equivalent of 180 fewer days of learning in math and 72 fewer days of learning in reading than similar students in brick-and-mortar schools.
More than two-thirds of cyber charters had weaker overall academic growth than similar brick-and-mortar schools, CREDO found.
Cyber charter operators, including Connections and K12 Inc., have consistently said CREDO’s methodology was limited because it didn’t adequately account for student mobility.
“The negative impact of student mobility on academic performance has been well documented,” Wicks said. “We believe taking mobility into account is required to make the most fair comparison.”
To that end, Pearson attempted to account for student mobility by using each state’s mobility metric. Those metrics vary considerably from one place to the next and are available only at the district level. The company then used those mobility rates as a primary indicator for how it matched Connections Academy schools with counterparts.
Unlike CREDO, however, Pearson was unable to create matches at the student level. Instead, it matched Connections Academy schools with traditional brick-and-mortar schools in the same state by grade level (3-8) and subject area (math and reading). In effect, that meant 4th grade math students at a given Connections Academy school were compared to the most similar class of 4th grade math students that researchers could find within a brick-and-mortar school in the same state.
The two most significant factors in making those matches were mobility rate and prior academic performance.
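The school-level matching described above can be illustrated with a toy nearest-neighbor match on the two factors the study named. Everything in this sketch is hypothetical (school names, mobility rates, and scores are invented); it is not Pearson's actual matching procedure, just the basic idea of pairing a virtual school with the brick-and-mortar school whose profile is closest.

```python
# Toy sketch of school-level matching on mobility rate and prior
# academic performance. All school names and figures are hypothetical.

def match_school(target, candidates):
    """Return the candidate school whose (mobility, prior score)
    profile is closest to the target's, by Euclidean distance."""
    def dist(s):
        return ((s["mobility"] - target["mobility"]) ** 2
                + (s["prior_score"] - target["prior_score"]) ** 2) ** 0.5
    return min(candidates, key=dist)

virtual = {"name": "Virtual A", "mobility": 0.40, "prior_score": 0.52}
brick_and_mortar = [
    {"name": "School X", "mobility": 0.10, "prior_score": 0.60},
    {"name": "School Y", "mobility": 0.38, "prior_score": 0.50},
    {"name": "School Z", "mobility": 0.45, "prior_score": 0.30},
]
best = match_school(virtual, brick_and_mortar)  # School Y is closest
```

The core limitation the critics raise is visible even here: the match compares school averages, so it cannot tell whether the same students are being tracked from one year to the next.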
Using this methodology, Pearson found that Connections Academy full-time online schools performed statistically the same in reading and math as brick-and-mortar schools. Connections schools also outperformed other virtual schools in reading.
“This study provides evidence that students from Connections Academy schools can perform at the same level as students from traditional schools that serve similar student populations,” Wicks said. “The results support a more complex conversation about mobility and virtual schooling.”
More work to be done
Outside researchers had much more measured reactions, however.
In an interview, Neild of the Philadelphia Education Research Consortium, emphasized that Pearson’s inability to compare the performance of individual students is a major limitation of the study.
“We don’t know if these are the same students from one year to another,” Neild said. “The fact is, [Pearson doesn’t] have the evidence to parse out whether the differences they’re seeing are the result of an effective instructional program, or changes in the student population.”
In a statement provided to Education Week, CREDO director Macke Raymond said the organization “appreciates the effort [by Pearson] to delve deeper into school effects among cyber schools” and noted that the new study “opens a new frontier of investigation” by including student-mobility measures.
But those measures are “far more exploratory than confirmatory,” said Raymond, who also pointed out the superiority of CREDO’s student-level analysis.
“We are confident that the prior CREDO study meets standards of rigor and accuracy, and stand by our results,” she said.
For its part, Pearson acknowledged that its study “cannot support causal conclusions” and that a “more rigorous research design would have matched groups at the individual student level, rather than the school or district level.”
A version of this news article first appeared in the Digital Education blog.