What’s the demographic profile of students who opted out of New York state tests? One researcher at the Brookings Institution has tried to answer that question.
Based on data he collected and analyzed, here’s the short answer: Wealthier districts tend to have higher opt-out rates, but don’t bet your life savings on that trend.
Matthew Chingos, the research director of the Brown Center on Education Policy at Brookings, used opt-out data compiled by United to Counter the Core, as well as U.S. Census data on student eligibility rates for free and reduced-price meals in districts, enrollment figures, and test scores from 2014. (When I wrote about opt-out earlier this year, I discussed United to Counter’s data, which is in part based on parent reporting—official opt-out figures from the state won’t be available until next month.)
Of the 648 Empire State districts Chingos studied that had complete data, he found that when districts' size was taken into account, the opt-out rate was 21 percent. And how did opt-out rates correlate with districts' income backgrounds and test scores? Here's the answer Chingos provided in chart form:
The chart shows that, in addition to the general trend of relatively affluent districts having higher opt-out rates, “districts with lower scores have higher opt-out rates.” And larger districts, he noted, tend to have the lowest opt-out rates.
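For readers curious what "taking districts' size into account" means arithmetically, here is a minimal sketch of an enrollment-weighted opt-out rate. The district figures below are made-up illustrative numbers, not Chingos' actual data:

```python
# Hypothetical districts: (enrollment, opt-out rate).
# Illustrative numbers only -- not from Chingos' analysis.
districts = [(50000, 0.05), (2000, 0.40), (3000, 0.35)]

# Unweighted average treats every district equally,
# so small districts count as much as large ones.
unweighted = sum(rate for _, rate in districts) / len(districts)

# Enrollment-weighted average reflects how many students
# each district actually enrolls.
total_students = sum(n for n, _ in districts)
weighted = sum(n * rate for n, rate in districts) / total_students

print(round(unweighted, 3))  # prints 0.267 -- small districts dominate
print(round(weighted, 3))    # prints 0.079 -- the large district dominates
```

The gap between the two numbers illustrates why a statewide rate computed with district size taken into account can differ sharply from a simple average of district rates, consistent with Chingos' note that larger districts tend to have the lowest opt-out rates.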
However, Chingos cautions against jumping to the conclusion that high opt-out rates are invariably associated with relatively affluent districts, writing in his report that, “There is a clear association, with more disadvantaged districts having lower opt-out rates, on average, but also a large amount of variation in the opt-out rate among districts with similar shares of students eligible for the subsidized lunch program.”
Here’s how the correlation looks between opt-out rates and districts’ percentage of students eligible for free and reduced-price meals, according to this analysis from Chingos:
“The correlation is not as strong as some people might have expected,” Chingos told me in an interview, noting that some of the data points fall pretty far above and below the trend line. He also cautioned that it can be problematic to make inferences about individual students from the kind of aggregate, district-level data he is working with.
Chingos stressed that the data regarding opt-outs is incomplete, and there’s likely some misreporting of opt-out figures by United to Counter the Core.
In addition, there are factors that the opt-out data don't necessarily capture, such as what Chingos called "cultural norms." What does he mean by that? In some cases, opt-out rates can hinge at least partially on who's opting out, not just in individual districts but in individual buildings. In many individual schools, for example, there might be a tipping point after which students feel increasing social pressure not to take the tests.
“When everyone’s opting out, it’s weird to be the kid who didn’t opt out,” Chingos told me.
And there’s the possibility, he said, that some districts might have encouraged some students to opt out to mask what would have otherwise been poor performance in the district. Conversely, we also don’t know whether districts where superintendents made extra efforts to ensure test participation had lower opt-out rates.
Now that you’ve digested all that, be sure to check out the great presentation on the opt-out movement from the Commentary team at Education Week.
A version of this news article first appeared in the State EdWatch blog.