When students are more engaged in school, they are more likely to perform well academically, right?
That idea should be pretty familiar to regular readers of this blog, which has covered reams of research and policy proposals centered around the assumption that schools should work to make students feel supported, safe, and excited about what they are learning.
At first glance, a finding in the 2015 Brown Center Report, released this week by the Brown Center on Education Policy at the Brookings Institution, seems to turn that idea on its head.
Comparing international scores on the PISA achievement test with students’ self-reported levels of engagement, Senior Fellow Tom Loveless discovered something that might make supporters of student engagement efforts a little seasick.
From the report:
National scores on PISA's index of intrinsic motivation to learn mathematics are compared to national PISA math scores. Surprisingly, the relationship is negative. Countries with highly motivated kids tend to score lower on the math test; conversely, higher-scoring nations tend to have less-motivated kids. The same is true for responses to the statements, 'I do mathematics because I enjoy it,' and 'I look forward to my mathematics lessons.' Countries with students who say that they enjoy math or look forward to their math lessons tend to score lower on the PISA math test compared to countries where students respond negatively to the statements. ...
Policymakers are interested in questions requiring analysis of aggregated data—at the national level, that means between-country data. When countries increase their students' intrinsic motivation to learn math, is there a concomitant increase in PISA math scores? Data from 2003 to 2012 are examined. Seventeen countries managed to increase student motivation, but their PISA math scores fell an average of 3.7 scale score points. Fourteen countries showed no change on the index of intrinsic motivation—and their PISA scores also evidenced little change. Eight countries witnessed a decline in intrinsic motivation. Inexplicably, their PISA math scores increased by an average of 10.3 scale score points. Motivation down, achievement up.
While American students are “about average in terms of engagement,” a handful of countries “noted for their superior ranking on PISA,” like Korea, Japan, Finland, Poland, and the Netherlands, scored below the U.S. on student engagement, the report says.
Whoa! Right? So does this mean that U.S. schools should abandon all of their newly adopted engagement efforts and, well, try to make students care a little less?
Not so fast, says Loveless. As our reporters emphasize frequently when discussing international test scores, correlation is not causation. And there are many possible explanations for the correlations Loveless identified.
For one, students in different cultures may interpret phrases like “enjoy” and “look forward to” math lessons differently, which makes comparisons between countries difficult. It’s not exactly an objective measure. And, within countries, “student-level associations of achievement and other components of engagement run in the anticipated direction—they are positive,” the report says. “But they are also modest in size, with correlation coefficients of 0.20 or less.” (Correlation coefficients range from -1 to 1; the closer a coefficient is to 1, the stronger the positive relationship.)
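That flip—positive correlations within countries but a negative correlation between country averages—is a classic aggregation effect, sometimes called the ecological fallacy. A minimal sketch with invented numbers (none drawn from the report) shows how both patterns can hold at once:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical countries: higher mean achievement pairs with lower
# mean motivation, so the between-country trend runs negative.
# All figures here are invented for illustration only.
countries = {
    "A": {"mean_motivation": 3.5, "mean_score": 450},
    "B": {"mean_motivation": 3.0, "mean_score": 500},
    "C": {"mean_motivation": 2.5, "mean_score": 550},
}

within_corrs = []
means = []
for params in countries.values():
    motivation = params["mean_motivation"] + rng.normal(0, 0.5, 500)
    # Within a country, more motivated students score modestly higher
    # (a weak positive slope plus plenty of noise).
    score = (params["mean_score"]
             + 20 * (motivation - params["mean_motivation"])
             + rng.normal(0, 60, 500))
    within_corrs.append(np.corrcoef(motivation, score)[0, 1])
    means.append((motivation.mean(), score.mean()))

# Correlation of country-level averages, the "aggregated" view.
between = np.corrcoef([m for m, _ in means], [s for _, s in means])[0, 1]
print("within-country correlations:", [round(c, 2) for c in within_corrs])
print("between-country correlation of means:", round(between, 2))
```

Run as written, each within-country correlation comes out small but positive (in the modest range the report describes), while the correlation of the national averages is strongly negative—no contradiction, just two different levels of analysis.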
“The evidence for the relationship between intrinsic motivation in mathematics and test scores is just not there,” Loveless said in an interview.
Engagement is also somewhat subjective. A kid who feels like a standout, super-motivated student in a high school without the highest standards may feel like a small fish in a big pond in a different, high-achieving school.
There’s also the possibility of reverse causality in the data, or “the possibility that high math achievement boosts intrinsic motivation to learn math, rather than, or even in addition to, high levels of motivation leading to greater learning,” the report says.
The report presents a snapshot of one year of data rather than an analysis of trends over time, and it would be unfair to draw definitive conclusions from that, Loveless writes.
Plus, there is plenty of other research that shows that within the U.S., students perform better if they are more engaged.
Perhaps the biggest lesson here is that data needs to be carefully scrutinized before it is used to support sweeping conclusions. And that applies to the data used to support student engagement efforts as much as it does to the data that could discredit them. That’s an idea we can all engage with.
Photo: Flickr Creative Commons
A version of this news article first appeared in the Rules for Engagement blog.