Ask students if they always try their hardest in class, and they might say “yes” because that’s what their mother told them to do. Or maybe if the students are a little older, they say “never,” because they are trying on a carefree attitude to impress their friends.
Those are just some of the problems that researchers encounter when they try to measure engagement by surveying students about themselves.
Enter educational data mining. Long used in business settings, the practice is newer to education. It involves using strings of data collected during computerized instruction to identify patterns. Once identified, the patterns can then be used to improve the instruction by, for instance, eliminating a scenario in which students tend to lose interest and stop playing an educational game. The strategy can be used to track when and why students might be off task and to draw broader conclusions about different aspects of instruction, including learner engagement.
For instance, for a study that appeared last year in the peer-reviewed Journal of Educational Data Mining, Jennifer Sabourin, then a doctoral student at North Carolina State University, and her colleagues analyzed data collected from a computer game that teaches middle school students about microbiology. Based on that data, the researchers were able to tell when students were playing the game as intended and when they seemed to be disengaged because they were indulging in off-task behavior like climbing trees to reach virtual rooftops. As expected, the researchers found that students who were frequently off task learned less.
But the game had another feature that led to a more unexpected conclusion: Every seven minutes, a box popped up onscreen asking students to classify their emotional states. That led researchers to discover that off-task behavior seemed to help some students collect themselves when they got frustrated. After playing around a bit, their frustration lessened and they returned to the game.
Tool for Program Evaluation
Besides its use in studying or assessing the engagement of individual students, data mining can also be used to evaluate entire programs. In a 2012 article published in the peer-reviewed journal Educational Technology & Society, a research team led by Jui-Long Hung, an associate professor of educational technology at Boise State University in Idaho, demonstrated how this might work. The team combined student learning logs, demographic data, and end-of-course evaluation surveys to assess a supplemental, online learning program for K-12 students. The data included information from 7,539 students taking 883 courses in the program. The researchers created an engagement index that assessed how students interacted with each course, based on measures including the frequency of logins and clicks and the average number of discussion-board entries. After combining these results with other information, such as students’ grades, the researchers found that more-engaged students got better grades. But in the entry-level courses, engaged and disengaged students alike had lower performance, leading the researchers to suggest that these courses might have “structure, design, and/or support issues.”
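To make the idea of an engagement index concrete, here is a minimal sketch of how one might be computed from activity logs. The field names, weights, and scaling are illustrative assumptions, not details from the published study:

```python
# Hypothetical engagement index combining per-week activity rates.
# Field names and weights are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class CourseActivity:
    logins: int            # number of logins to the course
    clicks: int            # total clicks within course pages
    discussion_posts: int  # discussion-board entries

def engagement_index(activity: CourseActivity, weeks: int) -> float:
    """Combine weekly activity rates into a single engagement score."""
    return (activity.logins / weeks
            + activity.clicks / weeks / 10   # scale clicks so one factor doesn't dominate
            + activity.discussion_posts / weeks)

# Example: one student's activity over a 10-week course
score = engagement_index(CourseActivity(logins=30, clicks=450, discussion_posts=12),
                         weeks=10)
print(round(score, 2))  # 3.0 + 4.5 + 1.2 = 8.7
```

In a real analysis, a score like this would then be correlated with outcomes such as grades, which is roughly what the researchers did at a much larger scale.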
In their report, Mr. Hung and his co-authors tout the benefits of combining data mining with other information to evaluate engagement. “The result is a much richer and deeper analysis of student performance and teaching, as well as of effective course design, than could ever be accomplished with survey data or behavior mining alone,” they write.
Coverage of school climate and student behavior and engagement is supported in part by grants from the Atlantic Philanthropies, the NoVo Foundation, the Raikes Foundation, and the California Endowment. Education Week retains sole editorial control over the content of this coverage.