'Data Mining' Gains Traction in Education
The new and rapidly growing field of educational data mining sifts the chaff of data collected through normal school activities to explore learning in more detail than ever before, and researchers say the day when educators can get Amazon-style feedback on student learning behaviors may be closer than most people think.
Educational data mining uses some of the typical data included in state longitudinal databases, such as test scores and attendance, but researchers often spend more time analyzing the detritus cast off during normal classroom data-collection practices, such as student interactions in a chat log or the length of responses to homework assignments—information that researchers call “data exhaust.”
Analysis of massive databases isn’t new to fields like finance and physics, but it has started to gain traction in education only recently, with the first international conference on the subject held in 2008 and the first academic journal launched a year later. Experts say such data mining allows faster and more fine-grained answers to education questions and ultimately might change the way students are tested and taught.
“Data resources you wouldn’t necessarily think would be useful can turn out to be very powerful for making inferences,” said Ryan S. J. d. Baker, an assistant professor of psychology and learning sciences at Worcester Polytechnic Institute in Massachusetts. For example, research at Pittsburgh-based Carnegie Mellon University found that small changes in the length of time a student took to answer individual test questions signaled that the student was struggling, cheating, or had given up and was filling in answers randomly.
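The Carnegie Mellon work described above rests on a simple idea: a student's response times on individual items carry signal. A minimal sketch of that idea, assuming hypothetical cutoffs and field layouts (the actual research used far more sophisticated statistical models), might flag items answered implausibly fast or unusually slowly relative to the student's own pace:

```python
# Hypothetical sketch of response-time flagging; cutoffs and labels are
# illustrative assumptions, not the researchers' actual model.
import statistics

def flag_response_times(times_sec, fast_cutoff=2.0, z_cutoff=2.0):
    """Flag items answered suspiciously fast (possible random guessing)
    or unusually slowly relative to the student's own pace (possible struggle)."""
    mean = statistics.mean(times_sec)
    stdev = statistics.pstdev(times_sec) or 1.0
    flags = []
    for i, t in enumerate(times_sec):
        if t < fast_cutoff:
            flags.append((i, "possible guessing"))
        elif (t - mean) / stdev > z_cutoff:
            flags.append((i, "possible struggle"))
    return flags

# A run of ~30-second answers, one 1-second answer, one 150-second answer:
print(flag_response_times([30, 28, 1.0, 31, 29, 150]))
# → [(2, 'possible guessing'), (5, 'possible struggle')]
```

The point is not the thresholds themselves but that per-item timing, which conventional score reports discard, becomes usable evidence once it is logged at scale.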
“I can easily imagine just a little bit of classroom observation data could do a lot to contextualize the other information about student achievement” in state accountability databases, Mr. Baker said.
Expanding Data Universe
In centers like the Pittsburgh Science of Learning Center’s DataShop, researchers can use advanced computers to analyze 238 data sets of online and classroom data, comprising 49 million individual student actions.
“You might be collecting thousands of data points for a single student—in some areas virtually millions—whereas the traditional qualitative methods in education psychology might have dozens or even a hundred measures,” said Arthur C. Graesser, a psychology professor at the University of Memphis and editor of the Journal of Educational Psychology.
“That changes the quantitative methods enormously,” Mr. Graesser said. “Not only can you look at unique learning trajectories of individuals, but the sophistication of the models of learning goes up enormously.”
Such data hasn’t been studied in this depth before because significant patterns emerge only when researchers can study a huge number of data points. For example, Mr. Baker studied a topic that has frustrated teachers for generations: students who try to get through a task without actually learning the material.
“Students spend on average 3 percent of the time gaming the system, maybe 15 [percent of students] will do it at least once,” Mr. Baker said. With only a few dozen students, it’s almost impossible to tell exactly when and how it happens, he explained, “but when you have data from thousands of students, you can.”
Studying hundreds of thousands of data points on students working through an online tutoring program, Mr. Baker created a model to allow the program to recognize when a student was attempting to complete a task without mastering the material, and then present the missed material again in a new way.
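Mr. Baker's published detectors are machine-learned from those hundreds of thousands of logged actions; the rule-based sketch below, with hypothetical field names and thresholds, only conveys the flavor of what such a model looks for — rapid-fire help requests or wrong answers, the classic signature of clicking through without learning:

```python
# Illustrative, assumption-laden sketch of "gaming the system" detection.
# Each logged action is assumed to look like {"type": ..., "gap_sec": ...},
# where gap_sec is the seconds since the student's previous action.

def is_gaming(actions, max_gap_sec=3.0, run_length=3):
    """Return True if the log contains a run of very fast hint requests
    or wrong answers, suggesting the student is not trying to learn."""
    run = 0
    for a in actions:
        fast = a["gap_sec"] < max_gap_sec
        suspect = a["type"] in ("hint", "wrong")
        run = run + 1 if (fast and suspect) else 0
        if run >= run_length:
            return True
    return False

# Three hint clicks a second apart trip the detector:
print(is_gaming([{"type": "hint", "gap_sec": 1.0}] * 3))  # → True
```

In a real tutor, a positive detection would trigger the remediation Mr. Baker describes: re-presenting the missed material in a new way rather than letting the student advance.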
Research that draws on educational data mining may also compress the lag time between undertaking a study and getting usable results, addressing a common critique from educators.
“I think this is escalating the speed of research on many problems in education,” Mr. Graesser said. “In the past, somebody runs an efficacy study where they spend five years trying to study a sample that may include more than one classroom, and it takes a lot of time and a lot of money, whereas [an] EDM [educational data mining] study provides a far richer set of data on students in a matter of weeks or months. It’s a whole different style.”
For practicing educators, the question educational data mining raises is: Does this mean researchers could create tools for teachers that collect information in the same way that Amazon.com, the online retailer, collects information on customers’ buying habits? Could systems be developed that can track whether a student is excited about some topics but not others, struggling with decimals but not long division, and suggest interventions accordingly?
“Oh yeah, no problem! We have done that already,” said Greg Chung, the co-principal investigator of the Center for Advanced Technology and Schools at the University of California at Los Angeles. In the early 2000s, his team developed a program for the U.S. Marine Corps that predicted which Marines were likely to have trouble with different aspects of marksmanship, based on their understanding of trigger control, and then automatically assigned them study materials. By the end of a week on the program, the participating Marines had developed better marksmanship skills. Bror Saxberg, chief learning officer at Kaplan Inc., said at a Dec. 7 discussion at the Washington-based think tank Education Sector that his firm is piloting similar rapid-feedback systems.
In fact, Mr. Chung and other researchers said, the technology and research are advancing faster than practitioners can be taught to use them.
“Actually trying to do this in the classroom, it’s like, ugh,” Mr. Chung said. He recalled giving teachers electronic clickers that would allow every student in a class to answer a question—as opposed to only two or three in a classroom—and would allow the teacher to analyze their responses. But the sudden flurry of responses—and their range—quickly overwhelmed the teachers. “The teachers said, ‘Yeah, this is interesting, this is cool, and we learned a lot about our students, but what do you do in a class with so many different levels?’ ” Mr. Chung said. “They couldn’t address every kid.”
As data systems and the tools to analyze them become more ubiquitous, experts say more research will be needed into how much and what kind of data are most helpful to teachers trying to improve their classroom instruction. Mr. Baker envisions that, within a generation, preservice teachers will study data analysis as a matter of course, and researchers will develop easier-to-use tools to help them compare their own students’ behavior and performance with models based on hundreds of thousands of similar students.
Several states, including Louisiana and New York, are already experimenting with data tools that allow teachers and principals to track the daily attendance, behavior, and academic performance of each student.
In fact, a 2009 study by a team of researchers from Carnegie Mellon and Worcester Polytechnic found, in the process of creating an online tutoring program, that the program’s underlying data model for tracking student progress could predict students’ year-end academic performance better than scores on the state’s standardized test could.
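The 2009 study's model tracked student knowledge item by item over the year; techniques like Bayesian knowledge tracing are commonly used for this, and the study's actual method was far richer than any one-line fit. As a minimal illustration of the underlying claim — that a running measure of a student's work predicts the year-end outcome — here is a least-squares fit on hypothetical data relating a tutor's mastery estimate to an exam score:

```python
# Minimal sketch: fit a line from a running mastery estimate (0..1) to a
# year-end exam score. The data points below are hypothetical, and the
# study's real model was much richer than simple linear regression.

def fit_line(xs, ys):
    """Ordinary least-squares fit: returns intercept a and slope b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical (mastery fraction, exam score) pairs from four students:
a, b = fit_line([0.2, 0.4, 0.6, 0.8], [40, 55, 70, 85])
predict = lambda mastery: a + b * mastery
print(round(predict(0.5), 1))  # → 62.5
```

The practical appeal Mr. Stamper describes below is that such a prediction accumulates continuously from ordinary classwork, rather than depending on a single high-stakes test day.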
“If we could show that a student’s work over time was a better predictor of student success than these state exams that everyone complains about anyway, wouldn’t that help us get a lot farther along?” said John C. Stamper, a systems scientist in the Carnegie Mellon Human-Computer Interaction Institute and technical director of the DataShop.
Educational data mining is catching federal attention, too. The National Science Foundation this month opened a new $30 million grant program for studying cyberlearning that is intended in part to expand computer-based educational data mining projects, said Joan Ferrini-Mundy, the acting assistant director for NSF’s Directorate for Education and Human Resources. “It’s fascinating and potentially very productive,” she said.
Likewise, Aneesh P. Chopra, the nation’s first federal chief technology officer, argued at the EdSector panel that new types of data and analysis will allow researchers to use more than “static” standardized test scores to identify best practices.
“Having a debate about whether that single data point moves here or here or here sounds like a silly conversation in the face of millions of data points,” Mr. Chopra said. “We need to understand at far more granulated levels of performance what works and what doesn’t.”
Vol. 30, Issue 15