Underlying much of the ed-tech world is a belief that collecting and mining massive amounts of data can help educators better understand who their students are, what they have learned, and what instruction and supports they need.
But does it work? Or is it mostly hype?
Three leaders in the field discussed those questions Tuesday at the annual conference of the American Educational Research Association, being held here. I was in the room live-tweeting; here’s the thread in case you missed it.
Good morning again from #AERA19...tweets to come shortly from a session titled “Can Learning Analytics Improve Learning?” feat. Ryan Baker of @PennGSE, Mary Ann Wolf of @FridayInstitute, and David Michael Niemi of @KaplanNews 1/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Update: Niemi is now “primarily helping @ChanZuckerberg Initiative” with his work on learning science/analytics 2/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Baker of @PennGSE: Online learning platforms not being used appropriately in schools -- “they’re designed for year-long use, but getting used for 2 weeks then abandoned.” 3/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Key purposes of learning analytics currently, per Baker: “Automated detection of learning, engagement, emotion & safety”; better reporting for admins/institutions; dropout/success prediction; basic discovery prediction. 4/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Most systems still use a relatively crude Bayesian approach to track mastery, Baker says. But there’s recent growth in new algorithms looking to model inquiry, conceptual understanding, computational thinking, and behavior. 5/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
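(A quick aside from me, not the panel: the “relatively crude Bayesian approach” Baker mentions generally refers to Bayesian Knowledge Tracing, which keeps a running probability that a student has mastered a skill and nudges it up or down after each answer. Here’s a minimal sketch of that update; the parameter values are made up for illustration, and real systems estimate them per skill from historical data.)

```python
# Minimal Bayesian Knowledge Tracing (BKT) sketch -- illustrative only.
# Parameter values below are invented for the example, not from any real system.

def bkt_update(p_mastery, correct,
               p_slip=0.1,      # P(wrong answer | skill mastered)
               p_guess=0.2,     # P(right answer | skill not mastered)
               p_transit=0.15): # P(student learns the skill on this practice step)
    """Return the updated probability that the student has mastered the skill."""
    if correct:
        evidence = p_mastery * (1 - p_slip) / (
            p_mastery * (1 - p_slip) + (1 - p_mastery) * p_guess)
    else:
        evidence = p_mastery * p_slip / (
            p_mastery * p_slip + (1 - p_mastery) * (1 - p_guess))
    # Fold in the chance the student learned the skill during this step.
    return evidence + (1 - evidence) * p_transit

# Example: a student starts at 0.3 mastery and answers right, right, wrong.
p = 0.3
for answer in (True, True, False):
    p = bkt_update(p, answer)
    print(round(p, 3))
```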
Eye-opening summary from Baker @PennGSE: Learning analytics related to engagement/affect in middle school can predict standardized test scores, HS course-taking, college attendance & major, even first job post-college. 6/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Researchers not just testing positive nudges (e.g., growth mindset messages). Also testing “shake-up” messages, in which software might tell a student: “Looks like you’re frustrated. That’s just God’s way of showing you you’re not smart enough to learn this.” 7/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Now up, David Michael Niemi. Learning analytics must be based on valid measurements of learning...but also provide understandable info to people who can act on it, plus guidance on which actions will be “most effective, for whom, when.” 8/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Tech people often don’t have sufficient background in education/learning measurement, says Niemi. “As we try to exploit all these new data sources, that doesn’t mean we should forget everything we know about learning.” 9/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Guide for modern learning analytics, per Niemi: Biggest payoffs may come from focusing on the short term, helping students get through day-to-day tasks, and from focusing on attributes that can change. 10/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Relative to standards mastery, things like students’ prior knowledge, belief that they can learn, emotional state, etc. are under-explored, says Niemi. Need to generate “multi-dimensional views” of students. 11/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Re: making learning predictions from demographic profiles: “Tricky thing is to figure out how to help individual students without labeling them in some negative way,” Niemi says. Not good: just giving teachers info that certain groups of students aren’t likely to perform well. 12/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
One of key insights from big-data analysis of 1,000 online courses: It’s valuable to just get students to do *something* meaningful, like logging in or opening up a first assignment. Don’t have to encourage students to do everything all at once. 13/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
(I should note here, also, that this #AERA19 session is jam-packed, with a rapt audience of current & aspiring researchers.) 14/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Batting third: Mary Ann Wolf of @FridayInstitute, sitting at the intersection of research, policy, and practice. Going to add some perspective on what learning analytics look like in the real world of K-12. 15/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Describing model N.C. elementary school with exemplary use of learning analytics data, Wolf says: “The first thing you notice is you can’t find the teacher...not because they’re not there, but because they’re so engaged with the kids they almost blend in.” 16/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
What capacity do schools/districts need to make effective use of learning analytics? A culture of data-informed decision-making, adequate tech infrastructure, human capital, and professional learning opportunities for staff, says Wolf. 17/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Fine line between “keeping students safe & cutting off opportunities to understand them better,” Wolf says. If public doesn’t understand learning analytics, may “make decisions out of fear.” 18/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Pondering: Is it a contradiction to be freaked out by comprehensive data mining on students’ affect/emotion/mindset while agreeing w/ Amy Stuart Wells #AERA19 presidential address about fundamental problems of standardized test scores as be-all indicator of student learning? 19/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Put another way: Are clickstream data, the “digital ocean” described by @KristenDiCerbo @pearson, signals of student cognition, discourse data, & all these other forms of learning analytics a potential way out of the constraints of standardized testing? 20/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Baker cautions there’s a long path from researchers looking at ways to algorithmically model student learning to commercial adoption. SXSWedu, he says, is full of vendors bragging about using learning algorithms from 1995. 21/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Baker of @PennGSE on consent re: embedded nudges/messages:
* Shake-up messages described above were tested in a lab study w/ IRB approval.
* Learning analytics different than commercial A/B testing, bc of “beneficent” motive (improve learning outcomes) instead of profit. 22/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
More from Baker:
“Having to get consent every time you change something in adaptive learning would be essentially impractical. We should have oversight, but don’t tamp it down to point where we make it impossible to improve education.” 23/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
Another tricky ethical question from Baker (referencing recent paper by others, sorry I missed the names): Should universities be required to screen students for risk & intervene based on learning analytics, bc it wld be unethical to ignore potential info that could help? 24/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019
A final dose of realism from Baker: Learning analytics probably is a fad, & the bubble will burst. But that doesn’t mean it isn’t potentially useful, or that it will just go away.
“If things go well, in 15 years, learning analytics will just be part of how we do education.” fin/
-- Benjamin Herold (@BenjaminBHerold) April 9, 2019