Digital Tutor Nudges Students to Slow Down and Seek Help
Ask a friend to meet you at a restaurant in a city he or she has never visited, and it’s entirely possible the friend will get lost along the way. You might expect the friend to pull out a map, turn on the car’s navigation system, or even ask a gas station attendant for directions. You wouldn’t expect your lost friend to wander around town, knocking on random restaurant doors until eventually giving up and hopping in a taxi.
Yet research suggests this random searching and reluctance to seek basic help is exactly how high school students often approach problem-solving.
“Students often misuse help,” said Ido Roll, a postdoctoral researcher at the Carl Wieman Science Education Initiative at the University of British Columbia and a member of the Pittsburgh Science of Learning Center. “Either they don’t ask for help at all, or they ask for all the help there is.”
In a series of studies presented at the American Educational Research Association’s annual meeting, held here April 8-12, researchers from the Vancouver-based university and Carnegie Mellon University in Pittsburgh found that students typically will go to extreme lengths to avoid asking for help when working on computer-based tutoring programs. Yet if they learn to think about when and how to ask for help, they are more likely to avoid simply cheating to get answers.
In the classroom, it can be difficult to determine why a student does or doesn’t ask for help. Yet when students use an online program, the computer can record how fast and how often a student tries to solve a problem, uses a dictionary, or asks for help. In several studies since 2006, Mr. Roll and his research partners at Carnegie Mellon’s Human-Computer Interaction Institute—assistant professor Vincent Aleven, senior systems scientist Bruce M. McLaren, and professor Kenneth R. Koedinger—mined data from computer-based math-tutoring programs to gauge how high school students used a help button that offers progressively more in-depth hints and eventually gives the answer.
“If you get one error, 25 percent of students will ask for help; 75 percent of them will try again. And the pattern persists after any consecutive number of errors,” Mr. Roll said.
“Across hundreds of thousands of actions in multiple studies, we never get more than 30 percent of students who ask for help; they always try to do it again themselves,” he said. “When we asked students [why they didn’t ask for help], they said their parents told them ‘real men don’t ask for help.’”
‘Gaming the System’
By the time students finally did ask for help, the data showed they had given up trying to solve the problem and were aiming to cheat. Mr. Roll said 82 percent of students who used the hint tool did not stop to read it, but instead clicked through multiple hints to get to the answer.
In a 2008 study in the same series, Mr. Roll and other researchers found students often “gamed the system” by guessing and looking for the answer when they felt frustrated or disliked the subject.
Research on metacognition—the study of how students think about what they know or learn—suggests that encouraging students to reflect on their learning can lead them to use better strategies.
“Simply giving students the tools may not be enough if the extent to which the student can use a strategy depends on metacognitive skills,” said Katherine A. Rawson, an assistant professor of psychology at Kent State University, in Ohio. She studies the role of metacognition in study skills but was not part of Mr. Roll’s studies.

With that in mind, in Mr. Roll’s most recent experiments, the researchers changed an adaptive, computer-based geometry tutor so that the help tool would encourage students to reflect on their problem-solving strategies.
For example, if a student had answered incorrectly several times in quick succession, indicating he or she was guessing, a help window could pop up noting that the student seemed to be struggling and pointing out the glossary and hint buttons. Or if the student hit the hint button repeatedly without enough time to read the hints, suggesting he or she was just looking for the answer, a help window would encourage the student to slow down and think about the hint.
He tested the new system first with 58 10th and 11th graders in two urban classes with high concentrations of students from minority groups and two suburban, low-minority classes, and then in a second experiment with 67 students in a rural high school. In each trial, half the students used the updated tutoring program and half used the standard program.
In the first trial, students who received the prompts to solve problems more reflectively used hints inappropriately 10 percent less often than those who got no feedback. Among control-group students, 70 percent of help-button presses were attempts to “game the system” by clicking straight through to the final answer; among students using the revised tutor, only 48 percent were.
The second trial had similar findings: Students in the control group used the hint button to cut to the final answer 60 percent of the time, while students who used the revised tutor tried that approach only 45 percent of the time. The students using the adaptive help tutor also took more time to consider one hint before requesting another and solved the problem with lower-level hints than the control students did.
Vol. 30, Issue 28, Page 11