A new series of studies using both toddlers and a sophisticated learning robot suggests that body posture can aid memory when learning new things.
In a series of experiments published in the journal PLOS ONE, cognitive scientists from Indiana University Bloomington and the University of Wisconsin-Madison partnered with the Center for Robotics and Neural Systems at the University of Plymouth in the United Kingdom to track how people and machines learn to match a new word to an object—one of the earliest steps in language.
“We know that older children and sometimes even toddlers look like they can learn a word in a single bound ... but now it’s become clear that there’s much more to it,” said Linda B. Smith, a professor of psychological and brain sciences at Indiana and a co-author of the study. “Babies don’t hold onto a single word. It’s fragile. Those contexts often have to be constrained by the child’s own body.”
The studies could give new insights, not just into how young children develop language, but also into how to help students when that process goes wrong. Many learning disorders—from autism to attention deficits to Down syndrome—have associated sensory disruptions. Understanding the connection between problems with spatial orientation and learning could help educators develop new interventions for these students, she said.
“Could these sensory motor disruptions really be a factor in making those connections [in learning]? We’ve had similar findings in adults, but ... something that shows up only a little in adults might be a really big deal in little kids, because they move more and their learning is much more fragile,” she said. “These are potentially very exciting results.”
Bodies in Space
In one set of experiments, a robot was shown a target object on one side of a table, far enough from the middle that it had to turn and look down at it. It was then shown another object on the opposite side of the table, requiring it to position its eyes, head, and torso differently. After a few repetitions, the objects were taken away, and a researcher directed the robot’s attention so that it took the same posture it had held when the first object was presented, while saying a new word, “modi.” Later, the robot was presented with both objects in a different location and asked to find the “modi.”
When the researchers presented the two objects in a way that forced the robot to mimic its earlier postures, the robot was quicker to learn to associate a new word with the correct object. Moreover, when the incorrect object was placed in a way that made the robot position its body as it had for the target object, it chose the incorrect object more than half the time.
In the second half of the study, researchers replicated the experiments with 18-month-old children. The toddlers showed an even stronger connection than the robot did between body orientation and how easily they learned to link a word to an object. And like the robot, the toddlers were more likely to associate the incorrect object with the word when the locations of the two objects were switched.
“During that moment when you are trying to build a memory that will last, that will go beyond the moment, spatial consistency matters,” Smith told me. “That kind of model doesn’t mean you only know things in that body position; you can generalize across the body positions; holding objects, manipulating objects all help.”
The researchers are now broadening their study to look at whether body posture may be connected to previous studies showing that children develop words associated with specific places. “We’ve been thinking a lot about things like meal time versus bath time, because they both have their own positions and postures,” she said. “That is a great unanswered question.”
National Robotics Week starts April 4, and this study is among the first to show the potential such technology can hold for understanding more about teaching and learning. So far, most of the partnerships between education researchers and roboticists have gone the other way, with researchers applying findings on how children learn to better program machines to learn naturalistically.
The video below shows one of the experiments in action.
- “Give Math a Thumbs-Up! Gestures Boost Learning, Study Finds”
- “Studies Find Students Learn More By Acting Out Text”
Photo: A robot identifies a ball as part of a series of experiments on how body posture affects child and machine learning. Source: Linda Smith, Indiana University
A version of this news article first appeared in the Inside School Research blog.