Researchers in Georgia have developed two new tools designed to better understand and detect autism, including a system that pairs gaze-tracking glasses worn by an adult with facial-analysis software to identify when a child makes eye contact with the person wearing the glasses.
That device, developed at Georgia Tech’s Center for Behavior Imaging, uses a commercially available pair of glasses that records the focal point of its wearer’s gaze. In a study at the school’s Child Study Lab, an adult wore the glasses while interacting with a child, and a front-facing camera on the glasses captured video of the child. That video was then processed with facial-recognition software. The result is a system able to detect eye contact in an interaction with a 22-month-old with 80 percent accuracy, the university said.
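For a rough sense of what that kind of automated, per-frame facial analysis might involve, here is a minimal Python sketch using OpenCV’s stock face and eye detectors. It is not Georgia Tech’s software: the video filename and the “both eyes visible means a roughly frontal gaze” heuristic are assumptions made purely for illustration.

```python
# Hypothetical sketch, not the Georgia Tech pipeline: flag frames from the
# glasses' forward-facing camera in which the child appears to be looking
# back toward the wearer, using OpenCV's bundled Haar cascades.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def looks_like_eye_contact(frame):
    """Crude proxy: a frontal face with two visible eyes suggests the child
    is facing (and plausibly looking at) the camera-wearer."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) >= 2:
            return True
    return False

cap = cv2.VideoCapture("session.mp4")  # hypothetical recording from the glasses
contact, total = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    total += 1
    if looks_like_eye_contact(frame):
        contact += 1
cap.release()
print(f"frames flagged as eye contact: {contact}/{total}")
```

A real system would presumably also fuse the glasses’ own gaze-tracking data, so that eye contact is counted only when the adult and the child are looking at each other at the same moment.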
Children at risk for autism often avoid making eye contact. An automated way to detect this and other distinctive behavioral markers could be a significant step toward scaling autism screening to much larger populations than those currently reached, Georgia Tech researchers said. Developing this tool and a second one are goals of a National Science Foundation Expeditions grant that Georgia Tech received in 2010.
“Eye gaze has been a tricky thing to measure in laboratory settings, and typically it’s very labor-intensive, involving hours and hours of looking at frames of video to pinpoint moments of eye contact,” said Jim Rehg, the director of the Center for Behavior Imaging and a professor in the School of Interactive Computing, in a press release. “The exciting thing about our method is that it can produce these measures automatically and could be used in the future to measure eye contact outside the laboratory setting. We call these results preliminary because they were obtained from a single subject, but all humans’ eyes work pretty much the same way, so we’re confident the successful results will be replicated with future subjects.”
The other new system is a package of sensors, strapped onto the wrists and ankles, that uses accelerometers to detect movement. Algorithms analyze the sensor data to detect behavior episodes and classify them as aggressive, self-injurious, or disruptive, such as throwing objects. (This system was developed in collaboration with the Marcus Autism Center in Atlanta and Newcastle University in the United Kingdom.)
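To give a flavor of how accelerometer data can be turned into behavior labels (again, a hypothetical sketch rather than the published algorithm), the Python below slices a three-axis accelerometer stream into short windows, computes simple statistics for each window, and trains an off-the-shelf classifier. The sampling rate, window length, features, labels, and synthetic data are all assumptions for illustration.

```python
# Hypothetical sketch: sliding-window features from a wrist/ankle
# accelerometer plus a generic classifier. Not the Georgia Tech/Marcus system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 50           # assumed sampling rate in Hz
WINDOW = 2 * FS   # 2-second analysis windows

def window_features(acc):
    """Per-axis mean/std plus magnitude statistics for one (WINDOW, 3) window."""
    mag = np.linalg.norm(acc, axis=1)
    return np.concatenate([acc.mean(axis=0), acc.std(axis=0),
                           [mag.max(), mag.var()]])

def extract(stream, step=FS):
    """Slide a fixed window over an (n_samples, 3) accelerometer stream."""
    return np.array([window_features(stream[i:i + WINDOW])
                     for i in range(0, len(stream) - WINDOW + 1, step)])

# Illustrative only: random "recordings" standing in for labeled sensor data.
rng = np.random.default_rng(0)
calm = extract(0.1 * rng.standard_normal((1000, 3)))
agitated = extract(2.0 * rng.standard_normal((1000, 3)))
X = np.vstack([calm, agitated])
y = np.array(["none"] * len(calm) + ["problem"] * len(agitated))

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(extract(1.5 * rng.standard_normal((200, 3)))[:3]))
```

In the actual research, the training windows would of course come from labeled recordings of real behavior episodes rather than random noise.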
Researchers developed the algorithms by putting the sensors on four Marcus staff members, who collectively performed 1,200 different instances of behavior. The system detected “problem” behaviors with 95 percent accuracy and classified all behaviors with 80 percent accuracy. When the sensors were tested on children diagnosed with autism, the system detected their behavior episodes with 81 percent accuracy and classified them with 70 percent accuracy.
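The two numbers measure different things: detection asks only whether a problem behavior occurred at all, while classification also has to name the category. A tiny, hypothetical example with made-up labels shows the distinction:

```python
# Illustrative only: detection accuracy collapses labels to problem-vs-none,
# classification accuracy requires the exact behavior category.
from sklearn.metrics import accuracy_score

true = ["aggression", "none", "self-injury", "disruption", "none", "aggression"]
pred = ["aggression", "none", "aggression",  "disruption", "none", "none"]

detect = lambda labels: ["problem" if l != "none" else "none" for l in labels]
print("detection accuracy:     ", accuracy_score(detect(true), detect(pred)))
print("classification accuracy:", accuracy_score(true, pred))
```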
“Our ultimate goal with this wearable sensing system is to be able to gather data on the child’s behavior beyond the clinic, in settings where the child spends most of their time, such as their home or school,” said Agata Rozga, a research scientist in the School of Interactive Computing and a co-investigator on the Expeditions award, in the release. “In this way, parents, teachers, and others who care for the child can be potentially alerted to times and situations when problem behaviors occur so that they can address them immediately.”
Georgia Tech told me that the Child Study Lab is looking for more children—with and without autism—to volunteer as subjects for this research.