
Google Glass Flopped. But Kids With Autism Are Using It to Recognize Emotions


Gabby Warner is a participant in Stanford's Google Glass Autism study. She hopes the project morphs into a commercial enterprise that will sell learning aids to help kids with autism.  (Lindsey Hoshaw/KQED)

Gabby Warner coveted a Google Glass headset when it first became available so she could search the web. But now, she's using it for a more profound purpose.

"It’s helped me to understand some people’s emotions," says Warner, who has autism. "I can tell when a friend is upset better now than I could before."

The 14-year-old attends Westmont High School in Campbell, California, and is one of the first study participants in Stanford's Autism Glass Project. Currently, children ages 6 to 17 are participating, and researchers are looking for more volunteers for the next phase.

The goal is to validate the technology as a learning aid that helps students recognize human emotions.

Some children with autism struggle to understand social interactions, make eye contact or recognize facial expressions. So the Stanford researchers developed facial-recognition software specifically for Glass. The software acts as a coach, helping the kids search for and correctly identify emotions expressed on people's faces.


The technology could impact millions of children. In 2014, one in 68 children was diagnosed with autism, according to a report from the Centers for Disease Control and Prevention. That's up about 30 percent from the previous estimate in 2012.

Early results of the Stanford study have shown benefits.

Study manager Beth McCarthy shows study participant Gabby Warner new features on the smartphone app that connects to Google Glass. (Lindsey Hoshaw/KQED)

"We are really seeing improvements in social acuity," says Stanford University associate professor of pediatrics Dennis Wall. Parents, he says, are also spending more focused therapeutic time with their kids.

Wall is leading the project with entrepreneur and Stanford University computer science undergraduate Catalin Voss at the Wall Lab, housed within Stanford's School of Medicine. The team started phase I clinical trials in February 2015 by having students wear Google Glass and an eye tracker while they looked at a monitor.

During phase I, the screen showed an image and then a list of seven emotions from which the kids could choose. The first round of images was used for a baseline test, where the kids tried to identify the emotions on their own; in the second round, Glass provided them with audio and visual cues, such as emoticons or playful sound effects, which were linked to different emotions.

In the third round, the students again identified the emotions on their own. Researchers wanted to see if they were any better at it having used Glass.

Wall said the participants could better differentiate the emotions they saw and were less likely to confuse emotions like surprise and fear. They also looked more frequently at people's faces and made more eye contact. (The researchers couldn't provide KQED statistics on the study because a published scientific paper is forthcoming.)

"They were generally more socially confident," says Wall, "and a side benefit is that they feel less isolated."

Taking Glass Home

But a lab setting is nothing like the real world. So the study is now sending children home with Glass to measure how they interact with their families and record changes in their social skills.

Over the course of four months, the kids and their families will participate in the therapy. During three 20-minute sessions per day, anything a child sees while wearing Glass is recorded and saved onto a smartphone app developed by the lab.

Kids and parents can then review the footage together, and the parents can point out the emotions they were feeling at specific moments. Color-coded bars at the bottom of the screen correspond to those feelings: a red bar means someone is upset, for example, while a yellow bar indicates they are happy. In the photo below, the blue dot is centered over a yellow bar.
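The emotion-to-color scheme described above could be sketched roughly like this. Only the two pairings the article mentions (red for upset, yellow for happy) come from the source; the function name and the fallback color are illustrative assumptions, not the lab's actual app code:

```python
# Illustrative sketch of the color-coded emotion bars described above.
# Only "upset" -> red and "happy" -> yellow are from the article; the
# fallback color for unlabeled moments is a hypothetical placeholder.
EMOTION_COLORS = {
    "upset": "red",
    "happy": "yellow",
}

def bar_color(emotion: str) -> str:
    """Return the color of the review-screen bar for a labeled emotion."""
    return EMOTION_COLORS.get(emotion, "gray")  # gray = no label recorded

print(bar_color("happy"))  # yellow
print(bar_color("upset"))  # red
```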

The smartphone app that kids use to review human interaction color codes different facial expressions. (Autism Glass Project)

These color-coded videos help kids remember what emotions they saw and in what context.

"As we do this face tracking, we start to develop some very quantifiable metrics that weren’t available in autism before," Voss says. "Now we actually know how much you are looking at your mom." Potentially, he says, the lab could send a weekly eye contact report to families who want to know if there's been improvement.
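A metric like "how much you are looking at your mom" could, in principle, be computed from per-frame gaze data. The sketch below is a hypothetical illustration of that idea, assuming the face tracker flags each frame as gaze-on-face or not; the function name and sampling scheme are this sketch's assumptions, not the lab's implementation:

```python
# Hypothetical sketch of a per-session eye-contact metric: the share of
# recorded frames in which the child's gaze landed on a face.
from typing import Sequence

def eye_contact_ratio(gaze_on_face: Sequence[bool]) -> float:
    """Fraction of frames where gaze was on a face (0.0 if no frames)."""
    if not gaze_on_face:
        return 0.0
    return sum(gaze_on_face) / len(gaze_on_face)

# e.g. five frames sampled from a recorded session
session = [True, True, False, True, False]
print(f"{eye_contact_ratio(session):.0%}")  # 60%
```

Per-session values like this could then be averaged into the kind of weekly eye-contact report Voss describes.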

Gabby Warner has been using Glass at home for almost two months, and it's helped her socialize at school.

"I've been applying what I saw with the Google Glass to situations without the Google Glass," Warner says. "I would see my friend’s face and it would look similar to one of the faces I saw on my parents when they were upset so then I could ask my friend, 'What happened?'”

The end goal is to make a product that is commercially available and medically reimbursable through health insurance. Researchers say they hope to have a new product on the market within two years.

"There are so many kids we have to reach," Wall says, "not just kids in Silicon Valley but globally."
