Intelligent Machines

Digital Summit: First Emotion-Reading Apps for Kids with Autism

Software meant to help people interpret emotions will soon be available in several apps.

Jun 9, 2014

The first mobile apps that use emotion-reading software to help kids with autism are nearing release, a startup reported today at the MIT Technology Review Digital Summit in San Francisco.

Emo-app: Rana el Kaliouby says the first apps that use facial emotion-reading software are coming soon.

One of the apps is a game that challenges kids to match a face to the emotion it is projecting, said Rana el Kaliouby (see “Innovators Under 35: Rana el Kaliouby”), chief science officer of Affectiva, which is based in Waltham, Massachusetts. Another will let people submit face pictures, such as “selfies,” and get a readout on the mood of the person in the photo; this could be used for social sharing of people’s moods in different locations. A third, intended for anyone, would let people make music with their facial expressions: raising or lowering the eyebrows makes a tone rise or fall, and smiling or frowning makes the music sound happy or sad.
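The expression-to-music idea amounts to a simple mapping from facial measurements to musical parameters. The Swift sketch below is purely illustrative: the ExpressionFrame and MusicalCue types, the metric names, and the numeric ranges are assumptions for this article, not Affectiva’s actual Affdex API.

    // Illustrative only: these types and metric names are assumptions,
    // not Affectiva's Affdex SDK.
    struct ExpressionFrame {
        let browRaise: Double   // 0 = neutral, 1 = fully raised
        let browFurrow: Double
        let smile: Double
        let frown: Double
    }

    struct MusicalCue {
        let pitchOffsetSemitones: Double  // how far to raise or lower the current tone
        let isMajorKey: Bool              // "happy" (major) vs. "sad" (minor) harmony
    }

    /// Map one frame of facial measurements to a musical cue:
    /// raised brows push the pitch up, furrowed brows pull it down,
    /// and smiling vs. frowning chooses a major or minor feel.
    func musicalCue(for frame: ExpressionFrame) -> MusicalCue {
        let pitch = (frame.browRaise - frame.browFurrow) * 12.0  // up to one octave either way
        return MusicalCue(pitchOffsetSemitones: pitch,
                          isMajorKey: frame.smile >= frame.frown)
    }

    // Example: raised brows and a clear smile push the tone up and keep a major key.
    let cue = musicalCue(for: ExpressionFrame(browRaise: 0.8, browFurrow: 0.0,
                                              smile: 0.9, frown: 0.1))
    // cue.pitchOffsetSemitones == 9.6, cue.isMajorKey == true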

Affectiva grew out of emotion-detecting research at MIT’s Media Lab. The company’s software, called Affdex, analyzes images of faces to detect features such as smiles, frowns, raised eyebrows, furrowed brows, and smirks. Though the early academic research focused on applications such as helping people with autism, so far the technology has been used commercially to help marketers understand whether ads are effective (see “Startup Gets Computers to Read Faces, Seeks Purpose Beyond Ads”).
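An app like the selfie mood readout would need to reduce those per-feature signals, such as smiles, frowns, raised or furrowed brows, and smirks, to a single label. The sketch below assumes hypothetical score fields and thresholds; the article does not describe Affdex’s real output format or classification logic.

    // Hypothetical scores in [0, 1]; only a sketch of the idea.
    enum Mood: String {
        case happy, sad, surprised, skeptical, neutral
    }

    struct FaceScores {
        let smile: Double
        let frown: Double
        let browRaise: Double
        let browFurrow: Double
        let smirk: Double
    }

    /// Report the strongest expression above a small threshold; otherwise "neutral".
    func moodReadout(from scores: FaceScores) -> Mood {
        let candidates: [(Mood, Double)] = [
            (.happy, scores.smile),
            (.sad, scores.frown),
            (.surprised, scores.browRaise),
            (.skeptical, max(scores.smirk, scores.browFurrow)),
        ]
        guard let best = candidates.max(by: { $0.1 < $1.1 }), best.1 > 0.3 else {
            return .neutral
        }
        return best.0
    }

    // Example: a clear smile dominates the other signals, so the readout is "happy".
    let mood = moodReadout(from: FaceScores(smile: 0.85, frown: 0.05,
                                            browRaise: 0.2, browFurrow: 0.0, smirk: 0.1))
    // mood == .happy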

Then, last year, the company released the software to app developers for iOS, the operating system used in iPhones and iPads. And now the first apps are coming, said el Kaliouby. “Autistic kids have trouble reading and understanding social and emotional cues,” she said. “Just as people with hearing problems benefit from a hearing aid, people with social and emotional problems can benefit from systems that help them understand emotions. We started out with research on autism, and we went out and did this commercial stuff.” But now, she says, others “can take it and apply it back to autism again.”

The advertising work helped make the software more accurate by “training” it, she added. After three years of analyzing faces seen on webcams, Affectiva’s database now holds more than a billion facial expressions.