When Computers Can Read You Like a Book
Emotionally sensitive computers aren't anything new at UC San Diego's Early Childhood Education Center. For the past 10 years, toddlers here have been playing with a robot named Rubi.
Rubi looks a bit like a desktop Teletubby. She has a computer screen for a face, and she likes to ask questions like “Where is the donut?” When kids tap the donut image correctly on her iPad belly, she exclaims, “Excellent!”
But it's what's hidden inside Rubi that makes her really useful. Rubi contains three tiny cameras that survey the classroom. The cameras identify the children's faces, and by reading split-second facial movements called micro-expressions, Rubi perceives when they're feeling joyful, or when they're angry, sad or frustrated.
Instructors could one day use this emotional data to reprogram playtime or to spot behavioral problems early on. But this technology, called expression recognition, has already graduated from preschool and entered the marketplace.
A group of researchers in this field recently founded a San Diego company called Emotient to commercialize expression recognition. They've already found a foothold in advertising.
"We put a camera up on the wall. And we were able to measure their facial responses to the Super Bowl ads,” Emotient lead scientist Marian Bartlett said about their focus group with a few dozen Super Bowl viewers.
“And (we) have aggregate information about how they responded to the ads, how they responded over time, and what was their emotional journey as they watched some of the different advertisements," she added.
Expressions of joy spiked during an ad for Cheerios. Less so among Denver fans, but still, people liked it.
I wanted to test out the Emotient system, so I stepped in front of the camera. On a big display screen, I could see my face isolated from the background and surrounded by a little gray box. The sight made me chuckle, after which the box turned blue.
A blue box means joy.
I could see why advertisers would want this information. When people fill out surveys, they don't always tell the whole truth. And while people might try to lie with their faces, a computer will probably see through it. In a study published earlier this month, Marian Bartlett and her colleagues showed that computers can tell when humans are faking pain better than human observers can.
Having a window into people's unfiltered reactions could help advertisers craft commercials for maximum emotional impact. Emotient even lets them target ads to specific demographics.
"We automatically detect gender so we can automatically split the results," Bartlett said.
Other Emotient researchers are piloting different projects like an emotion-reading app for Google Glass, the computer you wear like a pair of shades.
"OK Glass, show emotions," said researcher Josh Susskind, who let me wear the Emotient-enabled Glass.
I looked at him, and at the edge of my vision, I could see the box around his face changing colors along with his expressions. Susskind says these color-coded cues might help people with autism who have a hard time reading emotions.
He said they might also be handy for people who are irritating others by wearing Google Glass in a social setting. Angry colors could indicate people think you’re being a glasshole and should take them off.
But the big money for Emotient is in retail. For example, Bartlett says department stores could use cameras to pinpoint exactly when and where their shoppers feel most frazzled.
"We could get information about how a particular employee is doing and whether that employee needs feedback or training on customer service," she said.
But gauging customers' emotions like this sounds a bit creepy to Beth Givens, director of the Privacy Rights Clearinghouse.
"People don't like being manipulated,” Givens said. “And they certainly don't like, you know, their minds being read. This gets very close to mindreading. And frankly I think if consumers knew this was being used, say, in a retail environment, they would be quite unhappy."
Emotient says it doesn't store videos, or even still images, of the faces it reads. It delivers only anonymous emotional data.
The company takes such precautions because, as even its own researchers admit, if expression recognition were synced up with personally identifying information, the ability to profile and target individuals based on how they're feeling could be abused.
Feeling uneasy about all that? Emotient could probably tell if you are.