I got a glimpse of the future of shopping at this week’s National Retail Federation conference, and it revolves around some very impressive and unsettling technology. There was technology that collects digital currency (otherwise known as coupons) if users agree to watch commercials instead of skipping them. New facial-recognition software claimed to determine gender with better than 90 percent accuracy and age range with about 70 percent accuracy. And, most interesting to me, there was a demonstration of facial-expression recognition software that can read some of our emotions. It was developed by a company called Emotient; Intel Capital, the venture capital division of Intel, is an investor.
I stood in front of a screen with a camera and made faces for a couple of minutes as the chief executive officer of Emotient, Ken Denman, watched. The software is supposed to be able to identify seven of our greatest—and mostly negative—emotions: sadness, anger, fear, disgust, and contempt, as well as surprise and joy. It’s based on the work of Paul Ekman, a psychologist who studied a remote tribe in Papua New Guinea in the 1960s and found that at least some facial expressions of emotion are universal. Ekman, who’s an adviser to Emotient, also discovered “microexpressions” that could be used to reliably detect concealed emotions. From that, he developed the Facial Action Coding System. “It’s an alphabet of facial expressions,” says Denman. The software can learn, too, so it will become more precise and sophisticated the more often it’s used.
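Emotient hasn't published how its software works, but the basic idea of scoring frames against Ekman's seven basic emotions and reporting the strongest one can be sketched in a few lines. Everything below is a toy illustration under assumed inputs (per-frame confidence scores in [0, 1]); the names, averaging scheme, and output are mine, not Emotient's.

```python
# Toy illustration (not Emotient's actual method): given per-frame
# confidence scores for Ekman's seven basic emotions, average them
# over a clip and report the dominant emotion.
EMOTIONS = ["sadness", "anger", "fear", "disgust", "contempt", "surprise", "joy"]

def dominant_emotion(frames):
    """frames: list of dicts mapping emotion name -> confidence in [0, 1]."""
    if not frames:
        raise ValueError("need at least one frame of scores")
    totals = {e: 0.0 for e in EMOTIONS}
    for frame in frames:
        for e in EMOTIONS:
            totals[e] += frame.get(e, 0.0)
    # Average each emotion over the clip, then pick the highest-scoring one.
    averages = {e: total / len(frames) for e, total in totals.items()}
    return max(averages, key=averages.get)

# Example: two frames in which joy dominates overall.
frames = [
    {"joy": 0.7, "surprise": 0.2},
    {"joy": 0.5, "contempt": 0.3},
]
print(dominant_emotion(frames))  # joy
```

A real system would of course sit on top of a trained facial-action classifier rather than hand-fed scores; this only shows the final labeling step.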
The software works with any reasonably sharp camera and could be used in all kinds of research, as well as in marketing, focus groups, and stores, says Denman. Procter & Gamble is already using it to help with its market research. The problem, Denman says, is that people don't always tell the truth on surveys, or face to face. "People will show contempt or disgust, but they won't tell. Microexpressions are the real truth."
In one example, P&G tested three fragrances for its Tide detergent with a focus group. At the end, everyone answered questions about the samples and was told to choose one of the test products to take home. "The surveys weren't indicative" of which fragrance the respondents actually preferred, Denman says. "Our results were. We get the gut reaction." In this case, the results showed an admittedly not-very-surprising correlation between the product participants chose and the one that provoked the least negative reaction. P&G didn't respond to a request for comment.
Denman says the software could also be used in stores (with cameras placed throughout) to improve customer service. “The worst thing is if customers are angry and no one notices,” he says. “This measures the level of emotion across the store. It’s a level of mathematical evidence. If customers are angry, managers could put more employees on the floor or give out samples.”
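The store scenario Denman describes, aggregating emotion levels across many cameras and alerting a manager when anger spikes, might look something like this in outline. The score format and threshold here are invented for illustration; nothing about Emotient's actual pipeline is public.

```python
# Hypothetical sketch of store-wide emotion monitoring: each camera
# reports a mean anger score in [0, 1] for its field of view; if the
# store-wide average crosses a threshold, flag it for the manager.
ANGER_THRESHOLD = 0.4  # invented value, for illustration only

def store_needs_attention(camera_scores):
    """camera_scores: list of per-camera mean anger scores in [0, 1]."""
    if not camera_scores:
        return False  # no data, no alert
    store_mean = sum(camera_scores) / len(camera_scores)
    return store_mean > ANGER_THRESHOLD

print(store_needs_attention([0.6, 0.5, 0.3]))  # True (mean is about 0.47)
print(store_needs_attention([0.1, 0.2, 0.1]))  # False
```

This is the "level of mathematical evidence" idea reduced to one number; a deployed system would presumably weight cameras, track trends over time, and suppress false alarms.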
Denman says that one of the country’s largest fast-food chains is using the software, though he’s not able to say which one or how the company’s using it. I was disappointed that he couldn’t tell me more. Denman picked up on that right away. One day his emotionally aware software might, too.