The British aren't usually known for displaying their emotions. But they may start revealing their moods—and much more—thanks to a University of Cambridge scientist. Professor Peter Robinson has developed a "mind-reading" computer that can interpret reactions and feelings by analyzing a person's facial movements.
Developed in conjunction with researchers at the Massachusetts Institute of Technology, the computer uses a camera to capture people's facial expressions and then applies sophisticated pattern-matching technology to recognize emotions ranging from confusion to concentration.
The potential applications go well beyond an interesting experiment. Automakers, online retailers, and teachers are interested in the potential commercial and educational benefits of the mind-reading computer, which could enable the use of more personalized and adaptive products, services, and learning experiences. It could even be a boon to people with autism or Asperger syndrome, by helping them interpret the emotions of others.
To be sure, some critics find the whole idea pretty creepy. "The technology itself raises privacy issues relating to the increased collection of personal information," says Gary Ruskin, the executive director of Commercial Alert, a U.S.-based group that monitors the relationship between civil society and commercialism. He adds that computer interpretation of emotions could lead to 1984-type scenarios for marketers and governments.
Still, that's a ways off. Computers that are truly emotionally aware are still more science fiction than fact. But Robinson is confident that his machine can correctly gauge a limited range of human feelings. "The computer is 85% accurate when analyzing data provided by actors," Robinson said while unveiling the computer at the Royal Society in London in early July.
At present, the machine is only correct 65% of the time when evaluating the emotions of regular people in real-world situations. Robinson and his team plan to improve this figure by creating a database of everyday events that the computer can interpret. The key to the computer's swami-like powers is its ability to identify 24 facial feature movements—from a raised eyebrow to a furrowed brow—that, when combined, allow it to identify a person's mood.
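The idea of combining individual facial movements into a mood judgment can be sketched roughly as follows. This is a minimal illustration, not the Cambridge system: the feature names, weights, and mood labels are invented for the example, and the real system uses far more sophisticated pattern matching than a weighted sum.

```python
# Hypothetical sketch: combine detected facial-feature movements into mood
# scores. All feature names and weights below are illustrative assumptions.

FEATURE_WEIGHTS = {
    "confusion":     {"furrowed_brow": 0.6, "head_tilt": 0.3, "lip_press": 0.1},
    "concentration": {"narrowed_eyes": 0.5, "furrowed_brow": 0.3, "stillness": 0.2},
    "interest":      {"raised_eyebrow": 0.5, "lean_forward": 0.3, "smile": 0.2},
}

def score_moods(observed):
    """observed: dict mapping feature name -> detected intensity in [0, 1]."""
    return {
        mood: sum(w * observed.get(feat, 0.0) for feat, w in weights.items())
        for mood, weights in FEATURE_WEIGHTS.items()
    }

def best_mood(observed):
    """Return the mood whose weighted feature evidence is strongest."""
    scores = score_moods(observed)
    return max(scores, key=scores.get)

# One video frame's detected movements (intensities are made up):
frame = {"furrowed_brow": 0.8, "head_tilt": 0.5, "narrowed_eyes": 0.2}
print(best_mood(frame))  # -> confusion
```

In practice a single frame is ambiguous, which is one reason accuracy drops from 85% with actors to 65% with ordinary people; real systems accumulate evidence over time rather than judging frame by frame.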
Robinson is excited about the practical applications of his invention, and is working with several companies to transform the mind-reading computer from laboratory experiment into a money-making reality.
In particular, he is in talks with an unnamed Japanese car manufacturer to incorporate the technology over the next five years into future models. With a camera built into the dashboard, the car could monitor a driver's emotional state and react to unforeseen problems. For instance, if a driver started to nod off, the car could emit a loud noise to awaken him.
Connected to a car's satellite navigation, the system could provide directions and adapt itself to how a person is feeling. "The system has the potential to prevent accidents by improving driver safety," Robinson says.
Automakers are also interested in integrating Robinson's computer into their new models to enhance the driving experience: When you're sad, the car could play happy music, or if you're confused, it could turn off distracting gadgets.
Robinson's isn't the only such technology automakers are investigating. Toyota is reportedly working with researchers from Stanford University and Edinburgh-based Affective Media to incorporate mood sensitivity into cars. By monitoring the tone of drivers' voices, the technology senses if drivers are stressed, angry, or sleepy—and reacts, for instance, by playing soothing music or proposing better routes to escape traffic.
At this point, Robinson concedes, his Japanese partner is focusing more on the invention's aesthetic qualities than on safety benefits. "It would be more a prestige thing," he says. "An extra fancy gadget" to be included on luxury cars.
Robinson also has gotten inquiries from online retailers and computer service providers, such as IBM, that envision tailoring their products to the emotional state of consumers. As you surf the Web, for instance, your computer could determine whether you liked certain products and then modify content to your individual tastes or alter advertising to fit your mood.
The use of Robinson's emotionally aware technology to improve company sales represents the latest advance in neuromarketing—the study of the brain's response to marketing to measure consumer preferences. "Neuromarketing can help predict what products people are going to choose," said Dr. Gemma Calvert, director of Neurosense, a British consulting firm.
Another application for the mind-reading computer is as an "emotional hearing aid" to help people with autism and Asperger syndrome, who have difficulty reading others' emotions. Robinson's MIT partners are designing a prototype headset that informs the wearer of people's moods, and are currently improving its accuracy by recording individuals' reactions to everyday events.
Robinson is keenest about his invention's applications in online teaching—an area he has been focusing on in his Cambridge lab. By analyzing students' emotions, a computer could calculate whether they understand what is being taught and then tailor lesson plans to improve comprehension.
Scenarios such as these invariably raise questions about privacy and a Big Brother-like society. Commercial Alert's Ruskin charges that the technology could permit marketers and governments to gather information about people without their knowledge. It's "a chilling thought," he says. Of course, people could cover up the cameras on their own computers, but escaping the watchful eye of a discreet public camera could be more difficult.
The Cambridge professor is adamant that his mind-reading computer won't discern someone's innermost thoughts. Instead, he says it will let computers adapt to users' needs by monitoring their moods. In the future, Robinson's machine may even shed some light on the emotional state of the British.