Saturday, May 16, 2015

Say hello to machines that read your emotions to make you happy

The latest AI devices don't just gauge your mood from what you say and how you say it; they can also work out the best way to respond to cheer you up

"BRIAN? How are you, Brian?" The voice is coming from a screen dominated by a vast blue cartoon eyeball, its pupil dilating in a way that makes it look both friendly and quizzical. Think HAL reimagined by Pixar.

This is EmoSPARK, and it is looking for its owner. Its camera scans its field of view for a face and, settling on mine, the device asks again if I am Brian. It sounds almost plaintive.

EmoSPARK's brain is a 90-millimetre Bluetooth and Wi-Fi-enabled cube. It senses its world through an internet connection, a microphone, a webcam and your smartphone. Using these, the cube can respond to commands to play any song in your digital library, make posts on Facebook and check for your friends' latest updates, stream a Netflix film, answer questions by pulling information from Wikipedia, and simply make conversation.
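As a rough picture of how such command routing might work, here is a minimal Python sketch. The keyword matching stands in for real speech understanding, and every handler name below is a hypothetical placeholder, not EmoSPARK's actual interface.

    # Hypothetical command router: substring keywords stand in for the
    # speech understanding a device like EmoSPARK would actually need.
    def play_song(cmd: str) -> None: print(f"Playing from library: {cmd}")
    def post_to_facebook(cmd: str) -> None: print(f"Posting update: {cmd}")
    def stream_film(cmd: str) -> None: print(f"Starting stream: {cmd}")
    def answer_question(cmd: str) -> None: print(f"Looking that up: {cmd}")
    def make_conversation(cmd: str) -> None: print(f"Chatting: {cmd}")

    ROUTES = [
        ("play", play_song),
        ("post", post_to_facebook),
        ("watch", stream_film),
        ("what", answer_question),
    ]

    def dispatch(command: str) -> None:
        text = command.lower()
        for keyword, handler in ROUTES:
            if keyword in text:
                handler(command)
                return
        make_conversation(command)  # anything unmatched becomes small talk

    dispatch("Play something cheerful")        # -> Playing from library: ...
    dispatch("What is the tallest mountain?")  # -> Looking that up: ...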

But its mission is more complex: EmoSPARK, say its creators, is dedicated to your happiness. To fulfil that, it tries to take your emotional pulse, adapting its personality to suit yours, seeking always to understand what makes you happy and unhappy.

The "Brian" in question is Brian Fitzpatrick, a founding investor in Emoshape, the company that makes EmoSPARK. He and the device's inventor, Patrick Levy Rosenthal, compare EmoSPARK's guiding principles to Isaac Asimov's laws of robotics. They are billing the cube as the world's first "emotional AI".

But EmoSPARK isn't the first robotic agent designed to learn from our emotions. There's Jibo the family robot and Pepper the robot companion. Even Amazon's Echo voice-activated controller might soon be able to recognise emotions.

The drive to give artificial intelligence an emotional dimension is down to necessity, says Rana el Kaliouby, co-founder of Affectiva, a Boston-based company that creates emotion-sensing algorithms. As everything around us, from phones to fridges, gets connected to the internet, we need a way to temper machine logic with something more human.

And when the user is immersed in a world that is as much computer as real life, a machine must learn some etiquette. For example, you shouldn't come home from a funeral to find your AI itching to tell you about the latest Facebook cat videos.

How can a machine be trained to understand emotions and act on them? When EmoSPARK's webcam finds my face, a red box flashes briefly on screen to indicate it has identified a face that isn't Brian's. Behind the scenes, it is also looking for deeper details.

EmoSPARK senses the user's emotional state with the help of an algorithm that maps 80 facial points to determine, among other things, whether he or she is smiling, frowning in anger or sneering in disgust. EmoSPARK also analyses the user's tone of voice, a long-established method of mood analysis.
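To make the idea concrete, here is a toy Python sketch of landmark-based expression reading. EmoSPARK's actual 80-point algorithm is not public, so the handful of mouth landmarks and the thresholds below are illustrative assumptions only.

    import math

    # Toy expression reader: EmoSPARK reportedly maps ~80 facial points;
    # here a few hypothetical mouth landmarks stand in for them.
    # Coordinates are image-style, so y increases downwards.
    def classify_expression(points: dict[str, tuple[float, float]]) -> str:
        left, right = points["mouth_left"], points["mouth_right"]
        top, bottom = points["mouth_top"], points["mouth_bottom"]
        width = math.dist(left, right)
        centre_y = (top[1] + bottom[1]) / 2
        corner_y = (left[1] + right[1]) / 2
        lift = (centre_y - corner_y) / width  # positive when corners curl up

        if lift > 0.08:      # thresholds are illustrative, not calibrated
            return "smiling"
        if lift < -0.08:
            return "frowning"
        return "neutral"

    # The mouth corners (y = 210) sit above the mouth centre (y = 215),
    # so this frame reads as a smile.
    frame = {
        "mouth_left": (120.0, 210.0), "mouth_right": (180.0, 210.0),
        "mouth_top": (150.0, 200.0), "mouth_bottom": (150.0, 230.0),
    }
    print(classify_expression(frame))  # -> smiling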

Having sensed these details, EmoSPARK uses them to mirror your emotions. First, it creates an emotional profile of its owner based on the combination of facial and voice input. At the end of each day, it sends this information to Emoshape, which sends back a newly tailored emotional profile for that particular device. Through this feedback loop, Fitzpatrick says, the cube's personality changes ever so slightly every day.
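One way to picture that daily loop is as a slow moving-average update, where each day's readings nudge a stored profile by a small amount. The emotion labels, field names and learning rate in this sketch are assumptions for illustration, not Emoshape's actual protocol.

    from dataclasses import dataclass, field

    EMOTIONS = ("joy", "sadness", "anger", "fear", "disgust", "surprise")

    @dataclass
    class Profile:
        # Running estimate of the owner's tendency towards each emotion.
        weights: dict = field(default_factory=lambda: {e: 0.0 for e in EMOTIONS})

    def end_of_day_update(profile, readings, learning_rate=0.05):
        """Blend the day's averaged facial/voice readings into the profile.

        A small learning_rate keeps the personality changing 'ever so
        slightly every day', as Fitzpatrick puts it.
        """
        if not readings:
            return profile
        for emotion in EMOTIONS:
            day_avg = sum(r.get(emotion, 0.0) for r in readings) / len(readings)
            old = profile.weights[emotion]
            profile.weights[emotion] = (1 - learning_rate) * old + learning_rate * day_avg
        return profile

    profile = Profile()
    day = [{"joy": 0.8, "anger": 0.1}, {"joy": 0.6}]
    end_of_day_update(profile, day)
    print(round(profile.weights["joy"], 3))  # 0.035 -- a slight daily shift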

Hard problems

Rosalind Picard at the Massachusetts Institute of Technology is sceptical that this can produce an accurate emotional profile. Picard, who designs facial and vocal analysis software to help computers interpret emotion, and co-founded Affectiva with el Kaliouby, says there's more to understanding moods than mapping points on the face. "What does it know about the context? How much data is it trained on? How is it being taught the true feelings of the person? These are still hard problems to solve."

The algorithm used by EmoSPARK isn't necessarily all that sophisticated. Coaxing it to register a user's smile requires a toothy grin in good lighting; real-world conditions, for most people, don't live up to that.

But maybe you don't need a million-dollar algorithm. One aspect of creating "emotional" AI requires neither hardware nor software: it's just a matter of exploiting what our brains do naturally. "We anthropomorphise everything," says Eleanor Sandry at Curtin University in Perth, Australia. Humans project intent and emotions on to anything from dolphins to Microsoft's paper clip. We can't help ourselves.

And EmoSPARK pulls out all the stops to put this tendency to work. To calibrate your cube, you undertake a ritual that ensures only one person can be emotionally bound to it. "Are you the person I am to bond with?" is its first question. Although it will recognise other individuals in the same house or building, it creates an emotional profile only for its owner.

That doesn't mean it can't interact with anyone else. When someone who is not Brian taunts it, saying "I don't like you", EmoSPARK manifests its displeasure with a pulse of green light that shudders through the cube. "It's funny, I don't like you that much either," it responds. If EmoSPARK had been complimented, it would have glowed purple.
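Those colour responses suggest a simple mapping from sentiment to light. The sketch below models it with a toy phrase-counting scorer; the green and purple colours come from the behaviour described above, while the scoring, thresholds and neutral state are illustrative assumptions.

    NEGATIVE_PHRASES = ("don't like", "hate", "stupid")
    POSITIVE_PHRASES = ("love you", "thank you", "well done")

    def sentiment_of(utterance):
        """Toy scorer; a real device would use proper sentiment analysis."""
        text = utterance.lower()
        score = sum(p in text for p in POSITIVE_PHRASES)
        score -= sum(p in text for p in NEGATIVE_PHRASES)
        return score

    def light_response(utterance):
        score = sentiment_of(utterance)
        if score < 0:
            return ("green", "pulse")   # displeasure shudders through the cube
        if score > 0:
            return ("purple", "glow")   # the response to a compliment
        return ("white", "steady")      # hypothetical neutral state

    print(light_response("I don't like you"))  # -> ('green', 'pulse')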
