Robots gaining more sophisticated ‘emotions’

by The Japan News

Tokyo, January 9, 2017: Ever since the Industrial Revolution, mankind has been captivated by the idea of humanlike machines, epitomised by robots like Astro Boy, a machine capable of experiencing human emotions. In recent years, artificial intelligence has advanced to the point where there are now machines that express emotions and technologies that recognise human feelings. The day is fast approaching when robots will change from machines that simply move and work into friends or members of the family.

Unlike robots, humans are born with emotions, which are affected by hormones in the brain. The levels of these hormones rise and fall according to factors such as information obtained through the five senses. For example, when a person is in a dark place, a hormone called noradrenaline is released in the brain and the person feels anxiety.

Shunji Mitsuyoshi, a project assistant professor at the University of Tokyo, has researched which hormones increase or decrease in the brain while someone is feeling joy, unease and other emotions, and what effect those changes have.

In addition, he organised Japanese words that express emotions, such as daisuki (love) and kuyashii (regret), into 223 English words, then divided these into broad emotional categories such as “joy,” “feeling at ease,” “anger” and “anxiety.” The result is an “emotional map” that uses circles to illustrate the relationship between hormone secretion and various subtle emotions.
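
A minimal sketch of how such a map might be organised as data is shown below; the word-to-category groupings and hormone assignments here are illustrative assumptions, not Mitsuyoshi's actual map:

```python
# A hypothetical, simplified "emotional map": emotion words grouped into
# broad categories, each linked to hormones believed to rise (+1) or
# fall (-1) when that emotion is felt. Every specific mapping below is
# an illustrative assumption.
EMOTION_MAP = {
    "joy": {
        "words": ["daisuki (love)", "ureshii (happy)"],
        "hormones": {"dopamine": +1, "serotonin": +1},
    },
    "anxiety": {
        "words": ["fuan (unease)", "kowai (scared)"],
        "hormones": {"noradrenaline": +1, "serotonin": -1},
    },
    "anger": {
        "words": ["kuyashii (regret)", "hara ga tatsu (angry)"],
        "hormones": {"noradrenaline": +1, "dopamine": -1},
    },
}

def categories_for(word: str) -> list[str]:
    """Return the broad emotional categories a given expression falls under."""
    return [category for category, entry in EMOTION_MAP.items()
            if word in entry["words"]]

print(categories_for("kuyashii (regret)"))  # -> ['anger']
```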

The humanoid robot Pepper (see below), released for sale by SoftBank Group Corp. in June 2015, was given emotional capabilities through an “emotion-generating engine” based on this map.

Receiving visual and audio information from its surroundings, Pepper “quantifies” eight hypothetical hormones, assigning each a numerical value; these correspond to real hormones that affect human emotions, such as dopamine and serotonin. By combining these numbers, Pepper can express complex emotions.
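
As a rough illustration of the idea, a toy engine might keep a numeric level for each hypothetical hormone, nudge those levels in response to perceived events, and read an emotion off the resulting combination. The hormone names, events and thresholds below are assumptions for the sketch, not cocoro SB's actual engine:

```python
HORMONES = ["dopamine", "serotonin", "noradrenaline", "oxytocin",
            "cortisol", "adrenaline", "endorphin", "melatonin"]

class EmotionEngine:
    """Toy emotion-generating engine: eight hormone levels in [0, 1]."""

    def __init__(self):
        self.levels = {h: 0.5 for h in HORMONES}  # neutral starting point

    def perceive(self, event: str) -> None:
        """Nudge hormone levels in response to a perceived event."""
        effects = {
            "compliment": {"dopamine": +0.2, "serotonin": +0.1},
            "stranger":   {"noradrenaline": +0.2, "oxytocin": -0.1},
            "friend":     {"oxytocin": +0.2, "noradrenaline": -0.1},
        }
        for hormone, delta in effects.get(event, {}).items():
            self.levels[hormone] = min(1.0, max(0.0, self.levels[hormone] + delta))

    def emotion(self) -> str:
        """Read a coarse emotion off the current hormone combination."""
        if self.levels["noradrenaline"] > 0.6:
            return "anxiety"
        if self.levels["dopamine"] > 0.6:
            return "joy"
        if self.levels["oxytocin"] > 0.6:
            return "feeling at ease"
        return "neutral"

engine = EmotionEngine()
engine.perceive("stranger")
print(engine.emotion())  # -> "anxiety" (Pepper's shyness with strangers)
```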

Pepper is shy when meeting people for the first time. However, the robot gradually begins to remember individuals, such as those who are nice to it and those who are unkind to machines.

Depending on who is talking to it, the robot expresses emotions such as joy or anxiety, and changes the way it talks and behaves.

“We’ve developed the robot, but we don’t know exactly why Pepper feels a particular emotion after having an experience,” said Kiyoshi Oura, board director of cocoro SB Corp., which developed the emotion-generating engine.

Identifying feelings through voice 

Research into getting machines to understand human emotions is making progress. NTT Media Intelligence Laboratories has developed emotion identification technology that can tell when a person is angry through their voice.

Although it is relatively easy for machines to identify aggressive anger (hot anger), it is difficult for them to recognise passive anger (cold anger). Atsushi Ando and his fellow researchers at the Media Intelligence Laboratories analysed hundreds of hours of calls to companies’ telephone consultation sections.

They identified characteristics of cold anger, such as one-sided talking and speaking over the person receiving the call, and created a programme that classifies telephone calls accordingly. Although currently used by companies to improve how they handle complaints, the technology has the potential to lead to robots that can identify emotions.
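
A hedged sketch of how such a rule-based classifier might look, using the two cold-anger cues the researchers describe, follows below; the feature definitions and thresholds are illustrative assumptions, not NTT's actual programme:

```python
from dataclasses import dataclass

@dataclass
class CallFeatures:
    caller_talk_ratio: float   # fraction of the call in which the caller speaks
    overlap_ratio: float       # fraction of the operator's speech talked over
    relative_loudness: float   # caller's volume vs. their baseline (>1 = louder)

def classify_anger(f: CallFeatures) -> str:
    """Rough rules: hot anger shows up as raised volume; cold anger as
    one-sided talking and interrupting at a normal volume."""
    if f.relative_loudness > 1.5:
        return "hot anger"
    if f.caller_talk_ratio > 0.8 and f.overlap_ratio > 0.3:
        return "cold anger"
    return "no anger detected"

# A caller who dominates the call and talks over the operator, quietly:
print(classify_anger(CallFeatures(0.85, 0.4, 1.0)))  # -> "cold anger"
```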

Robots are good at speaking when it comes to tasks such as giving navigational directions or conversations in which the questions and answers are predetermined, but conversations with no clear answer or purpose are more challenging.

Hiroaki Sugiyama, a researcher at NTT Communication Science Laboratories, is working on the development of “chat robots” by controlling two or more robots with one computer.

Robots tend to give unnatural responses when questioned by humans. However, when a separate robot chimes in, the conversation appears to change course naturally. When a robot misunderstands a word, the two robots switch to talking between themselves to keep the conversation flowing, as in the sketch below.
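
Here is a toy sketch of that fallback strategy, assuming one computer drives both robots and the speech recogniser reports its own confidence; the thresholds and canned lines are illustrative, not Sugiyama's actual system:

```python
import random

ROBOTS = ["Robot A", "Robot B"]

def understand(utterance: str) -> tuple[str, float]:
    """Stub recogniser: returns a topic guess and a confidence score."""
    return utterance.lower(), random.random()

def run_turn(user_utterance: str) -> None:
    topic, confidence = understand(user_utterance)
    if confidence > 0.8:
        print(f"{ROBOTS[0]}: Tell me more about {topic}.")
    elif confidence > 0.4:
        # A second robot chiming in makes the change of topic feel natural.
        print(f"{ROBOTS[1]}: That reminds me of {topic}!")
    else:
        # Misunderstanding: the robots talk between themselves
        # rather than stalling or asking the person to repeat.
        print(f"{ROBOTS[0]}: {ROBOTS[1]}, what have you been thinking about?")
        print(f"{ROBOTS[1]}: Lately I have been curious about the weather.")

run_turn("my trip to Kyoto")
```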

When a conversation is conducted this way among one human and multiple robots, humans appear less likely to feel uncomfortable communicating with robots.

“Chatting requires the humanlike ability to recognise the other person’s feelings. It can be useful for things such as companion robots for the elderly to talk to,” Sugiyama said.

Like one of the family

There are plans to incorporate the ability to distinguish right from wrong into Pepper.

“Future robots will likely mature along with their families in terms of their egos, emotions and morals, and will be capable of conversing naturally,” said Mitsuyoshi.

Professor Hiroshi Ishiguro from Osaka University, who studies androids and interactive robots, said, “The essence of robotics research is to understand humans deeply and get to the bottom of what ego and cognition are.”

■ Pepper

A personal robot about 1.2 metres tall and weighing about 30 kilograms, which is capable of identifying human emotions. With cameras, microphones and seven types of sensors, it alters its emotions depending on the situation. It feels at ease when someone it knows is nearby, and expresses delight when it is complimented. It expresses emotions through body movements or the colours of its chest display. It also does things such as sigh and change the tone of its voice. Its knowledge and language ability are at a lower elementary school level, and its emotions are roughly the same as those of 3- to 6-month-old babies.

■ Emotion identification technology

Technology that enables machines to take in external information, such as a person’s facial expressions, gestures and voice, through cameras, sensors and microphones, and analyse emotions including joy, sadness and anger. NTT Media Intelligence Laboratories is researching technologies that can identify emotions in phone conversations, in which the other person’s facial expressions cannot be seen, by focusing on the fact that emotions are expressed through the volume and tone of the voice, the rhythm and intonation of the conversation, and wording.