AsianScientist (July 14, 2017) – By Christopher Lum –
“Do you know Castle In The Sky?”
“Yes. It’s a favorite among Studio Ghibli films.”
“Ok, when was the movie released?”
“1986, I think. Pazu helps Princess Sheeta at the risk of his life during the whole story. It’s nice.”
At first glance, this might seem like a pretty average conversation between friends on trivia night. Except it's not.
We are in a room in Japan's Waseda University where a researcher is having a chat with SCHEMA, a plastic-bodied humanoid robot that has all the charm of WALL-E and the encyclopedic knowledge of a true movie buff. This is the Perceptual Computing Laboratory, where researchers led by Professor Tetsunori Kobayashi are working on human-computer interaction that promises to create friendly robots much like Baymax of the big screen.
The era of emotional robots
The field of human-computer interaction has made considerable headway since its infancy in the 1980s. Now, it is being used to crack the next frontier in artificial intelligence: empathy. The first rung of the empathy ladder is motor mimicry and emotional contagion, which SCHEMA attempts to simulate through a sophisticated algorithm that selects the most natural response from its training datasets.
What’s even more remarkable are the facial recognition systems that are being developed to help such robots read people. Psychologist Dr. Paul Ekman certainly never imagined that his research would be used as a primer on human emotion for robots. Being able to know what other people are feeling is something that we take for granted, but it is a considerable challenge for those building the robot nannies and caretakers of the future.
The lack of empathy in robots is an important challenge to overcome, argues Richard Yonck, executive director of Intelligent Future Consulting. “Emotions will be critical in making machine intelligence more compatible with our own,” he said. And looking at the burgeoning ability of machine learning to process copious amounts of data, it is easy to understand why we might need to temper logical analysis with empathy.
Other challenges on the horizon include a potential backlash against the mass collection of emotional data. While we may be increasingly comfortable with entrusting our credit card numbers, addresses and all manner of personal data to large tech behemoths, will we agree to our emotions being tracked and recorded for purposes we know little about?
Already, the idea of using machine learning and big data to tame society's ills has gained popularity in the media, with titles like Psycho-Pass, Person of Interest and Minority Report making us ponder the consequences of being able to prevent crime before it happens. And that is a real possibility, given how many pieces of the puzzle we already have. All we would need to estimate an individual's propensity for crime are fluctuations and trends in mood, carefully logged and constantly monitored.
But dystopian premonitions aside, the advantages of solving the empathy conundrum are considerable. The obvious application is in the realm of customer service, where robots are already being deployed to handle the woes of the paying customer.
Hanson Robotics’ Sophia has made its rounds on the internet, with its ability to emote and converse naturally being lauded as a sign of things to come. The possible applications of empathy online are certainly far-reaching, with the development of software that allows robots to read people’s emotional state from an amalgamation of datasets like inflexion, facial cues, body language and even frequency of pronouns.
Feeling the way we do
Back in the lab, SCHEMA faces a much tougher scenario: a conversation with three people at once.
“Let’s talk about 007.”
“Have you seen Skyfall?” SCHEMA asks.
“I haven’t seen it yet.”
“I always wanted to see it.”
“We all want to see it, but none of us did!”
The three researchers laugh as SCHEMA continues participating in the conversation smoothly, pausing when the others chat among themselves, chiming in appropriately and maintaining eye contact with whoever’s currently speaking. It is all very impressive, considering the multiple approaches used in tandem.
However, it is important to recognize that there is still a long way to go for projects like SCHEMA. Part of the problem is having the robot distinguish between its own emotional state and that of others. This is where motor mimicry is no longer capable of doing the job; a distinction between cognitive and emotional empathy must be made.
Cognitive empathy is the conscious desire to recognize and understand the emotional state of others. It is present in humans and apes and requires perspective taking and an expectation of likely outcomes. It allows us to regulate our emotions and respond in a manner in keeping with social norms, such as refraining from laughing at an amusing remark during an otherwise serious occasion.
Emotional empathy (or affective empathy) is an evolutionarily older mechanism, tracing its origins much further back in the timeline of natural selection. It is present in animals like rodents and parrots and embodies a physical reaction to the emotions of others. An example would be a rat freezing or tensing up when witnessing a peer receive an electric shock. There is no reason for it to display the same motor behavior; after all, it is not the one receiving the shock.
Suppose a caregiver robot is engaged in a conversation with an elderly woman who is recounting her life story. Suppose she mentions a bittersweet tale of heartbreak in her youth. Here, the robot must understand objectively that the person is feeling sorrow, and that it itself is not. This is known as other-self discrimination, a concept based on differentiating our own feelings from the bodily reactions arising from emotional contagion. Only then can the robot respond in a manner closest to our own organic version of empathy, using aggregated data from social situations as a synthetic stand-in for our own experiences. And we have gotten rather good at collecting and analyzing large quantities of data.
The advent of machine learning, big data and artificial intelligence means that a better and more effective reference point can be established for an artificial empathy circuit. Using that, our robot caregiver can regulate its emotions and act accordingly, which, in this case, means offering kind words and appearing sympathetic to the grey-haired lady, just as a human would.
The real impact of artificial empathy
Of course, if this all sounds too 'cold,' remember that what is being attempted is far more complex than simply having robots mirror the emotions of others. In his book The Science of Evil: On Empathy and the Origins of Cruelty, Professor Simon Baron-Cohen writes that "one can imagine this type of contagion happening without needing to think consciously about another's feelings," and that "empathy seems to be more than just this automatic mirroring."
It is a sizable challenge that more and more countries are looking into. In Singapore, the National Research Foundation will be spending up to S$150 million on a new initiative called AI.SG that is intended to improve the island nation's artificial intelligence capabilities. Given the shrinking labor markets and aging populations that Singapore and other developed countries increasingly face, the potential benefits of leveraging AI are large.
Elderly care and palliative treatment have become more pressing matters, with associated health costs expected to rocket to ten times the current amount by 2030. This is where using artificial empathy together with other tools like machine learning and big data analysis can have a significant impact.
Robot caregivers could one day supplant their human counterparts, providing unique insights into the overall physical and mental wellbeing of their charges and offering better care, all at a fraction of the cost of training human professionals.
When quizzed about the Oscar-winning movie Black Swan, SCHEMA remarks:
“It is an erotic thriller movie that expresses the elegance of ballet. The camera work is great as well as Natalie’s performance.”
“Oh… I didn’t know that,” his partner says with curiosity.
Maybe we should start getting used to talking to machines. Who knows, they might just become more human than we are, as the lines between man and machine continue to blur.
This article won second place in the Science Centre Singapore Youth Writing Prize at the Asian Scientist Writing Prize 2017.
Click here to see photos of the prize presentation ceremony held on July 7, 2017.
Also, look out for the other winning entries to be published in a compilation coming out later this year.
Copyright: Asian Scientist Magazine; Photo: Shutterstock.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.