Helping Robots Put Their Best Face Forward

A research group at Osaka University, Japan, has found a way to improve the expressiveness of robots’ faces.

AsianScientist (Nov. 26, 2018) – A robot named Affetto is helping researchers increase the diversity and accuracy of facial expressions in machines. They report their findings in Frontiers in Robotics and AI.

Robots are unable to mimic the huge range and asymmetry of natural human facial movements. The materials used to make the ‘skin’ of robots, as well as the intricate engineering and mathematics that drive robotic motion, need to be improved if more expressive robots are to be realized.

In the present study, a trio of researchers at Osaka University, Japan, has developed a method to make their robot express greater ranges of emotion on its face.

“Surface deformations are a key issue in controlling android faces,” said study co-author Professor Minoru Asada of Osaka University. “Movements of their soft facial skin create instability, and this is a big hardware problem we grapple with. We sought a better way to measure and control it.”

The researchers investigated 116 different facial points on Affetto to measure its movements and expressions in three dimensions. The facial points are underpinned by so-called deformation units. Each unit comprises a set of mechanisms that create a distinctive facial contortion, such as lowering or raising part of a lip or eyelid.
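As an illustration only, a deformation unit can be thought of as a named actuator channel paired with the 3D facial points it influences. The names, groupings and data layout below are hypothetical; the study's actual representation is not described in this article.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class DeformationUnit:
    """Hypothetical sketch of one deformation unit: a named actuator
    command channel and the measured facial points it moves."""
    name: str                                            # illustrative label, e.g. "upper_lip_raise"
    point_ids: list[int] = field(default_factory=list)   # indices into the 116 measured points

# 116 facial points, each measured in three dimensions (x, y, z) for one command setting
measured_points = np.zeros((116, 3))

# Illustrative units; the real robot's units and their point assignments are not specified here
units = [
    DeformationUnit("upper_lip_raise", point_ids=[10, 11, 12]),
    DeformationUnit("eyelid_lower",    point_ids=[40, 41]),
]
```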

Measurements from the deformation units were then fitted to a mathematical model that quantified their surface motion patterns. Using this system, the researchers were able to adjust each deformation unit for precise control of Affetto’s facial surface motions.
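A rough sketch of the general idea, relating actuator commands to measured surface displacements, is shown below. It assumes a simple linear model fitted by ordinary least squares on synthetic data; the study's actual model and calibration procedure are more involved, and all variable names here are hypothetical.

```python
import numpy as np

# Suppose each trial records a command vector (one entry per deformation unit)
# and the resulting displacements of the 116 facial points, flattened to 348 values.
# Stacking trials row-wise gives U (trials x units) and D (trials x 348).
rng = np.random.default_rng(0)
U = rng.uniform(0.0, 1.0, size=(50, 8))   # 50 hypothetical trials, 8 hypothetical units
D = U @ rng.normal(size=(8, 348))         # synthetic displacement data for this demo only

# Fit a linear map W so that U @ W approximates D (ordinary least squares).
W, *_ = np.linalg.lstsq(U, D, rcond=None)

# Invert the fitted map to choose commands that approximate a desired surface motion.
target_motion = rng.normal(size=348)
commands, *_ = np.linalg.lstsq(W.T, target_motion, rcond=None)
```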

“Android robot faces have persisted in being a black box problem: they have been implemented but have only been judged in vague and general terms,” said study first author Assistant Professor Hisashi Ishihara of Osaka University. “Our precise findings will let us effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning.”

The article can be found at: Ishihara et al. (2018) Identification and Evaluation of the Face System of a Child Android Robot Affetto for Surface Motion Design.

———

Source: Osaka University.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.

Asian Scientist Magazine is an award-winning science and technology magazine that highlights R&D news stories from Asia to a global audience. The magazine is published by Singapore-headquartered Wildtype Media Group.
