If you smile at it, “EVA” responds with a friendly expression: researchers have developed a robot that can capture and imitate the facial expressions of nearby people using artificial intelligence and advanced fine motor skills. The concept is another step towards making “technical beings” appear more human, with the goal of making interaction with them more pleasant.
Joy, friendliness, curiosity…: many emotions and messages are reflected in people’s complex facial expressions – they are an important part of our nonverbal communication. When facial expression is missing, faces seem eerie and hard to read to us. Up until now, this has largely been the case with robot faces – their often rigid features barely make them seem human. Scientists at the Creative Machines Lab at Columbia University in New York argue that, given that these technical systems are increasingly becoming points of contact for people in many areas of life, there is a need for improvement. They have now presented their progress in this field at the IEEE 2021 ICRA International Conference on Robotics and Automation in Xi’an, China.
“The idea for EVA took shape a few years ago, when my students and I began to notice that we felt uncomfortable because the robots in our lab were staring at us,” says the head of the research group, Hod Lipson. He also noticed the desire for more human features in robotic systems in grocery stores, where employees had given name badges to the restocking robots or even fitted them with caps. “People seemed to want to humanize their technical colleagues in this way. That gave us the idea to build a robot with a highly expressive and responsive human face,” says Lipson.
Fine motor skills and artificial intelligence
As he and his colleagues report, creating a convincing robot face that approaches the human model turned out to be a very challenging task. The first hurdle was building hardware with very fine motor skills, because our facial expressions rest on a complex interplay of more than 42 small muscles that attach to the skin and bones of the human face at different points. “It was a particular challenge to design a system that fits within the proportions of a human head but is at the same time powerful enough to generate a wide range of facial expressions,” says co-author Zanwar Faraj.
Using 3D-printed components and sophisticated systems of pull cables and micro motors, the developers finally succeeded in giving an artificial face, covered with a plastic skin, expressions increasingly similar to those of a human face. “One day I caught myself smiling reflexively when EVA smiled back at me,” says Lipson. The researchers report that the robot face can now convey the six basic emotions of anger, disgust, fear, joy, sadness and surprise, as well as a number of more subtle expressions.
Once they were satisfied with EVA’s mechanics, the researchers turned to their second goal: programming the artificial intelligence that controls EVA’s facial movements. The robot face should be able to read the expressions of nearby human faces and then reflect them. EVA’s “brain” is equipped with so-called deep learning neural networks. As the researchers explain, this system had to provide two abilities: first, EVA must learn to use its complex system of mechanical muscles to produce a specific facial expression; second, it needs the ability to automatically capture a human facial expression in order to mirror it.
EVA learns to mirror facial expressions
To teach the system how its own face looks and works, the scientists filmed EVA for hours as it generated a series of random facial expressions. The system’s internal neural network then learned to associate muscle movements with the image data of its own face – much like a person watching themselves in a mirror. In this way, EVA developed a sense of how its own face works. The artificial intelligence was then able to learn to match this self-image with recordings of human faces captured by a video camera. EVA thus finally gained the ability to read human facial expressions and respond to them by imitating them.
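The two-stage procedure described above – learn a self-model from random “babbling” expressions, then invert it to reproduce an expression seen on camera – can be sketched in miniature. The following Python example is an illustrative toy, not the team’s actual implementation: it stands in for EVA’s face with a simple linear motor-to-landmark mapping, and all names, dimensions, and the linear simplification are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for EVA's real face: an unknown mapping from
# motor commands (cable tensions) to observed facial landmark positions.
N_MOTORS, N_LANDMARKS = 6, 12
true_map = rng.normal(size=(N_LANDMARKS, N_MOTORS))

def observe_face(motor_cmd):
    """'Camera' view of the face produced by a motor command."""
    return true_map @ motor_cmd

# Stage 1 (self-modelling): generate random expressions and record
# (motor command, observed face) pairs, like watching itself in a mirror.
babble_motors = rng.uniform(-1, 1, size=(500, N_MOTORS))
babble_faces = babble_motors @ true_map.T

# Fit the self-model: which motor pattern produces which face?
self_model, *_ = np.linalg.lstsq(babble_motors, babble_faces, rcond=None)

# Stage 2 (mimicry): given a target human expression seen on camera,
# invert the learned self-model to find motor commands reproducing it.
target_face = observe_face(rng.uniform(-1, 1, size=N_MOTORS))
mimic_motors = np.linalg.pinv(self_model.T) @ target_face
mimic_face = observe_face(mimic_motors)

print(np.allclose(mimic_face, target_face, atol=1e-6))  # -> True
```

In the real system, nonlinear deep networks take the place of both the linear self-model and its inversion, and the “camera” input is an actual video stream rather than a simulated landmark vector.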
To some, this might seem even eerier than a rigid robot face – but that may just be a matter of habit. “Our brains seem to respond well to robots that have some kind of recognizable physical presence,” says Lipson. The researchers also point out that EVA is so far an experimental project and that its performance is still far from the complex way in which people use facial expressions to communicate with one another. However, they are convinced that such technologies may one day prove helpful in making interaction with robotic systems feel more pleasant. “Robots are intertwined with our lives in a growing number of ways,” says co-author Boyuan Chen. “So building trust between humans and machines is becoming more and more important.”
Source: Columbia University’s School of Engineering and Applied Sciences, presentation at IEEE 2021 ICRA International Conference on Robotics and Automation