Engineers from Columbia University in New York have developed Emo, a silicone-covered robotic face equipped with artificial intelligence that lets it match its expression to that of the person it is interacting with. Mounted one day on a full humanoid robot, it could help such machines interact as naturally as possible with humans.
Thanks to the rise of AI, robots have made enormous progress in recent months in voice communication. The next challenge is facial expression, a key step toward making them genuinely sociable. In the laboratory, Emo can already make eye contact and uses two artificial intelligence models to reproduce a person's smile before they actually produce it: the first predicts human facial expressions by analyzing subtle changes in the face, and the second generates the motor commands that adjust the robot's expression accordingly. The idea is to anticipate people's reactions so that the robot always wears the right expression and thereby earns their trust.
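To make that two-model architecture concrete, here is a minimal sketch of such a pipeline. Every class, function, and number other than the 26 actuators and the roughly 840-millisecond lead mentioned in the article is a hypothetical stand-in; Emo's actual software is not described in detail here.

```python
# Hypothetical sketch of a two-model face-mirroring pipeline:
# model 1 anticipates the person's upcoming expression from recent camera frames,
# model 2 converts that predicted expression into motor commands for the robot's face.

from dataclasses import dataclass
import numpy as np

N_LANDMARKS = 68   # assumed number of tracked facial landmarks
N_ACTUATORS = 26   # the article mentions 26 actuators in Emo's head

@dataclass
class PredictedExpression:
    landmarks: np.ndarray   # predicted landmark positions, shape (N_LANDMARKS, 2)
    lead_time_s: float      # how far ahead the prediction looks (~0.84 s per the article)

class ExpressionPredictor:
    """Placeholder for the model that anticipates human facial expressions."""
    def predict(self, recent_frames: list[np.ndarray]) -> PredictedExpression:
        # A real model would analyze subtle facial changes across the window;
        # here we simply echo the most recent frame for illustration.
        return PredictedExpression(landmarks=recent_frames[-1], lead_time_s=0.84)

class MotorController:
    """Placeholder for the model that turns an expression into motor commands."""
    def commands_for(self, expression: PredictedExpression) -> np.ndarray:
        # Toy mapping from landmark positions to actuator targets in [0, 1].
        flat = expression.landmarks.flatten()
        targets = np.resize(flat, N_ACTUATORS)
        return np.clip(targets, 0.0, 1.0)

def control_step(frames: list[np.ndarray],
                 predictor: ExpressionPredictor,
                 controller: MotorController) -> np.ndarray:
    """One control cycle: anticipate the human expression, then actuate the face."""
    predicted = predictor.predict(frames)
    return controller.commands_for(predicted)

if __name__ == "__main__":
    # Fake camera frames (random landmark positions), for illustration only.
    frames = [np.random.rand(N_LANDMARKS, 2) for _ in range(5)]
    print(control_step(frames, ExpressionPredictor(), MotorController()))
```

Run in a loop, a cycle like this would let the robot's face move as soon as, or slightly before, the person's own expression appears rather than lagging behind it.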
The difficulty lies not so much in producing these facial expressions as in their timing. According to the researchers on the project, Emo can predict an upcoming smile roughly 840 milliseconds before the person begins to form it, which lets it smile at the same moment they do.
Today, Emo’s head contains 26 actuators that enable a wide range of nuanced facial expressions. To support these interactions, the researchers also integrated a high-resolution camera into the pupil of each eye, which allows the robot to be trained for hours on end in front of videos of human facial expressions.
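As a rough illustration of how such anticipation training could be set up, the sketch below builds (input window, future expression) pairs from a video's facial-landmark track, with the target taken roughly 840 milliseconds after the input window so the model learns to anticipate rather than merely copy. The frame rate, window length, and landmark format are assumptions, not details from the article.

```python
# Hypothetical sketch: turning a video's landmark track into anticipation training pairs.

import numpy as np

FPS = 30                          # assumed video frame rate
LEAD_FRAMES = round(0.84 * FPS)   # ~840 ms of anticipation, about 25 frames at 30 fps
WINDOW = 10                       # assumed length of the input window, in frames

def make_training_pairs(landmark_track: np.ndarray):
    """Turn one video's landmark track, shape (T, n_landmarks, 2),
    into (input window, future expression) training pairs."""
    pairs = []
    last_start = len(landmark_track) - WINDOW - LEAD_FRAMES
    for t in range(last_start):
        window = landmark_track[t:t + WINDOW]                    # what the cameras see now
        future = landmark_track[t + WINDOW + LEAD_FRAMES - 1]    # expression ~840 ms ahead
        pairs.append((window, future))
    return pairs

if __name__ == "__main__":
    # Synthetic stand-in for landmarks extracted from one 10-second training clip.
    track = np.random.rand(300, 68, 2)
    pairs = make_training_pairs(track)
    print(f"{len(pairs)} training pairs from one clip")
```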
The potential of Emo and its future variations is vast: a robot ultimately capable of something approaching empathy could be useful in fields as diverse as communication, education, and even therapy.