Technology.am (July 9, 2009) — Researchers at the University of California, San Diego have used machine learning to “empower” their robot to learn to make realistic facial expressions. According to the team, no other research group has used machine learning to teach a robot to make realistic facial expressions.
A hyper-realistic Einstein robot at UC San Diego has learned to smile and make facial expressions through a process of self-guided learning. The Einstein robot head has about 30 facial muscles, each moved by a tiny servo motor connected to the muscle by a string.
The faces of robots are becoming increasingly realistic, and the number of artificial muscles that control them is rising. In light of this trend, UC San Diego researchers are studying the face and head of their robotic Einstein to find ways to automate the process of teaching robots to make lifelike facial expressions.
To begin the learning process, the UC San Diego researchers directed the Einstein robot head to twist and turn its face in all directions, a process called “body babbling.” During this period the robot could see itself in a mirror and analyze its own expression using facial expression detection software created at UC San Diego called CERT (Computer Expression Recognition Toolbox). This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors.
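The babbling-and-mapping idea can be sketched in a few lines of code. The snippet below is a simplified illustration, not the researchers' actual method: it stands in a fixed linear map for the robot's unknown face mechanics, uses random motor commands as the “body babbling” phase, treats the resulting feature scores as a proxy for CERT's output, and fits the motor-to-expression mapping with ordinary least squares. The motor and feature counts, and `observe_expression`, are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the robot's face mechanics plus CERT:
# a fixed linear map from motor commands to expression scores.
N_MOTORS = 30      # the Einstein head has about 30 servo-driven "muscles"
N_FEATURES = 6     # e.g. CERT-style expression scores (smile, brow raise, ...)
TRUE_MAP = rng.normal(size=(N_FEATURES, N_MOTORS))

def observe_expression(motor_commands):
    """Stand-in for watching the mirror and scoring the expression with CERT."""
    return TRUE_MAP @ motor_commands

# "Body babbling": issue random motor commands and record the resulting
# expression scores, building a training set of (motors, expression) pairs.
n_babbles = 500
motors = rng.uniform(-1.0, 1.0, size=(n_babbles, N_MOTORS))
expressions = np.array([observe_expression(m) for m in motors])

# Learn the forward mapping (motors -> expression) by least squares.
solution, *_ = np.linalg.lstsq(motors, expressions, rcond=None)
learned_map = solution.T  # shape (N_FEATURES, N_MOTORS)

# The learned model now predicts the expression for an unseen command.
test_cmd = rng.uniform(-1.0, 1.0, size=N_MOTORS)
pred = learned_map @ test_cmd
print(np.allclose(pred, observe_expression(test_cmd), atol=1e-6))
```

In the real system the mapping is of course nonlinear and noisy, so the team would need a richer learning algorithm than plain linear regression, but the data flow is the same: random exploration, self-observation, supervised fitting.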
Once the robot had learned the relationship between facial expressions and the muscle movements required to make them, it could generate facial expressions it had never encountered.
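Generating a never-seen expression amounts to running the learned model in reverse: given target expression scores, solve for motor commands that should produce them. A minimal sketch of that inversion, again assuming a simplified linear model (the `learned_map` and the target score vector here are hypothetical placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assume a forward model (expression scores as a linear function of the
# 30 motor commands) was already learned during the babbling phase.
N_MOTORS, N_FEATURES = 30, 6
learned_map = rng.normal(size=(N_FEATURES, N_MOTORS))  # hypothetical learned model

def motors_for_expression(target_scores, model):
    """Invert the learned model: the minimum-norm motor commands
    predicted to produce the target expression scores."""
    return np.linalg.pinv(model) @ target_scores

# A novel target expression the robot never produced during babbling.
target = np.array([0.8, -0.2, 0.5, 0.0, 0.3, -0.6])
cmd = motors_for_expression(target, learned_map)

# Replaying the commands through the model reproduces the target scores.
print(np.allclose(learned_map @ cmd, target, atol=1e-6))
```

Because there are more motors (30) than expression features (6), many motor combinations yield the same expression; the pseudoinverse picks the smallest-effort one, which is one plausible way a robot could settle on natural-looking commands for expressions it was never shown.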