(visit the link for the full news article)
A robot has taught itself to smile, frown, and make other human facial expressions using machine learning.
To get the incredibly realistic Einstein robot to make facial expressions, researchers used to have to program each of its 31 artificial muscles individually through trial and error. Now, computer scientists from the Machine Perception Laboratory at the University of California, San Diego, have used machine learning to enable the robot to learn expressions on its own.
A hyper-realistic Einstein robot at the University of California, San Diego learned to smile and make facial expressions through a process of self-guided learning.
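The article doesn't detail the learning algorithm, but the idea it describes (the robot exploring its own 31 muscle activations and keeping whatever moves an expression closer to a target, rather than being hand-programmed) can be sketched as a simple hill-climbing loop. Everything here is hypothetical for illustration: the names, the fixed `TARGET` vector, and the `expression_score` function stand in for the feedback a real system would get from a facial-expression recognizer watching the robot's face.

```python
import random

NUM_MUSCLES = 31  # the Einstein robot has 31 artificial muscles

# Hypothetical target expression: in the real system, the reward would come
# from an expression-recognition model, not a known activation vector.
_target_rng = random.Random(42)
TARGET = [_target_rng.random() for _ in range(NUM_MUSCLES)]

def expression_score(activations):
    """Stand-in reward: closeness to the target expression (higher is better)."""
    return -sum((a - t) ** 2 for a, t in zip(activations, TARGET))

def learn_expression(steps=5000, noise=0.05, seed=0):
    """Self-guided learning by 'motor babbling': randomly perturb the muscle
    activations and keep any change that improves the expression score."""
    rng = random.Random(seed)
    current = [0.5] * NUM_MUSCLES  # start from a neutral face
    best = expression_score(current)
    for _ in range(steps):
        # Propose a small random tweak, clamped to the valid [0, 1] range.
        trial = [min(1.0, max(0.0, a + rng.uniform(-noise, noise)))
                 for a in current]
        score = expression_score(trial)
        if score > best:  # keep improvements only
            current, best = trial, score
    return current, best
```

This is a toy version of trial-and-error learning done by the machine itself instead of by the researchers; the actual UCSD work would have used a far more sophisticated learning method and a camera-based feedback signal.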
The Man-machine Integration Design and Analysis System (MIDAS) is a human performance modeling and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments.