Many years ago, roboticists realised that as you morph an abstract robot into a human, people's comfort doesn't rise smoothly: a robot that looks realistic, but not realistic enough, provokes a spike of unease. This is the "uncanny valley" - some say because such robots remind us of a corpse. However, research has shown that if you manipulate the robotic images so that they are more attractive, you can bypass this feeling of unease.
To create a robot we are more likely to accept, life-like expressions are vital. That's why Nicole Lazzeri at the University of Pisa, Italy, and her colleagues have designed a "Hybrid Engine for Facial Expressions Synthesis" (HEFES) - a facial animation engine that gives realistic expressions to a humanoid robot called FACE.
FACE's appearance is modelled on the wife of one of the team members. "It's really realistic," says Lazzeri, who presented the work at BioRob in Rome last month. See for yourself in the video above.
Nope - despite being able to mimic human emotions on her face, FACE has not carried us across the uncanny valley. Link -via DiscoBlog
This fits our high demand for robotic drag queen / Cher impersonators.
The theremin soundtrack didn't help pull me out of the heebie-jeebies.
Where it falls apart for me is really the eyes -- they shouldn't remain stationary for long, and they should fix on a specific object when the head swivels.
Still, neat.
I wonder, though, how much variation there is among the general public in the width of the uncanny valley.