Most people assume that robots are just machines without feelings, and that it would be impossible for scientists to give a robot emotions.
Well, don’t be so sure about that just yet: scientists at Georgia Tech decided to test our ability to interpret a robot’s emotions. The research group discovered that older adults read a robot’s face in unexpectedly different ways from younger adults.
Jenay Beer, a graduate student in Georgia Tech’s School of Psychology, explained that home-based assistive robots have the potential to help older adults stay independent longer, reducing healthcare needs and providing everyday assistance to the elderly.
The researchers found that older adults were less accurate in recognizing anger, fear, and happiness on the robot’s face. Furthermore, older adults had more trouble recognizing a happy robot than they did recognizing happy people.
Another interesting finding from the experiment was that neither young nor old participants could easily identify disgust on the virtual iCat, possibly because disgust is a difficult emotion to program a robot to display. [sciencedaily.com]