April 29th, 2015
Nae-Eung Lee and colleagues note that one way to make interactions between people and robots more intuitive would be to endow machines with the ability to read their users' emotions and respond with a computer version of empathy. Most current efforts toward this goal analyze a person's feelings using visual sensors that can tell a smile from a frown, for example. But these systems are expensive and complex, and they miss subtle eye movements, which are important in human expression. Lee's team wanted to make simple, low-cost sensors that detect facial movements, including slight changes in gaze.
The researchers created a stretchable, transparent sensor by layering a carbon nanotube film on two different kinds of electrically conductive elastomers. They found that the sensor could distinguish whether subjects were laughing or crying and tell where they were looking. In addition to applications in robotics, the sensors could be used to monitor heartbeats, breathing, dysphagia (difficulty swallowing) and other health-related cues.
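To make the idea of reading expressions from a skin-mounted sensor concrete, here is a minimal, purely illustrative sketch of how signals from such a device might be mapped to expressions. The article does not describe the team's actual signal processing; the sensor sites, the normalized-resistance readout, the thresholds, and the classification rule below are all hypothetical assumptions, not the researchers' method.

```python
# Hypothetical sketch: mapping strain-sensor readings to facial expressions.
# Sensor placement, thresholds, and the decision rule are illustrative only.
import numpy as np

def relative_resistance_change(signal, baseline):
    """Normalized readout (R - R0) / R0, a common way to report
    how much a resistive strain sensor has been stretched."""
    return (signal - baseline) / baseline

def classify_expression(forehead, mouth, rest_resistance, threshold=0.05):
    """Toy rule (assumed, not from the paper): large strain at a
    mouth-area site suggests laughing; large strain at a forehead-area
    site with little mouth movement suggests crying/frowning."""
    d_mouth = np.max(np.abs(relative_resistance_change(mouth, rest_resistance)))
    d_forehead = np.max(np.abs(relative_resistance_change(forehead, rest_resistance)))
    if d_mouth > threshold:
        return "laughing"
    if d_forehead > threshold:
        return "crying"
    return "neutral"

# Simulated resistance traces (ohms) from two hypothetical sensor sites.
rest = 100.0
mouth_trace = np.array([100.0, 104.0, 108.0, 107.0])     # strong stretch near the mouth
forehead_trace = np.array([100.0, 100.5, 101.0, 100.8])  # little forehead movement
print(classify_expression(forehead_trace, mouth_trace, rest))  # -> "laughing"
```

In practice a system like this would likely use several sensor sites and a trained classifier rather than fixed thresholds, but the sketch shows the basic pipeline: baseline-normalize each channel, then decide which facial movement the pattern of strain changes most resembles.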
Image: From joy to sadness, facial expressions could soon be decipherable to robots.
Image credit: mtr/iStock/Thinkstock