Robots Will Soon Communicate With People Using Nonverbal Cues
Researchers at the University of British Columbia are working to program robots to interact and communicate with people using human-like body language and cues. They say that this is an important step in welcoming robots into our homes.
For their study, the researchers used a robot named Charlie to examine a simple task: handing an object to a person. Past research has found that people have difficulty taking objects from robots because, in the absence of nonverbal cues, they are given no clear signal of when and how to reach out and take the object.
“We hand things to other people multiple times a day and we do it seamlessly,” says AJung Moon, a PhD student in the Department of Mechanical Engineering. “Getting this to work between a robot and a person is really important if we want robots to be helpful in fetching us things in our homes or at work.”
In the study, Moon and her colleagues observed what people do with their heads, necks, and eyes as they hand objects to one another.
The study then tested just over 100 subjects across three variations of the same handover task as they interacted with Charlie.
They found that programming the robot to use eye gaze as a nonverbal cue made the handover much more fluid. People reached out and took the object sooner when the robot moved its head to look at the location where it intended to hand over the object, or looked at the handover location and then up at the person to establish eye contact.
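The gaze behaviors described above can be thought of as a simple sequence of gaze targets chosen per experimental condition. The sketch below is purely illustrative; the condition names, function, and target labels are hypothetical and are not taken from the study's implementation.

```python
# Hypothetical sketch: each handover condition maps to an ordered
# sequence of gaze targets the robot would cue before releasing
# the object. Names are illustrative, not from the study.

def gaze_cue_sequence(condition):
    """Return the ordered gaze targets for a handover condition.

    condition -- one of:
      "no_gaze"          : robot never shifts its gaze (baseline)
      "shared_attention" : robot looks at the handover location
      "turn_taking"      : robot looks at the handover location,
                           then up at the person's face
    """
    if condition == "no_gaze":
        return []
    if condition == "shared_attention":
        return ["handover_location"]
    if condition == "turn_taking":
        return ["handover_location", "person_face"]
    raise ValueError(f"unknown condition: {condition!r}")
```

In this framing, the finding is that conditions whose sequences include the handover location prompt people to reach out sooner than the baseline with no gaze shift.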
“We want the robot to communicate using the cues that people already recognize,” says Moon. “This is key to interacting with a robot in a safe and friendly manner.”
The study shows just how important nonverbal communication is for smooth interaction, especially in a potentially strained relationship such as the one between a human-like machine and an actual human.
Moon, A. Jung; Daniel M. Troniak; Brian Gleeson; Matthew K.X.J. Pan; Minhua Zheng; Benjamin A. Blumer; Karon MacLean; and Elizabeth A. Croft. 2014. Meet Me Where I’m Gazing: How Shared Attention Gaze Affects Human-Robot Handover Timing. In Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction (HRI ’14). ACM, New York, NY, USA, 334-341. DOI=10.1145/2559636.2559656 http://doi.acm.org/10.1145/2559636.2559656