Enhancing Human Understanding of a Mobile Robot’s State and Actions using Expressive Lights
August 2016 • Conference Paper
Kim Baraka (Carnegie Mellon University), Stephanie Rosenthal, Manuela Veloso (Carnegie Mellon University)
This conference paper appears in the 25th IEEE International Symposium on Robot and Human Interactive Communication (August 2016).
In order to be successfully integrated into human-populated environments, mobile robots need to express relevant information about their state to the outside world. In particular, animated lights are a promising way to express hidden robot state information such that it is visible at a distance. In this work, we present an online study to evaluate the effect of robot communication through expressive lights on people's understanding of the robot's state and actions. In our study, we use the CoBot mobile service robot with our light interface, designed to express relevant robot information to humans. We evaluate three designed light animations, each in three corresponding scenarios, for a total of nine scenarios. Our results suggest that expressive lights can play a significant role in helping people accurately hypothesize about a mobile robot's state and actions from afar when minimal contextual clues are present. We conclude that lights could generally be used as an effective non-verbal communication modality for mobile robots in the absence of, or as a complement to, other modalities.