Investigated whether having robots automatically explain their behavior in natural language improves users' trust
Developed a framework to support scalable production and customization of 3D models
Why did the robot do that?
Building Tools to Support Data Sampling and Visualization
In this work, we investigate spatial features and perspectives in human spatial references and compare word usage when instructing robots vs. instructing other humans.
In this work, we present an online study to evaluate the effect of robot communication through expressive lights on people's understanding of the robot's state and actions.
With a growing number of robots performing autonomously without human intervention, it is difficult to understand what the robots experience along their routes during execution without looking at execution logs. Rather than looking through logs, our goal is to enable robots to convey their route experiences to people directly.
In this paper, we propose the concept of coordination between CoBot and the Parrot ARDrone 2.0 to perform service-based object search tasks, in which CoBot localizes and navigates to the general search areas carrying the ARDrone, and the ARDrone searches locally within those areas.
In this work, we address the generation of narrations of autonomous mobile robot navigation experiences.