Verbal and non-verbal communication in human-aware navigation
Mobile robots are used in a variety of applications inside and outside of constrained areas. When these robots move into the real world, they have to navigate around people in a manner that is not only safe from a technical point of view but also makes the person feel safe. This feeling of safety is paramount for the comfort a person experiences when interacting with the robot.
Many approaches to human-aware navigation have been proposed in the past (see Kruse et al. 2013 for a survey), but the vast majority of projects rely on the simple approach of using proxemics (Hall 1969). This principle is translated into simple costmaps (Lu et al. 2014) and then used for navigation, in the hope that staying outside of someone’s personal space will make the person feel safe and comfortable.
Thinking about everyday encounters with people inside a building, simply staying away from them is not always an option. Both agents might have to negotiate the route they are going to take when passing each other in a narrow corridor. We all know the effect of this negotiation failing, as we have all done “the dance” of moving to either side in unison with the other person while trying to avoid each other. Since human-aware navigation normally only considers the movement of the robot, many researchers have tried to use motion as a cue for this negotiation (see Lichtenthäler & Kirsch 2016 for examples of legible movement) or to indicate the direction of movement via gaze or other non-verbal means (May et al. 2015). This project will investigate the use of verbal communication to aid the negotiation and to increase the comfort and feeling of safety of the interaction partner(s). The final outcome of this project should be a novel human-aware navigation system that, on the one hand, is able to verbally communicate its intention of how to avoid someone and, on the other, is able to engage in a dialogue so as to take the input of the human interaction partner into account when planning its path. Hence, the path planning system needs to be explainable and adaptive to verbal commands, and it needs to be combined with a Natural Language Processing system.
 Kruse, T., Pandey, A. K., Alami, R. & Kirsch, A. (2013), ‘Human-aware robot navigation: A survey’, Robotics and Autonomous Systems 61(12), 1726–1743.
 Hall, E. T. (1969), The hidden dimension, Anchor Books New York.
 Lu, D. V., Hershberger, D. & Smart, W. D. (2014), Layered costmaps for context-sensitive navigation, in ‘IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), 2014’, IEEE, pp. 709–715.
 Lichtenthäler, C. & Kirsch, A. (2016), ‘Legibility of robot behavior: A literature review’.
 May, A. D., Dondrup, C. & Hanheide, M. (2015), Show me your moves! Conveying navigation intention of a mobile robot to humans, in ‘2015 European Conference on Mobile Robots (ECMR)’, IEEE, Lincoln, pp. 1–6.