A social gaze controller for naturalistic human-robot interaction

To learn, from human data, a gaze controller capable of performing social gaze behaviours (mutual gaze and joint attention), and to evaluate it in face-to-face interaction with humanoid robots.
Description of the Project: 

The purpose of this project is to investigate the role of social gaze behaviours, such as mutual gaze and joint attention, in face-to-face human-robot interaction. The project will take a data-driven approach to controller design: human-human gaze interaction data will be collected (as in [1]) and machine learning techniques used to learn controllers for HRI from this data. The focus will be on humanoid robots with articulated eyes, which are capable of more subtle and realistic gaze behaviour than robots with simpler facial designs. These controllers will be evaluated through interaction with human users.
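As a minimal illustration of the data-driven approach described above, the sketch below learns a first-order Markov transition model over discrete gaze states from annotated human-human sequences and samples a robot's next gaze state from it. All state names, data, and function names are hypothetical placeholders, not part of the project specification; a real controller would be learned from dual eye-tracking annotations and would likely condition on richer context (e.g. conversational turn-taking).

```python
import random
from collections import Counter, defaultdict

# Hypothetical discrete gaze states for one conversational partner,
# standing in for labels derived from dual eye-tracking annotation.
STATES = ["mutual_gaze", "gaze_at_partner", "gaze_at_object", "gaze_away"]

def learn_transition_model(sequences):
    """Estimate first-order Markov transition probabilities
    from annotated human gaze-state sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, curr in zip(seq, seq[1:]):
            counts[prev][curr] += 1
    model = {}
    for state, nxt in counts.items():
        total = sum(nxt.values())
        model[state] = {s: c / total for s, c in nxt.items()}
    return model

def next_gaze_state(model, state, rng=random):
    """Sample the robot's next gaze state from the learned model."""
    options = model.get(state)
    if not options:
        return state  # no data observed for this state: hold current gaze
    states, probs = zip(*options.items())
    return rng.choices(states, weights=probs, k=1)[0]

# Toy annotated sequences standing in for real human-human data.
demo_data = [
    ["gaze_away", "gaze_at_partner", "mutual_gaze", "gaze_away"],
    ["gaze_at_partner", "mutual_gaze", "mutual_gaze", "gaze_at_object"],
]
model = learn_transition_model(demo_data)
```

A controller of this shape can run in a simple loop on the robot, re-sampling the gaze state at fixed intervals or at turn boundaries; adapting to individual differences could then amount to re-estimating the transition table per user.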

Socially appropriate gaze behaviour is necessary for safe and effective interaction with humanoid robots. Joint attention and mutual gaze cues reduce ambiguity during face-to-face interaction, which is important for human-robot collaboration, especially when working in complex environments. In addition, appropriate gaze behaviour can improve the transparency of an agent's actions and may improve trust in HRI systems.

The major research questions addressed by this project include:

- How do individual differences influence gaze behaviour during interaction [1], and how can a controller adapt to these differences [2]?

- How does the performance of social gaze behaviours by a robot influence people's impressions of it as a social actor [3]?

Other topics for investigation include: the importance of social gaze in supporting long-term or repeated interactions with robots, the impact of physical versus virtual embodiment on people’s responses to an agent capable of social gaze behaviours, and the degree to which gaze control for face-to-face HRI can or cannot be decoupled from the semantic content of speech. 

Resources required: 
iCub robot; possibly other robots such as the EMYS head.
Project number: 
200004
First Supervisor: 
University: 
Heriot-Watt University
First supervisor university: 
Heriot-Watt University
Essential skills and knowledge: 
A degree in Computer Science, an MEng in Robotics, or equivalent.
Desirable skills and knowledge: 
Knowledge of AI and machine learning; experience conducting human-subject experiments.
References: 

1. Mutual gaze, personality, and familiarity: Dual eye-tracking during conversation. F. Broz, H. Lehmann, C. L. Nehaniv, K. Dautenhahn. IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2012.

2. A gaze controller for coordinating mutual gaze during conversational turn-taking in human-robot interaction. F. Broz, H. Lehmann. Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI) Extended Abstracts, 2015.

3. Naturalistic Conversational Gaze Control for Humanoid Robots – A First Step. H. Lehmann, I. Keller, R. Ahmadzadeh, F. Broz. International Conference on Social Robotics (ICSR), 2017.