With this project we aim to survey the current knowledge on human emotion and to develop the first stages of a computational model of emotion, one based on innate processes that maintain a few key variables within acceptable limits using perceptual and proprioceptive information.
(MSc) Towards a Mutual Gaze Controller for iCub: Markov Modelling of Human Gaze. Supervised by Frank Broz
Why is gaze so important? Why do we want robots to have more human-like gaze? And how can we achieve this?
Psychology has shown that gaze plays an important part in human social interaction, an importance that extends to all primates. When robots take on human shape and roles, the expectation that they will follow social conventions increases. We present an overview of the literature on efforts to make the behaviour of robots, especially their gaze behaviour, more human-like during social interactions, mostly one-on-one.
This project uses data previously gathered from face-to-face conversations (Broz et al., 2012) and the probabilistic tool of Markov chains to design a gaze controller that we hope exhibits more human-like gaze behaviour. We lay out the methods and techniques we use to go from data to controller design and a possible future implementation on the iCub robot: these range from clustering the gaze data and labelling the clusters, to using state-transition counting to obtain the controller parameters, to how we create time histograms from the data.
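The state-transition counting step can be sketched as follows. This is a minimal illustration, not the project's actual code: the state labels and the toy sequence are invented for the example, and it assumes the clustered gaze data has already been reduced to one label per time step.

```python
# Hypothetical sketch: estimating Markov-chain transition probabilities
# for a gaze controller by counting state transitions in a labelled
# gaze sequence. State names ("eyes", "mouth", "away") are illustrative.
from collections import Counter, defaultdict

def transition_matrix(sequence):
    """Return {state: {next_state: probability}} estimated by counting
    consecutive pairs in a sequence of state labels."""
    counts = defaultdict(Counter)
    for current, nxt in zip(sequence, sequence[1:]):
        counts[current][nxt] += 1
    return {
        state: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for state, nxts in counts.items()
    }

# Toy gaze-label sequence (one label per video frame).
labels = ["eyes", "eyes", "mouth", "eyes", "away", "eyes", "eyes"]
P = transition_matrix(labels)
# From "eyes" the chain stays on "eyes" with probability 0.5 here,
# and moves to "mouth" or "away" with probability 0.25 each.
```

Sampling from such a matrix at each control step would then drive the robot's gaze target.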
Our results lie mainly in furthering the understanding of the data and the insights drawn from it. We found there is plenty of variability between people. We found there are quite variable and perceptible offsets in the gaze data relative to the image-extracted features of the original data. We confirmed our expectation that, when looking at the partner's face, the focus is mostly the eyes and mouth. We found that, against some of our initial expectations, the time histograms are very close to the exponential drop-off expected of a Markov process. We found support for the hypothesis that people look more at the mouth when the partner is talking, and that this extra attention comes mostly from the time otherwise spent looking away from the face. Based on the analysis of the controller transition counts, we put forward the hypothesis that the mouth may be used to predict when a person needs to yield their speaking turn.
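The exponential drop-off mentioned above is what a first-order Markov model predicts: in discrete time, the dwell time in a state is geometrically distributed (the discrete analogue of an exponential). A small simulation, with an assumed self-transition probability chosen purely for illustration, shows the expected shape:

```python
# Sketch: dwell times in a Markov chain state with self-transition
# probability p_stay are geometric, with mean 1 / (1 - p_stay).
# The value p_stay = 0.8 is illustrative, not taken from the data.
import random

def simulate_dwells(p_stay, n_visits, seed=0):
    """Sample dwell lengths (in time steps) for one Markov state."""
    rng = random.Random(seed)
    dwells = []
    for _ in range(n_visits):
        k = 1                          # at least one step in the state
        while rng.random() < p_stay:   # stay with probability p_stay
            k += 1
        dwells.append(k)
    return dwells

dwells = simulate_dwells(p_stay=0.8, n_visits=10000)
mean_dwell = sum(dwells) / len(dwells)   # theory predicts 1/(1-0.8) = 5
```

A histogram of `dwells` drops off geometrically, which is the pattern the time histograms in the data were compared against.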