Explainable AI and Autonomy for the Maritime Domain

Investigate techniques that enable human operators to collaborate and team effectively with autonomous systems that have varying levels of autonomy and communication in the maritime domain.
Description of the Project: 

**Note: Project availability subject to collaboration agreement being signed**

SeeByte is sponsoring a PhD in the area of Human-Machine Interaction for the safe, innovative and dynamic use of marine autonomous systems. These systems vary in autonomy and communication ability, ranging from tethered, remotely operated vehicles (ROVs) with low autonomy, to fully autonomous systems that can operate deep underwater but with limited communications, as well as autonomous surface and air vehicles with continuous communications.

This variability is a challenge for the human operators’ situation awareness, trust and mental models of the various vehicles, and for deciding if and when the human is required to take over control or abort the mission. There is an increasing need for unmanned and manned systems to co-operate and team effectively, requiring fluid, adaptive and continuous interactions between operators and robots. This is a significant departure from the current state of the art in Human-Machine Interaction in the maritime robotics domain, where the operator typically pre-plans a fixed mission and is then consigned to the role of an observer.

Explainability of autonomy and AI is a key factor in maintaining trust and facilitating adoption. This PhD will focus on natural language explanations with respect to SeeByte’s autonomy framework, which is deployed on maritime robots worldwide and was the first commercial system to demonstrate multi-vehicle collaboration in the maritime domain.

Research questions include:

  1. What information should the vehicle convey, and when, depending on the varying communication links?
  2. How can the systems explain their behaviour, and what they can and cannot do, in easy-to-understand natural language to facilitate operations and training?
  3. How can we adapt explanations to the current context and the user’s mental model?
  4. How should an operator interact effectively with multiple assets, and what information is required before an operator can “delegate authority” to an autonomous system?


Resources required: 
Free software licences including SeeTrack4 (Command and Control software for unmanned systems), Neptune (Autonomy Framework) and access to Subject Matter Experts. SeeByte will provide a work space in the office and access to data from international end-users of SeeByte’s autonomy software.
Project number: 
600002
First Supervisor: 
University: 
Heriot-Watt University
First supervisor university: 
Heriot-Watt University
Sponsor: 
Essential skills and knowledge: 
Good software programming skills. An interest in robotics, specifically autonomy, AI, and how humans interact with these systems.
Desirable skills and knowledge: 
ROS, NLP/interaction, machine learning, knowledge of C++.
References: 
  • Francisco J. Chiyah Garcia, David A. Robb, Atanas Ivaylov Laskov, Xingkun Liu, Pedro Patron, and Helen Hastie (2018). Explainable Autonomy: A Study of Explanation Styles for Building Clear Mental Models through a Multimodal Interface. In Proceedings of the 11th International Conference on Natural Language Generation (INLG), Tilburg, The Netherlands.
  • D. A. Robb, J. S. Willners, N. Valeyrie, F. J. C. Garcia, A. Laskov, X. Liu, P. Patron, H. Hastie, and Y. Petillot (2018). A Natural Language Interface with Relayed Acoustic Communications for Improved Command and Control of AUVs. In Proceedings of the 2018 IEEE/OES Autonomous Underwater Vehicles Symposium.
  • David A. Robb, Francisco J. Chiyah Garcia, Atanas Laskov, Xingkun Liu, Pedro Patron, and Helen Hastie (2018). Keep Me in the Loop: Increasing Operator Situation Awareness through a Conversational Multimodal Interface. In Proceedings of the 2018 International Conference on Multimodal Interaction (ICMI ’18), October 16–20, 2018, Boulder, CO, USA.
Industry placement details: 
SeeByte has offices in the centre of Edinburgh and will provide work space, software and expertise in maritime robotics to the successful applicant, along with access to relevant data sets. SeeByte also has a sister company based in San Diego, California, which supports a major user of SeeByte’s autonomy systems in the U.S.; potential short-term trips to San Diego may be considered during the PhD.

Operating from offices in Edinburgh, Scotland and San Diego, California, SeeByte has achieved a position of leadership in the development of smart software for underwater vehicles, sensors and systems in both the military and oil & gas sectors. SeeByte provides products and services to major government and commercial clients around the world. The Edinburgh-based engineering team delivers world-leading applied research and software products in the maritime robotics domain.

Dr. Pierre Yves Mignotte would be the industry supervisor. He received a five-year engineering degree in Physics from the Ecole Nationale Supérieure de Physique et de Chimie de Paris in 2003 and a PhD in subsea image analysis from Heriot-Watt University in 2006, when he joined SeeByte’s Research and Development group. He has developed novel target detection and tracking algorithms for video and sonar imagery, and has been involved in major R&D projects developing autonomy software for underwater vehicles. Most recently, he has been SeeByte’s project manager responsible for developing, delivering and testing collaborative autonomy capabilities for customers worldwide. Dr. Mignotte has been at the forefront of the development of novel technologies for unmanned systems in the subsea domain for the last 10 years.