ECR Staff to lead Research Nodes as part of the UKRI Trustworthy Autonomous Systems Programme
UK Research and Innovation (UKRI) has launched six new research projects, or “nodes”, aimed at tackling challenges to the development of autonomous systems. These are part of the Trustworthy Autonomous Systems (TAS) programme, which will undertake fundamental, creative and multidisciplinary research in areas key to ensuring autonomous systems can be built in a way society can trust and use.
The Edinburgh Centre for Robotics has been awarded funding for two nodes:
Professor Subramanian Ramamoorthy at the School of Informatics, University of Edinburgh will be leading the UKRI Trustworthy Autonomous Systems (TAS) Node in Governance & Regulation.
This research node aims to create new and improved methods for governing autonomous systems that reflect emerging use cases such as automated diagnostics, socially assistive care systems and autopilot features in aircraft.
The project will establish a new software engineering framework to support TAS governance and trial it with external stakeholders in areas including mobile autonomous systems, and health and social care. Newly developed computational tools for regulators and developers will complement the new methods of governance. In particular, this will include a deeper understanding, from multiple disciplinary perspectives, of how and why autonomous systems fail. The team also aims to improve understanding of the iterative nature of design processes associated with such technologies, and to recommend ways to better govern such processes.
Professor Helen Hastie from the School of Mathematical and Computer Sciences at Heriot-Watt University is leading the UKRI Trustworthy Autonomous Systems Node in Trust.
This project will explore solutions to manage trust in autonomous systems, covering scenarios that require interaction with humans. Examples include self-driving cars, autonomous wheelchairs and ‘cobots’ in the workplace. The group’s work will help design the autonomous systems of the future, ensuring they are widely used and accepted in a variety of industry-relevant applications.
Professor Hastie explains: “The challenge of managing trust between the human and the system is particularly difficult because there can be a lack of mutual understanding of the task and the environment. The new consortium will perform foundational research on how humans, robots and autonomous systems can work together by building a shared reality through human-robot interaction.
“By adopting a multidisciplinary approach, grounded in psychology and cognitive science, systems will learn situations where trust is typically lost unnecessarily, adapting this prediction for specific people and contexts. We will explore how to best establish, maintain and repair trust by incorporating the subjective view of humans towards autonomous systems, with the goal being to increase adoption and maximise their positive societal and economic benefits.
“Trust will be managed through transparent interaction, increasing the confidence of those using autonomous systems, allowing them to be adopted in scenarios never before thought possible. This might include jobs that currently endanger humans, such as pandemic-related tasks or those in hazardous environments.”