Autonomous Agents Modelling Other Agents

Develop and evaluate algorithms which enable autonomous agents to model the behaviours, beliefs, goals, etc. of other agents
Description of the Project: 

The design of autonomous agents that can complete tasks in complex, dynamic environments is a core research area in modern artificial intelligence. A crucial requirement for such agents is the ability to interact competently with other agents (including humans) whose behaviours, beliefs, plans, and goals may be unknown. Interacting with such agents requires the ability to reason about their unknown behaviours based on their observed actions, the context in which these actions took place, and other available information. While much research has been devoted to the development of such reasoning methods, many open questions remain. A recent survey by Albrecht and Stone [1] provides a comprehensive overview of existing methodologies and concludes with a section on open problems. I am interested in supervising projects in this general area, especially projects addressing open problems from the survey.
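To give a flavour of this style of reasoning, the sketch below shows a minimal Bayesian "type-based" approach, one of the method families discussed in the survey [1]: maintain a belief over a set of hypothesised agent types and update it from observed actions. The specific types, action probabilities, and function names are illustrative assumptions, not part of any particular method.

```python
import numpy as np

# Illustrative only: each candidate "type" assigns a fixed probability
# distribution over 3 possible actions. Real methods would condition on
# the observed state/context as well.
TYPES = {
    "cautious":   np.array([0.7, 0.2, 0.1]),
    "aggressive": np.array([0.1, 0.2, 0.7]),
    "random":     np.array([1/3, 1/3, 1/3]),
}

def posterior_over_types(observed_actions, prior=None):
    """Update a belief over agent types from a sequence of observed actions."""
    names = list(TYPES)
    belief = (np.full(len(names), 1.0 / len(names))
              if prior is None else np.asarray(prior, dtype=float))
    for a in observed_actions:
        likelihood = np.array([TYPES[t][a] for t in names])
        belief = belief * likelihood   # Bayes' rule: posterior ∝ prior × likelihood
        belief /= belief.sum()         # renormalise
    return dict(zip(names, belief))

# After repeatedly observing action 2, the belief concentrates on the
# type most likely to produce it ("aggressive" under these toy numbers).
belief = posterior_over_types([2, 2, 2])
```

In practice the hypothesis space, the handling of context, and the decision-making layer built on top of the belief are exactly where the open problems lie.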

Project goals are flexible and may include the development of novel algorithms and the analysis and comparison of existing methods in various settings. Potential evaluation domains include autonomous vehicles in dense traffic scenarios, competitive games (e.g. StarCraft II, Doom), and intelligent tutoring systems.

Resources required: 
High-throughput computing for simulations (provided through the ECDF Eddie system).
Project number: 
First Supervisor: 
First supervisor university: 
University of Edinburgh
Essential skills and knowledge: 
Strong programming skills; strong grasp of probability, statistics, calculus, etc.; ability to work independently
Desirable skills and knowledge: 
Knowledge of multi-agent systems and agent modelling

[1] Stefano Albrecht and Peter Stone (2018). Autonomous agents modelling other agents: A comprehensive survey and open problems. Artificial Intelligence 258:66-95.