Federated Edge Computing for Resource-Constrained Robots

Develop an edge computing framework that distributes storage, compute, and networking resources among the cloud, gateways, and robots at the edge in a federated manner.
Description of the Project: 

Background: Today’s robotic systems increasingly employ computationally expensive models such as deep neural networks (DNNs) for tasks like localization, perception, planning, and navigation. However, resource-constrained robots, such as drones and insect-scale robots, often lack the on-board compute resources or power budget to run the most accurate, state-of-the-art DNN models. To address this computation bottleneck, cloud and fog robotics have recently emerged as a computation paradigm that utilizes both on-robot and remote resources. With the advent of cloud robotics platforms from Google and Amazon, and the incoming era of 5G/6G, it is an exciting time to explore this paradigm for resource-constrained but network-enabled robots.

 

Challenges: Despite this promise, cloud and fog robotics come with a series of challenges, ranging from network architecture design to the development of distributed inference and training algorithms. For example, offloading high-definition video or high-data-rate sensor measurements such as LiDAR scans can severely congest wireless networks, add latency, and place a large burden on cloud compute resources. The optimal distributed inference policy between the robots and the remote resources can be too complicated to implement, especially when a fleet of robots is present. Finally, it remains largely unknown how to effectively realize federated learning, i.e. distributed model training, with heterogeneous robots at the edge.
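The offloading trade-off described above can be illustrated with a minimal sketch: for each sensor frame, a robot decides whether to run a small on-board model or send the frame to the cloud, based on a simple latency estimate. The function name, the latency model, and all parameters below are illustrative assumptions, not part of the project; learned offloading policies, as in [3], would additionally weigh accuracy gains and congestion.

```python
def should_offload(frame_bytes, bandwidth_bps, cloud_rtt_s,
                   onboard_latency_s, deadline_s):
    """Decide whether to offload one frame to the cloud.

    Uses a deliberately simple latency model: transmission time
    plus round-trip time. Offload only when the cloud path both
    meets the task deadline and beats on-board inference.
    """
    transmit_s = 8 * frame_bytes / bandwidth_bps  # serialization delay
    cloud_latency_s = transmit_s + cloud_rtt_s
    return cloud_latency_s <= deadline_s and cloud_latency_s < onboard_latency_s
```

On an uncongested link the cloud path wins; once bandwidth is throttled, the same frame fails the deadline check and the robot falls back to its on-board model, which is the kind of failsafe behavior Objective 4 below targets.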

 

Goals: At a high level, this project aims to explore an edge computing framework that distributes storage, compute, and networking resources between remote resources and robots at the edge in a federated manner. Focusing on application scenarios in smart manufacturing and smart farming, the specific objectives of this project are as follows:

  1. Profile and determine an effective hierarchical network architecture across the cloud, gateway, and edge robotics layers in the context of 5G/6G.
  2. Design distributed inference algorithms for fog robotics that adapt to the different DNN models in operation for localization, perception, planning, and navigation.
  3. Design distributed model training (federated learning) methods for fog robotics that scale to multi-agent connected environments.
  4. Design failsafe strategies on the robot side in response to potential network outages caused by bandwidth throttling or malicious attacks.
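The federated training objective above can be sketched as one round of federated averaging (FedAvg): each robot trains locally on its own data, and the gateway aggregates the resulting weights, weighted by each robot's sample count. The pure-Python sketch below is illustrative only; a real system would aggregate TensorFlow model weights and handle stragglers and heterogeneous robots.

```python
def fedavg(local_weights, sample_counts):
    """Aggregate one federated averaging round: the global weights
    are the sample-count-weighted mean of the robots' locally
    trained weight vectors.

    local_weights: list of per-robot weight vectors (lists of floats)
    sample_counts: number of local training samples per robot
    """
    total = sum(sample_counts)
    dim = len(local_weights[0])
    global_w = [0.0] * dim
    for w, n in zip(local_weights, sample_counts):
        for i in range(dim):
            global_w[i] += (n / total) * w[i]
    return global_w
```

Weighting by sample count means a robot that saw more data pulls the global model further toward its local update, which is the standard FedAvg design choice; scaling this to many connected agents with uneven data and connectivity is exactly what Objective 3 investigates.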

 

The usefulness of the proposed system will be evaluated in terms of the real-time performance of a variety of DNN models for tasks like localization, perception, planning, and navigation. Evaluations will consist of several rounds of field tests.

Resources required: 
An Amazon or Google cloud robotics platform, and a GPU workstation to simulate the gateway layer. These are already available in the Robotarium and the supervisor’s lab.
Project number: 
340001
First Supervisor: 
University: 
University of Edinburgh
First supervisor university: 
University of Edinburgh
Essential skills and knowledge: 
Skills: Linux, C, C++, Python, TensorFlow.
Knowledge: Computer Systems, Computer Networks, Deep Neural Networks.
Desirable skills and knowledge: 
Strong background in discrete and convex optimization. Research experience in resource allocation or similar projects. Skilled in ROS development.
References: 

[1] Song, Dezhen, et al. "Networked-, cloud- and fog-robotics." Springer (2019).

[2] Kehoe, Ben, et al. "A survey of research on cloud robotics and automation." IEEE Transactions on Automation Science and Engineering 12.2 (2015): 398-409.

[3] Chinchali, Sandeep, et al. "Network offloading policies for cloud robotics: a learning-based approach." Robotics: Science and Systems (RSS) (2019).

[4] Tian, Nan, et al. "A fog robotic system for dynamic visual servoing." 2019 International Conference on Robotics and Automation (ICRA). IEEE, 2019.