Insect-inspired reaching for a flying robot

To build a robot model of the proboscis-positioning behaviour of hawkmoths.
Description of the Project: 

Visually guided reaching behaviour is important for robotics, and has been studied in humans, fixed-base robot arms and humanoid robots. As yet, autonomous flying vehicles are rarely equipped with appendages for reaching out to interact with objects in the world, and how reaching behaviour can be controlled from a robot in flight is a new field of study [1]. This project takes inspiration from the hawkmoth, which can hover in front of a flower and use visual information to make precisely targeted movements that allow its long proboscis to contact the nectar [2]. Under natural conditions, as the flower and the moth encounter wind disturbances, the moth can adjust its body and proboscis to maintain stable contact for the time needed to feed from the flower. The relatively small number of neurons in the moth brain suggests that moths may use an efficient and effective control algorithm that could be translated to a robot task [3].

In collaboration with biologists studying the moth, this project will characterise the behaviour, devise neural algorithms for its control, and test these on a flying robot platform, which should be able to pinpoint and remain in contact with a small target under disturbances.
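As a purely illustrative starting point, and not a controller specified by the project, the short Python sketch below closes a proportional-derivative (PD) loop on the pixel offset of a tracked target, in the spirit of image-based visual servoing. The gains, the 50 Hz control rate, the focal length, the detector output and the simulated wind gusts are all assumptions made for this example:

    import numpy as np

    # Hypothetical values for illustration only; real gains and rates would
    # be tuned on the UAV platform.
    KP, KD = 1.2, 0.4   # proportional and derivative gains
    DT = 0.02           # control period in seconds (assumed 50 Hz loop)

    def servo_step(offset_px, prev_err, focal_px=400.0):
        """One PD visual-servoing step on a pixel offset.

        offset_px : (u, v) offset of the target from the image centre,
                    as a hypothetical onboard detector might report it.
        prev_err  : normalised error from the previous step.
        Returns a velocity setpoint and the current normalised error.
        """
        err = np.asarray(offset_px, dtype=float) / focal_px  # pixels to angles
        d_err = (err - prev_err) / DT
        vel_cmd = KP * err + KD * d_err                      # PD control law
        return vel_cmd, err

    # Toy closed loop: the vehicle drifts under simulated wind gusts while
    # the controller pulls the target back towards the image centre.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        offset = np.array([60.0, -20.0])  # initial pixel offset of the target
        err_prev = np.zeros(2)
        for _ in range(200):
            gust = rng.normal(0.0, 1.5, size=2)   # wind disturbance (pixels)
            vel, err_prev = servo_step(offset, err_prev)
            offset += gust - vel * 400.0 * DT     # vehicle motion re-centres target
        print("final pixel offset:", offset.round(2))

On the real platform the pixel offset would come from an onboard camera and detector, and the velocity setpoint would be passed to the flight controller (for instance via ROS); the research question is how a moth-inspired neural algorithm might replace or outperform such a baseline.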

Resources required: 
UAV platforms
Project number: 
100020
First Supervisor: 
University: 
University of Edinburgh
Second Supervisor(s): 
First supervisor university: 
University of Edinburgh
Essential skills and knowledge: 
Good programming skills, experience with robot hardware development.
Desirable skills and knowledge: 
Familiarity with ROS, interest in biology.
References: 

[1] Meng, Xiangdong, Yuqing He, and Jianda Han. "Survey on aerial manipulator: System, modeling, and control." Robotica 38.7 (2020): 1288-1317. 

[2] Stöckl, Anna Lisa, and Almut Kelber. "Fuelling on the wing: Sensory ecology of hawkmoth foraging." Journal of Comparative Physiology A 205.3 (2019): 399-413. 

[3] Roth, Eatai, et al. "Integration of parallel mechanosensory and visual pathways resolved through sensory conflict." Proceedings of the National Academy of Sciences 113.45 (2016): 12832-12837.