Learning to manipulate unknown objects
There are numerous spectacular examples of biological organisms performing manipulation tasks under quite severe constraints. For instance, birds build nests using bits and pieces of objects they may know little about. Likewise, insects move food around with only very partial, ego-centric views of large objects. Can robots learn from this?
This project is focused on understanding these phenomena in the language of closed-loop control and planning strategies. The methodology is to identify abstracted tasks from the biological motivation, devise robotic strategies for implementing them, and, through analysis of the resulting closed-loop strategies and comparative studies between nature and machine, derive insights to inform subsequent engineering design. We believe that a successful outcome could enable significant new capabilities, for example in surgical robots manipulating soft tissue deep inside the human body, and in field robots that must cope with unknown, dynamic environments.
Specific work within this project will include:
- Implementation of bio-inspired mechanisms for manipulation (e.g., new designs for grippers and end effectors with embedded sensors)
- Control and dynamical systems modelling of natural behaviours, such as in the case of visuo-tactile object manipulation, with a focus on exploration and adaptation mechanisms
- Robotic implementation of these strategies in laboratory experiments involving a variety of objects, with the end goal being soft tissue manipulation
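To give a flavour of the closed-loop strategies the project will study, the sketch below shows a minimal adaptive grip controller: a gripper holds an object whose friction coefficient is unknown, and adjusts its normal force purely from a binary tactile slip signal, tightening quickly on slip and relaxing slowly otherwise. This is an illustrative toy model only; all names, parameters, and the slip model are hypothetical, not a specification of the project's methods.

```python
# Illustrative sketch only: closed-loop grip-force adaptation for an
# object with unknown friction, driven by a binary tactile slip signal.
# All parameters (gains, loads, friction) are hypothetical.

def simulate_grip(mu=0.3, load=2.0, steps=200):
    """Hold an object with unknown friction coefficient `mu` against a
    tangential load, adapting normal grip force from slip feedback."""
    grip = 1.0          # initial normal force (N), a deliberate underestimate
    history = []
    for _ in range(steps):
        slipping = mu * grip < load   # simulated tactile slip detection
        if slipping:
            grip *= 1.1               # tighten quickly when slip is sensed
        else:
            grip *= 0.999             # relax slowly, probing the minimum force
        history.append(grip)
    return grip, history

final_grip, _ = simulate_grip()
# The loop settles just above the minimum stable force load / mu,
# without ever knowing mu explicitly.
```

The same explore-and-adapt pattern (act, sense the consequence, correct) generalises to the richer visuo-tactile settings listed above, where the "slip signal" is replaced by multimodal sensor feedback.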
Relevant reading:
- Tuthill, J. C., & Wilson, R. I. (2016). Mechanosensation and adaptive motor control in insects. Current Biology, 26(20), R1022-R1038.
- Fleer, S., Moringen, A., Klatzky, R. L., & Ritter, H. (2020). Learning efficient haptic shape exploration with a rigid tactile sensor array. PLoS ONE, 15(1), e0226880.
- Tian, S., Ebert, F., Jayaraman, D., Mudigonda, M., Finn, C., Calandra, R., & Levine, S. (2019). Manipulation by feel: Touch-based control with deep predictive models. In 2019 International Conference on Robotics and Automation (ICRA) (pp. 818-824). IEEE.