Real time navigation using event cameras

Explore the new possibilities opened by event cameras for robot navigation and SLAM
Description of the Project: 

Event cameras, such as the Dynamic Vision Sensor (DVS), are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. They offer significant advantages over standard cameras, namely a very high dynamic range, no motion blur, and a latency in the order of microseconds. However, because the output is composed of a sequence of asynchronous events rather than actual intensity images, traditional vision algorithms cannot be applied; new algorithms that exploit the high temporal resolution and asynchronous nature of the sensor are therefore required.
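To illustrate the asynchronous output format, here is a minimal sketch of turning an event stream into an image-like representation. It assumes the common (x, y, timestamp, polarity) event tuple; all function and variable names are illustrative, not part of any specific camera SDK.

```python
def accumulate_events(events, width, height, t_start, t_end):
    """Accumulate signed event polarities over a time window into a 2D frame.

    This is one of the simplest ways to convert an asynchronous event
    stream into a frame that conventional vision algorithms can consume.
    """
    frame = [[0 for _ in range(width)] for _ in range(height)]
    for x, y, t, p in events:
        if t_start <= t < t_end:
            frame[y][x] += 1 if p > 0 else -1  # +1 for ON, -1 for OFF events
    return frame

# Example: three events on a 4x3 sensor within a 10 ms window
# (timestamps in microseconds).
events = [
    (1, 0, 1_000, 1),   # ON event at pixel (1, 0), t = 1 ms
    (1, 0, 2_500, 1),   # second ON event at the same pixel
    (3, 2, 4_000, -1),  # OFF event at pixel (3, 2)
]
frame = accumulate_events(events, width=4, height=3, t_start=0, t_end=10_000)
```

Note that accumulating events into frames discards the fine temporal structure that makes the sensor attractive in the first place; part of the project would be exploring representations and algorithms that retain it.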

We have recently acquired such a camera, and the project will explore this new device and investigate how it can be used for robotics, especially navigation. As this is a very nascent field, we expect that the student will discover new applications as they go along, and this should also become part of the investigation.

Resources required: 
High-power computer; potentially a second event camera for stereo vision work
Project number: 
100016
First Supervisor: 
University: 
Heriot-Watt University
First supervisor university: 
Heriot-Watt University
Essential skills and knowledge: 
Computer vision and machine learning
Desirable skills and knowledge: 
Strong mathematical background
References: 

[1] H. Rebecq, D. Gehrig, D. Scaramuzza, ESIM: an Open Event Camera Simulator, Conference on Robot Learning (CoRL), Zurich, 2018.

[2] D. Gehrig, H. Rebecq, G. Gallego, D. Scaramuzza, Asynchronous, Photometric Feature Tracking using Events and Frames, European Conference on Computer Vision (ECCV), Munich, 2018.

[3] Y. Zhou, G. Gallego, H. Rebecq, L. Kneip, H. Li, D. Scaramuzza, Semi-Dense 3D Reconstruction with a Stereo Event Camera, European Conference on Computer Vision (ECCV), Munich, 2018.