IPAB Workshop

There will be an IPAB Workshop on Thursday the 22nd of February from 12:45 to 14:00 in conference room 4.31/4.33. Evripidis Gkanias, Daniel Angelov and Raluca Scona will be speaking. Pastries will be available.

Speaker: Evripidis Gkanias

Title: Insect inspired design of a polarisation compass

Abstract: Desert ants are among the best navigators we know of in nature, able to find their way mainly by exploiting visual cues. A large part of their success at this task rests on their polarisation compass, which transforms the daylight into a global orientation landmark. With this in mind, we designed a sensor inspired by the Dorsal Rim Area of Cataglyphis bicolor, along with a computational model that provides direction information with less than 1 degree of error.

In this talk, Evripidis will describe the design of our sensor and its functionality.
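For readers unfamiliar with polarisation compasses, the sketch below illustrates the general principle only: an array of polarisation-sensitive units with different preferred orientations responds to skylight, and the e-vector angle (an axial direction cue) can be recovered from the pattern of their responses. This is a toy model, not the speaker's sensor design or computational model; all function names and parameters are assumptions made for illustration.

```python
import numpy as np

def pol_unit_responses(e_vector, degree_pol, unit_angles):
    """Idealised responses of polarisation-sensitive units (Malus'-law style):
    each unit prefers a different e-vector orientation."""
    return 1.0 + degree_pol * np.cos(2.0 * (e_vector - unit_angles))

def estimate_e_vector(responses, unit_angles):
    """Recover the e-vector angle from the unit responses via the first
    Fourier harmonic of the cos(2*angle) response pattern."""
    c = np.sum(responses * np.cos(2.0 * unit_angles))
    s = np.sum(responses * np.sin(2.0 * unit_angles))
    # The e-vector is axial, so the angle is only defined modulo pi.
    return (0.5 * np.arctan2(s, c)) % np.pi

# Example: 8 units evenly spanning 0..pi, true e-vector at 30 degrees.
angles = np.linspace(0.0, np.pi, 8, endpoint=False)
true_phi = np.deg2rad(30.0)
r = pol_unit_responses(true_phi, degree_pol=0.7, unit_angles=angles)
print(np.rad2deg(estimate_e_vector(r, angles)))  # ~30.0 degrees
```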

Speaker: Daniel Angelov

Title: Saliency as the ability to learn

Abstract: Modern methods for evaluating the saliency of different dimensions of a state for use in a neural network range from propagating gradients back through the network (using local approximations with a simple model, e.g. linear regression) to presenting other, similar states. However, these methods do not directly tackle the need to explain the behaviour of the system. We present a method for jointly learning different salient subsets of states, and an approach to solving decision-making tasks using such dimensionality-reduced states. The work is performed in a reinforcement learning navigation environment (developed as part of an ongoing DARPA XAI project). We argue that enforcing sparsity of the state space retains explainable features that are strictly necessary for task completion, allowing us to discard state dimensions that may sway the network response in irrelevant ways. An illustrative sketch of this sparsity idea follows.
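The following toy example illustrates only the general idea of enforcing sparsity over state dimensions so that irrelevant ones drop out; it is not the speakers' method, and the data, penalty, and parameters are all assumptions for illustration. A simple L1-penalised predictor (trained with proximal gradient descent) keeps only the dimensions that matter for the task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "state" with 10 dimensions; only dimensions 0 and 3 affect the target.
X = rng.normal(size=(500, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=500)

# Learn a linear predictor with an L1 penalty (ISTA / soft-thresholding),
# so dimensions that do not help the task are driven to exactly zero.
w = np.zeros(10)
lr, lam = 0.01, 0.05
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y)
    w -= lr * grad
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft-threshold

print(np.round(w, 2))                      # mostly zeros, except dims 0 and 3
print(np.nonzero(np.abs(w) > 1e-3)[0])     # the "salient" state dimensions
```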

Speaker: Raluca Scona

Title: Background Reconstruction for Dense RGB-D SLAM in Dynamic Environments

Abstract: Dynamic environments are challenging for visual SLAM, as moving objects can impair camera pose tracking and cause corruptions to be integrated into the map. We propose a method for dense RGB-D SLAM in dynamic environments based on a strategy of static background reconstruction. While most methods employ implicit robust penalizers or outlier filtering techniques to handle moving objects, our approach is to simultaneously estimate the camera motion and a probabilistic static/dynamic segmentation of the current RGB-D image pair. This segmentation is then used for weighted dense RGB-D fusion to estimate a 3D model of only the static parts of the environment. By leveraging the 3D model for frame-to-model alignment, as well as the static/dynamic segmentation, camera motion estimation has reduced overall drift and is more robust to the presence of dynamics in the scene. We compare the proposed method to related state-of-the-art approaches using both static and dynamic sequences. The proposed method achieves similar performance in static environments and improved accuracy and robustness in dynamic scenes.
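To give a flavour of weighted fusion against a static/dynamic segmentation, here is a minimal sketch in which each new depth measurement contributes to a running model only in proportion to its estimated probability of being static. It is a simplified illustration of the idea, not the speaker's system (which fuses into a dense 3D model); the function names, weighting scheme, and numbers are assumptions for illustration.

```python
import numpy as np

def fuse_depth(model_depth, model_weight, new_depth, p_static):
    """Weighted running-average depth fusion: pixels judged dynamic
    (p_static near 0) contribute almost nothing to the background model."""
    w_new = p_static
    total = model_weight + w_new
    fused = np.where(total > 0,
                     (model_weight * model_depth + w_new * new_depth)
                     / np.maximum(total, 1e-6),
                     model_depth)
    return fused, total

# Toy example: a 2x2 depth map where the bottom-right pixel sees a moving object.
model_d = np.array([[1.0, 1.0], [1.0, 1.0]])   # current background model (metres)
model_w = np.ones((2, 2))                      # accumulated fusion weights
new_d   = np.array([[1.0, 1.0], [1.0, 0.4]])   # 0.4 m: a person walking past
p_stat  = np.array([[1.0, 1.0], [1.0, 0.05]])  # segmentation flags it as dynamic

fused, w = fuse_depth(model_d, model_w, new_d, p_stat)
print(fused)  # bottom-right stays near 1.0 instead of jumping to 0.4
```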

Date: 
Thursday, 22 February, 2018 - 12:45 to 14:00
Speaker: 
Evripidis Gkanias, Daniel Angelov & Raluca Scona
Affiliation: 
University of Edinburgh
Location: 
Informatics Forum 4.31/4.33, University of Edinburgh