Human-Machine Collaboration for Efficient Spatio-Temporal Biodiversity Monitoring

Efficient estimation of changes in spatio-temporal distributions of wildlife via active data collection
Description of the Project: 

There is a critical need for robust and accurate tools to scale up biodiversity monitoring and to manage the impact of anthropogenic change. For example, the monitoring of individual species that are particularly sensitive to habitat conversion and climate change can act as an important indicator of ecosystem health. Existing approaches for collecting data on individual species in the wild have traditionally been based on manual surveys performed by human experts. However, this paradigm does not scale if we wish to monitor hundreds of thousands of distinct species across vast geographic and temporal ranges.

Recent advances in artificial intelligence and robotics are resulting in next-generation systems that can be used to help efficiently capture data in the wild (e.g. from audio and video sensors), along with machine learning-based algorithms for automatic data processing and understanding. Each method for data collection (e.g. automated vehicles, drones, static cameras, citizen scientists, etc.) has its own inherent strengths, weaknesses, and suitability for a given environment and species of interest. In addition, given finite resources, it may be necessary to trade off cost against performance depending on the specific requirements.

The goal of this project is to leverage the complementary strengths of humans and machines to develop probabilistic models and algorithms for efficient data collection across space and time. Human operators can help deploy robotic systems, place remote static sensors in the wild, and perform data collection themselves (e.g. with cameras). Taking inspiration from machine learning techniques such as active learning, the aim is to adaptively estimate (1) where to sample and (2) what method to sample with, in order to obtain as complete an estimate as possible of the true underlying distribution with significantly less sampling effort.
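As a minimal sketch of the kind of adaptive sampling loop described above, the toy example below maintains a Beta posterior over species occupancy at each site and, at every step, greedily picks the (site, sensor) pair with the best uncertainty-reduction-per-cost score. The sensor names, costs, and reliabilities are entirely hypothetical, and the greedy variance-per-cost rule is just one simple stand-in for a proper active-learning acquisition function.

```python
import random

# Hypothetical sensor types: name -> (cost per deployment, detection
# probability given the species is present). These numbers are illustrative.
SENSORS = {"drone": (5.0, 0.9), "static_camera": (1.0, 0.6)}

def beta_var(a, b):
    """Variance of a Beta(a, b) posterior over the occupancy probability."""
    return a * b / ((a + b) ** 2 * (a + b + 1))

def choose_next(posteriors, budget):
    """Pick the affordable (site, sensor) pair with the best score:
    current posterior variance, scaled by sensor reliability, per unit cost."""
    best = None
    for site, (a, b) in enumerate(posteriors):
        for sensor, (cost, reliability) in SENSORS.items():
            if cost > budget:
                continue
            score = beta_var(a, b) * reliability / cost
            if best is None or score > best[0]:
                best = (score, site, sensor)
    return None if best is None else (best[1], best[2])

def survey(true_occupancy, budget=30.0, seed=0):
    """Simulate an adaptive survey over a list of sites until the budget
    runs out; returns the posterior mean occupancy estimate per site."""
    rng = random.Random(seed)
    posteriors = [[1.0, 1.0] for _ in true_occupancy]  # uniform Beta(1, 1) priors
    while True:
        pick = choose_next(posteriors, budget)
        if pick is None:  # nothing affordable remains
            break
        site, sensor = pick
        cost, reliability = SENSORS[sensor]
        budget -= cost
        # Simulate a noisy observation: a detection requires both true
        # presence and a successful (probabilistic) sensor detection.
        detected = true_occupancy[site] and rng.random() < reliability
        # Crude update: treat the noisy detection as a Bernoulli draw
        # (a fuller model would account for imperfect detection explicitly).
        posteriors[site][0 if detected else 1] += 1
    return [a / (a + b) for a, b in posteriors]
```

For example, `survey([True, False, True, False])` spreads observations across sites as their posteriors tighten and returns one occupancy estimate in [0, 1] per site. A real system would replace the greedy score with a principled acquisition function (e.g. expected information gain) and model false positives as well as missed detections.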


Project number: 
230021
First Supervisor: 
University: 
University of Edinburgh
First supervisor university: 
University of Edinburgh
Essential skills and knowledge: 
Knowledge of machine learning, computer vision, and probabilistic reasoning. Programming experience, e.g. in Python, with deep learning frameworks such as PyTorch or TensorFlow.
References: 

[1] O. Mac Aodha, E. Cole, P. Perona, Presence-Only Geographical Priors for Fine-Grained Image Classification, ICCV 2019

[2] G. Van Horn, O. Mac Aodha, et al., The iNaturalist Species Classification and Detection Dataset, CVPR 2018

[3] O. Mac Aodha, et al., Bat Detective - Deep Learning Tools for Bat Acoustic Signal Detection, PLOS Computational Biology 2018

[4] O. Mac Aodha, S. Su, Y. Chen, P. Perona, Y. Yue, Teaching Categories to Human Learners with Visual Explanations, CVPR 2018