To comprehensively perceive the surrounding environment, current advanced driver-assistance systems are usually equipped with a suite of complementary sensors, including radar, lidar, and optical cameras. Moreover, such a system must function robustly across a range of adverse weather conditions (e.g., fog, rain, and snow). To this end, data from these disparate sensors must be fused more effectively so that their complementary strengths are fully exploited. We expect the success of this project to help advance the level of driving automation, particularly in the challenging conditions described above.
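As a concrete, if deliberately simplified, illustration of what fusing disparate sensor data can mean in practice, the sketch below concatenates per-sensor feature vectors and projects them into a shared embedding. The module name, feature dimensions, and concatenation strategy are illustrative assumptions, not this project's method; a real pipeline would fuse richer representations (e.g., bird's-eye-view feature maps) and handle degraded or missing modalities.

```python
import torch
import torch.nn as nn


class SimpleFusion(nn.Module):
    """Feature-level fusion by concatenation plus projection.

    Hypothetical baseline for illustration only; the feature
    dimensions below are placeholder assumptions.
    """

    def __init__(self, radar_dim=64, lidar_dim=128, camera_dim=256, out_dim=128):
        super().__init__()
        # Project the concatenated multi-modal features into a
        # shared embedding usable by downstream perception heads.
        self.proj = nn.Sequential(
            nn.Linear(radar_dim + lidar_dim + camera_dim, out_dim),
            nn.ReLU(),
        )

    def forward(self, radar_feat, lidar_feat, camera_feat):
        # Concatenate modality features along the channel dimension.
        fused = torch.cat([radar_feat, lidar_feat, camera_feat], dim=-1)
        return self.proj(fused)


# Usage example: a batch of 4 samples, one feature vector per sensor.
radar = torch.randn(4, 64)
lidar = torch.randn(4, 128)
camera = torch.randn(4, 256)
fusion = SimpleFusion()
print(fusion(radar, lidar, camera).shape)  # torch.Size([4, 128])
```

Concatenation is the simplest fusion baseline; under adverse weather, where individual sensors degrade unevenly, adaptive schemes such as attention-weighted fusion are typically needed to exploit the sensors' complementarity, which is precisely the gap this project targets.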