Multimodal data fusion for mobile robots in USAR environment

27 August 2014

Urban search and rescue (USAR) missions for mobile robots require reliable state estimation systems that are resilient to the conditions imposed by a dynamically changing environment. We design and evaluate a data fusion system for localization of a mobile skid-steer robot intended for USAR missions. We exploit a rich sensor suite including both proprioceptive (inertial measurement unit and track odometry) and exteroceptive sensors (omnidirectional camera and rotating laser rangefinder). To cope with the specificities of each sensing modality (such as significantly differing sampling frequencies), we introduce a novel fusion scheme based on an extended Kalman filter for six-degrees-of-freedom orientation and position estimation. We demonstrate the performance in field tests covering more than 4.4 km driven under standard USAR conditions. Some of our datasets include ground-truth positioning: indoors from a Vicon motion capture system and outdoors from a Leica theodolite tracker. The overall median localization accuracy achieved by combining all four modalities was 1.2% and 1.4% of the total distance traveled for indoor and outdoor environments, respectively. To identify the true limits of the proposed data fusion, we propose and employ a novel experimental evaluation procedure based on failure case scenarios. In this way, we address common issues such as slippage, reduced camera field of view, and limited laser rangefinder range, together with moving obstacles spoiling the metric map. We believe such a characterization of the failure cases is a first step towards identifying the behavior of state estimation under these conditions. We release all our datasets to the robotics community for possible benchmarking.
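To illustrate the core idea of fusing sensors that report at different rates, the sketch below implements a deliberately minimal extended Kalman filter on a 1-D constant-velocity model: prediction runs at the fastest (IMU-like) rate, while updates are applied whenever a measurement of any modality arrives. This is an illustrative assumption, not the talk's actual filter, which estimates the full 6-DoF pose from IMU, track odometry, camera, and laser data.

```python
import numpy as np

class AsyncEKF:
    """Minimal constant-velocity Kalman filter fusing measurements that
    arrive at differing rates. Illustrative sketch only: the presented
    system estimates 6-DoF pose; here the state is just [pos, vel]."""

    def __init__(self, q=0.1, p0=1.0):
        self.x = np.zeros(2)        # state: [position, velocity]
        self.P = np.eye(2) * p0     # state covariance
        self.q = q                  # process-noise intensity (assumed)

    def predict(self, dt):
        """Propagate the state forward by dt (run at the fastest rate)."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, H, r):
        """Fuse one measurement z with linear model H and noise variance r.
        Called whenever any modality delivers data, at its own rate."""
        H = np.atleast_2d(H)
        S = H @ self.P @ H.T + r * np.eye(H.shape[0])
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.atleast_1d(z) - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P
```

Usage follows the multirate pattern: e.g. call `predict` at 100 Hz, fuse a velocity-like (odometry) measurement with `H = [0, 1]` at the same rate, and fuse a position-like (camera/laser) measurement with `H = [1, 0]` only every tenth step.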

The presented topic covers the results of joint work between the Center for Machine Perception at CTU in Prague and the Autonomous Systems Lab at ETH Zurich. The work was coordinated by M. Reinstein and supported by the EC projects FP7-ICT NIFTi (Natural Human-Robot Cooperation in Dynamic Environments) and FP7-ICT TRADR (Long-Term Human-Robot Teaming for Robot Assisted Disaster Response).