Autonomous robots are becoming a pervasive technology and will provide a critical asset to address some of the major societal challenges of the next decades. Unmanned aerial vehicles (UAVs) for crop monitoring and spraying in precision agriculture will enable early disease detection and timely intervention on crops, reducing yield loss and helping to cope with increasing food demand. Fast and agile UAVs will deliver medical supplies (e.g., vaccines) in rural areas and in underdeveloped countries, fighting the spread of diseases and providing new opportunities for global welfare. Autonomous robots capable of moving dexterously among obstacles will be an invaluable support for search and rescue and disaster response. Inexpensive and lightweight platforms will collect large amounts of environmental data, providing new perspectives and actionable understanding of phenomena such as hurricanes and floods. A virtually endless list of applications includes infrastructure inspection, transportation, monitoring, construction, and entertainment, making robotics a multi-billion dollar market in rapid expansion.
A main challenge towards this vision is the design of robust and lightweight perception algorithms, which translate sensor data into a coherent world representation, enabling on-board situational awareness and high-level decision-making. Perception constitutes a bottleneck in the deployment of robotic systems: indeed, most existing robotics applications either rely on low-level supervision by a human (e.g., robots used for bomb disposal or disaster response), or operate in structured environments (e.g., fenced areas on factory floors equipped with external markers or cameras for localization and guidance).