Multi-Sensor Data Fusion and Machine Learning for Adaptive Autonomous Systems
- Mortlock, Trier
- Advisor(s): Al Faruque, Mohammad A
Abstract
Autonomous systems are deployed widely across society and have driven significant advances in productivity, safety, and security. These systems couple capabilities in computation, sensing, and autonomy to execute tasks without human intervention. They operate in the cyber and physical domains and rely on sensors to collect data about their surrounding environments to inform their decision-making algorithms. Sensor fusion combines data from different sensors to reduce sensing uncertainty and better understand the environment. Advances in computing platforms and the growing availability of data have ushered in a wave of machine learning-based sensor fusion methods that extend fusion capabilities beyond what hand-crafted mathematical models of systems can capture. Machine learning models can learn complex relationships among sensor data by iteratively training on past examples, and they have significantly improved autonomous system performance. However, autonomous systems are increasingly tasked with operating in dynamic settings that demand high levels of adaptability, and how best to employ adaptive sensor fusion strategies remains a challenging question. Furthermore, there is a lack of sensor fusion solutions that enable intelligent decision-making, are robust to changes in sensing environments, and execute efficiently under resource constraints, all in a safe and secure manner.
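For concreteness, here is a minimal sketch of the classical fusion idea described above: combining two independent, noisy readings of the same quantity via inverse-variance weighting, so the fused estimate is never less certain than the best individual sensor. The sensor names and numeric values are illustrative only and are not taken from the dissertation.

```python
import numpy as np

def inverse_variance_fusion(estimates, variances):
    """Fuse independent sensor estimates of the same quantity.

    Each estimate is weighted by the inverse of its noise variance,
    so more reliable sensors contribute more. The fused variance
    1 / sum(1/var_i) is no larger than any individual variance.
    """
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    fused_estimate = np.sum(weights * estimates) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)
    return fused_estimate, fused_variance

# Hypothetical example: lidar and radar range readings of one obstacle.
fused, var = inverse_variance_fusion(
    estimates=[10.2, 9.8],   # meters
    variances=[0.04, 0.25],  # lidar is the more precise sensor here
)
print(f"fused range: {fused:.2f} m, fused variance: {var:.3f}")
```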
This dissertation examines how, where, why, and when to perform multi-sensor data fusion for autonomous systems operating in dynamic environments. A novel sensor fusion architecture that improves autonomous systems' adaptation capabilities is introduced. The proposed fusion architecture dynamically adapts both how and when fusion is performed by using a machine learning model that learns to identify the optimal sensor fusion strategy for the given sensing context. Use cases across different types of autonomous systems are examined, highlighting the ability of the proposed fusion approach to generalize across various systems and autonomy tasks. Results from three of these use cases are presented: (i) robust and efficient perception for autonomous vehicles operating in sensing-challenged environments; (ii) improved situational understanding and planning for robotics; and (iii) secure state estimation and control for autonomous power grid systems. Additionally, this dissertation proposes a way to measure the diversity of multimodal data in deep learning ensembles and examines how sensing diversity affects adaptation strategies across different autonomy tasks. Overall, the methodologies presented in this dissertation can be applied broadly across engineering disciplines and can enhance the development of future technologies that require adaptive sensing capabilities.
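The abstract does not detail the architecture itself, so the following is only a hypothetical sketch of its core idea: a learned selector that maps sensing-context features to one of several candidate fusion strategies. All feature names, strategy labels, and training examples below are assumptions made for illustration, not the dissertation's actual design.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical fusion strategies the selector can choose among.
STRATEGIES = ["early_fusion", "late_fusion", "camera_only", "lidar_only"]

# Toy training data: each row is a sensing-context feature vector
# [ambient_light, precipitation, lidar_return_rate], labeled with the
# strategy that performed best in that context during offline evaluation.
X_train = np.array([
    [0.9, 0.0, 0.95],  # clear day, healthy lidar
    [0.1, 0.0, 0.90],  # night, healthy lidar
    [0.8, 0.9, 0.40],  # heavy rain degrades lidar returns
    [0.2, 0.8, 0.35],  # night and rain
])
y_train = ["early_fusion", "lidar_only", "camera_only", "late_fusion"]

# A small decision tree stands in for whatever model the dissertation uses.
selector = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

def select_fusion_strategy(context):
    """Pick a fusion strategy for the current sensing context."""
    return selector.predict(np.asarray(context).reshape(1, -1))[0]

# At runtime, the system would re-query the selector as conditions change.
print(select_fusion_strategy([0.15, 0.05, 0.92]))  # e.g., "lidar_only"
```

In this framing, adapting "how" fusion is performed corresponds to which strategy is selected, and adapting "when" corresponds to re-running the selector as the sensing context evolves.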