March 3, 2025
Thesis Title: Adaptive Multi-Sensor Fusion for Robust Autonomous Perception in Unstructured Environments
When: March 18, 2025 (10:30 AM CT)
Where: Simrall 228 and Microsoft Teams (Meeting Link)
Candidate: Samantha Carley
Degree: Master of Science in Electrical & Computer Engineering
Committee Members: Dr. John Ball, Dr. Lalitha Dabbiru, Dr. Chaomin Luo, and Dr. Stanton Price
Abstract
Autonomous vehicles commonly employ multiple sensors to perceive their surroundings, and coupling these sensors should, ideally, improve perception over any single sensor. To understand a scene intelligently, an autonomous system can be equipped with object localization and classification, often performed with a visual camera. Object detection and classification can also be applied to LiDAR and infrared (IR) sensors to further enhance the system's scene awareness. Herein, sensor-level, decision-level, and feature-level fusion are explored to assess their impact on perception and to mitigate sensor disagreements. Specifically, the fusion of RGB, LiDAR, and IR sensor data to improve object classification and scene awareness is investigated. An SVM-based feature fusion method is also proposed as an alternative avenue for optimizing the computational efficiency of a fusion framework. Results show that multi-modal perception improves accuracy by balancing the strengths and weaknesses of the individual sensors. Experiments were conducted on a multi-sensor off-road dataset collected at the Center for Advanced Vehicular Systems (CAVS) at Mississippi State University.
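Two of the fusion strategies named in the abstract can be illustrated with a minimal sketch. This is not the thesis's implementation; the feature vectors, class labels, and function names below are hypothetical placeholders. Feature-level fusion joins per-sensor feature vectors into one vector that a single classifier (such as the SVM mentioned above) would consume, while decision-level fusion combines each sensor's independent prediction, here by simple majority vote:

```python
# Illustrative sketch only: inputs and names are hypothetical, not from the thesis.

def feature_level_fusion(rgb_feat, lidar_feat, ir_feat):
    """Concatenate per-sensor feature vectors into one joint vector,
    which a single downstream classifier (e.g., an SVM) would consume."""
    return rgb_feat + lidar_feat + ir_feat

def decision_level_fusion(predictions):
    """Majority vote over the independent per-sensor class predictions."""
    return max(set(predictions), key=predictions.count)

# Hypothetical features and predictions for a single detected object:
fused = feature_level_fusion([0.2, 0.7], [1.3], [0.05])   # -> [0.2, 0.7, 1.3, 0.05]
label = decision_level_fusion(["vehicle", "vehicle", "rock"])  # -> "vehicle"
```

Sensor-level (raw-data) fusion, the third strategy explored in the thesis, would instead merge the sensor streams before any features are extracted.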