The full text of this item is not available at this time because the student has placed this item under an embargo for a period of time. The Libraries are not authorized to provide a copy of this work during the embargo period, even for Texas A&M users with NetID.
Auxiliary Loss Weighting for Robust Multi-Modal Sensor Fusion with Deep Neural Networks
The major focus of this research is sensor fusion: combining sensory inputs or data from multiple sources so that the combined performance exceeds the best performance attainable from any single source alone. Sensors are central to AI systems. Traditionally, data from different sensors are processed separately, which can perform well when the sensors are free of noise and malfunction; when a sensor fails, however, performance degrades. Sensor fusion addresses this situation. Recently, deep neural networks have been studied extensively for sensor fusion applications such as autonomous driving and robot control. Among these studies, various gated neural network architectures have been proposed that improve on classical convolutional neural networks (CNNs). These gated architectures still exhibit several problems, some of which are described in this research. To address them, a further optimized gated architecture, a gated CNN with auxiliary paths, was proposed. The major focus of this thesis work is auxiliary loss weighting, a technique that further regularizes gated CNNs with auxiliary paths and improves their performance. The CAD-60 dataset is used as a benchmark to demonstrate the significant performance improvements of the proposed architecture and its robustness in the presence of sensor noise and failures.
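To make the idea of auxiliary loss weighting concrete, the following is a minimal, hypothetical sketch (the function name, weights, and loss values are illustrative assumptions, not the thesis's actual implementation): each single-modality auxiliary path contributes its own loss, and the total training loss combines the fused main loss with a weighted sum of the auxiliary losses.

```python
def weighted_total_loss(main_loss, aux_losses, aux_weights):
    """Combine the fused (main) loss with weighted per-modality auxiliary losses.

    main_loss   -- scalar loss of the fused prediction
    aux_losses  -- scalar losses, one per single-modality auxiliary path
    aux_weights -- non-negative weights, one per auxiliary path
    """
    if len(aux_losses) != len(aux_weights):
        raise ValueError("one weight per auxiliary path is required")
    # Weighting the auxiliary terms controls how strongly each modality's
    # individual path regularizes the shared/gated layers during training.
    return main_loss + sum(w * l for w, l in zip(aux_weights, aux_losses))

# Hypothetical example: two auxiliary paths (e.g., RGB-only and depth-only).
total = weighted_total_loss(0.50, [0.80, 1.20], [0.3, 0.1])
# 0.50 + 0.3 * 0.80 + 0.1 * 1.20 = 0.86
```

Down-weighting a noisy or unreliable modality's auxiliary term is one way such a scheme can trade off single-modality accuracy against robustness of the fused prediction.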
Li, Yang (2019). Auxiliary Loss Weighting for Robust Multi-Modal Sensor Fusion with Deep Neural Networks. Master's thesis, Texas A&M University. Available electronically from