TY - JOUR
T1 - Advancing autonomous SLAM systems
T2 - Integrating YOLO object detection and enhanced loop closure techniques for robust environment mapping
AU - Ul Islam, Qamar
AU - Khozaei, Fatemeh
AU - Salah Al Barhoumi, El Manaa
AU - Baig, Imran
AU - Ignatyev, Dmitry
N1 - Publisher Copyright:
© 2024 Elsevier B.V.
PY - 2024/12/17
Y1 - 2024/12/17
N2 - This research paper introduces an enhanced method for visual Simultaneous Localization and Mapping (SLAM), specifically designed for dynamic environments. Our approach distinguishes itself from traditional visual SLAM methods by integrating feature-based techniques with a lightweight object detection network, You Only Look Once (YOLO). This integration enables the extraction of semantic information, enhancing the system's performance. Furthermore, we incorporate sparse optical flow and advanced multi-view geometry to improve the accuracy of localization and mapping. A significant innovation in our method is an improved loop detection algorithm, which optimizes mapping in complex settings. Our system is built upon ORB-SLAM3 (Oriented FAST and Rotated BRIEF SLAM3, where FAST denotes Features from Accelerated Segment Test and BRIEF denotes Binary Robust Independent Elementary Features), enabling real-time performance and demonstrating superior localization accuracy in dynamic environments. Extensive experiments on public datasets show that the proposed system surpasses existing deep learning-based visual SLAM systems, reducing absolute trajectory error in dynamic scenarios and enhancing mapping accuracy and robustness in complex environments. This system overcomes the limitations of traditional visual SLAM methods and emerges as a promising solution for real-world applications such as autonomous driving and advanced driver assistance systems. The technical novelty of our approach lies in the strategic integration of these innovative techniques, making it a significant advancement over existing methods.
AB - This research paper introduces an enhanced method for visual Simultaneous Localization and Mapping (SLAM), specifically designed for dynamic environments. Our approach distinguishes itself from traditional visual SLAM methods by integrating feature-based techniques with a lightweight object detection network, You Only Look Once (YOLO). This integration enables the extraction of semantic information, enhancing the system's performance. Furthermore, we incorporate sparse optical flow and advanced multi-view geometry to improve the accuracy of localization and mapping. A significant innovation in our method is an improved loop detection algorithm, which optimizes mapping in complex settings. Our system is built upon ORB-SLAM3 (Oriented FAST and Rotated BRIEF SLAM3, where FAST denotes Features from Accelerated Segment Test and BRIEF denotes Binary Robust Independent Elementary Features), enabling real-time performance and demonstrating superior localization accuracy in dynamic environments. Extensive experiments on public datasets show that the proposed system surpasses existing deep learning-based visual SLAM systems, reducing absolute trajectory error in dynamic scenarios and enhancing mapping accuracy and robustness in complex environments. This system overcomes the limitations of traditional visual SLAM methods and emerges as a promising solution for real-world applications such as autonomous driving and advanced driver assistance systems. The technical novelty of our approach lies in the strategic integration of these innovative techniques, making it a significant advancement over existing methods.
KW - Advanced multi-view geometric analysis
KW - Artificial intelligence
KW - Deep learning in dynamic environments
KW - Loop detection algorithm improvement
KW - Semantic information extraction
KW - Visual simultaneous localization and mapping
UR - http://www.scopus.com/inward/record.url?scp=85212080695&partnerID=8YFLogxK
U2 - 10.1016/j.robot.2024.104871
DO - 10.1016/j.robot.2024.104871
M3 - Article
AN - SCOPUS:85212080695
SN - 0921-8890
VL - 185
JO - Robotics and Autonomous Systems
JF - Robotics and Autonomous Systems
M1 - 104871
ER -