A Novel Multi-Sensor Nonlinear Tightly-Coupled Framework for Composite Robot Localization and Mapping
Chen, Lu; Hussain, Amir; Liu, Yu; Tan, Jie; Li, Yang; Yang, Yuhao; Ma, Haoyuan; Fu, Shenbing; Li, Gun
Authors
Prof Amir Hussain (Professor) A.Hussain@napier.ac.uk
Yu Liu
Jie Tan
Yang Li
Yuhao Yang
Haoyuan Ma
Shenbing Fu
Gun Li
Abstract
Composite robots often encounter difficulties due to changes in illumination, external disturbances, reflective surface effects, and cumulative errors. These challenges significantly hinder their environmental perception capabilities and the accuracy and reliability of pose estimation. To overcome these issues, we propose a nonlinear optimization approach and develop an integrated localization and navigation framework, IIVL-LM (IMU, Infrared, Vision, and LiDAR Fusion for Localization and Mapping). This framework achieves tightly coupled integration at the data level using inputs from an IMU (Inertial Measurement Unit), an infrared camera, an RGB (Red, Green and Blue) camera, and LiDAR. We propose a real-time luminance calculation model and verify its conversion accuracy. We also design a fast approximation method for the nonlinear weighted fusion of features from infrared and RGB frames based on luminance values. Finally, we optimize the VIO (Visual-Inertial Odometry) module of the R3LIVE++ (Robust, Real-time, Radiance Reconstruction with LiDAR-Inertial-Visual state Estimation) framework by exploiting the infrared camera's ability to acquire depth information. In a controlled study on a simulated indoor rescue scenario dataset, the IIVL-LM system demonstrates significant performance gains under challenging luminance conditions, particularly in low-light environments: the average RMSE ATE (Root Mean Square Error of the Absolute Trajectory Error) improves by 23% to 39%, with absolute reductions ranging from 0.006 to 0.013. In comparative experiments on the publicly available TUM-VI (Technical University of Munich Visual-Inertial) dataset, which provides no infrared image input, the system does not achieve leading results, confirming the importance of infrared image fusion. By keeping at least three sensors actively engaged at all times, the IIVL-LM system significantly improves robustness in unknown and expansive environments while maintaining high precision. This is particularly critical for applications in complex environments such as indoor rescue operations.
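The abstract describes a nonlinear weighted fusion of infrared and RGB features driven by a real-time luminance estimate. The paper's exact formulas are not reproduced on this page; the following is a minimal sketch, assuming a scalar scene-luminance proxy computed from the RGB frame and a hypothetical sigmoid weighting that shifts feature weight toward the infrared channel as luminance drops. The function names (`estimate_luminance`, `fuse_features`) and the weighting constants are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

def estimate_luminance(rgb_frame: np.ndarray) -> float:
    """Scalar luminance proxy from an RGB frame (H x W x 3, uint8).

    Uses Rec. 709 luma weights as a stand-in for the paper's
    real-time luminance calculation model (assumption).
    """
    rgb = rgb_frame.astype(np.float32) / 255.0
    luma = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    return float(luma.mean())

def fuse_features(rgb_feat: np.ndarray, ir_feat: np.ndarray, luminance: float,
                  midpoint: float = 0.3, steepness: float = 12.0) -> np.ndarray:
    """Nonlinear (sigmoid) luminance-weighted blend of RGB and infrared features.

    As luminance falls below `midpoint`, weight shifts toward the infrared
    features; `midpoint` and `steepness` are illustrative constants.
    """
    w_rgb = 1.0 / (1.0 + np.exp(-steepness * (luminance - midpoint)))
    return w_rgb * rgb_feat + (1.0 - w_rgb) * ir_feat

# Example: in a dim scene the fused descriptor leans on the infrared channel.
rgb_frame = (np.random.rand(480, 640, 3) * 40).astype(np.uint8)  # dark image
rgb_feat = np.random.rand(128).astype(np.float32)                 # e.g. a feature descriptor
ir_feat = np.random.rand(128).astype(np.float32)
fused = fuse_features(rgb_feat, ir_feat, estimate_luminance(rgb_frame))
```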
Citation
Chen, L., Hussain, A., Liu, Y., Tan, J., Li, Y., Yang, Y., Ma, H., Fu, S., & Li, G. (2024). A Novel Multi-Sensor Nonlinear Tightly-Coupled Framework for Composite Robot Localization and Mapping. Sensors, 24(22), Article 7381. https://doi.org/10.3390/s24227381
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Nov 16, 2024 |
| Online Publication Date | Nov 19, 2024 |
| Publication Date | 2024 |
| Deposit Date | Nov 25, 2024 |
| Publicly Available Date | Nov 25, 2024 |
| Journal | Sensors |
| Electronic ISSN | 1424-8220 |
| Publisher | MDPI |
| Peer Reviewed | Peer Reviewed |
| Volume | 24 |
| Issue | 22 |
| Article Number | 7381 |
| DOI | https://doi.org/10.3390/s24227381 |
| Keywords | composite robots; multi-sensor fusion; nonlinear tight coupling; SLAM; illuminance conversion |
Files
A Novel Multi-Sensor Nonlinear Tightly-Coupled Framework for Composite Robot Localization and Mapping
(5.5 MB)
PDF
Publisher Licence URL
http://creativecommons.org/licenses/by/4.0/