Adaptive Localization Algorithm for Wall-Climbing Robot in Tank Environment Based on Sensor Fusion and Self-Calibration


Chuansheng Zhao, Yike Xiao, Hongxia Zhao


In constrained sensing environments such as enclosed or magnetically disturbed spaces, wall-climbing robots accumulate position and orientation errors over time. To address this challenge, this study introduces a difference projection localization method that fuses an external RGB-D camera with an inertial measurement unit (IMU) mounted on the robot. The method detects depth changes in the image to track the distance variations caused by the robot's presence, and reduces the 3D point cloud to 2D by projecting distances along the normal vector of the robot chassis, substantially improving computational efficiency. The robot's position is then recovered from the statistical properties of this projection. In addition, two Extended Kalman Filters (EKFs) estimate the robot's orientation, using the gravity vector and the chassis normal vector as observations. Experimental results validate the proposed localization method, achieving a positioning error of 0.017 m and a heading-angle error of 3.1° in attitude estimation for the wall-climbing robot. These results demonstrate the method's efficacy for precise self-localization of wall-climbing robots, particularly in tasks demanding fine manipulation and accurate positioning of industrial manipulators. The paper also discusses the importance of self-calibration techniques in mitigating positioning errors for climbing robots, drawing on the 3DCLIMBER, developed at ISR-UC, as a pertinent case study. Initial tests of that robot highlight the criticality of accurate gripper positioning for autonomous climbing, underscoring the need for error measurement and compensation methods such as the proposed self-calibrating approach.
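The difference projection idea described in the abstract can be sketched in a few lines: subtract a background depth image from the current one, keep the pixels the robot occupies, flatten their 3D points onto the wall plane along the chassis normal, and take a statistic of the result as the position estimate. The function below is a minimal illustration under assumed inputs (per-pixel 3D points, a known background depth map, a simple threshold and mean); it is not the paper's exact pipeline.

```python
import numpy as np

def difference_projection_localize(depth_bg, depth_cur, points, normal,
                                   threshold=0.05):
    """Illustrative sketch of difference-projection localization.

    depth_bg / depth_cur : background and current depth images (H x W, metres)
    points               : per-pixel 3D points (H x W x 3) from the RGB-D camera
    normal               : unit normal vector of the robot chassis / wall plane
    The names, threshold, and use of the mean are assumptions for this sketch.
    """
    # 1. Depth differencing: pixels whose depth decreased belong to the robot,
    #    which sits between the wall and the camera.
    diff = depth_bg - depth_cur
    mask = diff > threshold
    if not mask.any():
        return None  # robot not visible in this frame

    # 2. Project the robot's 3D points onto the wall plane by removing the
    #    component along the chassis normal, reducing the data to 2D.
    robot_pts = points[mask]
    planar = robot_pts - np.outer(robot_pts @ normal, normal)

    # 3. Position from the statistical properties of the projection
    #    (here simply the centroid of the projected points).
    return planar.mean(axis=0)
```

Projecting along the normal collapses the dimension perpendicular to the wall, so the remaining statistics are computed over far fewer effective coordinates, which is the source of the efficiency gain the abstract mentions.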
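For the attitude side, the abstract describes EKFs that use the gravity vector (from the IMU accelerometer) as an observation. The sketch below shows a single EKF measurement update for a reduced roll/pitch state under that observation model, with an assumed ZYX Euler convention and noise covariance; the paper's two filters, which also fuse the chassis normal vector and estimate heading, are not reproduced here.

```python
import numpy as np

def ekf_gravity_update(x, P, acc, R=None, g=9.81):
    """One EKF measurement update using the gravity vector as observation.

    x   : state [roll, pitch] in radians
    P   : 2x2 state covariance
    acc : accelerometer reading (3,), assumed to measure gravity only
    Sketch only: state layout, noise values, and Euler convention are assumptions.
    """
    roll, pitch = x
    # Predicted gravity in the body frame, h(x), for a ZYX Euler rotation.
    h = g * np.array([-np.sin(pitch),
                      np.sin(roll) * np.cos(pitch),
                      np.cos(roll) * np.cos(pitch)])
    # Jacobian H = dh/dx evaluated at the current state.
    H = g * np.array([
        [0.0,                           -np.cos(pitch)],
        [np.cos(roll) * np.cos(pitch),  -np.sin(roll) * np.sin(pitch)],
        [-np.sin(roll) * np.cos(pitch), -np.cos(roll) * np.sin(pitch)],
    ])
    R = np.eye(3) * 0.1 if R is None else R  # assumed measurement noise

    # Standard EKF update: innovation, gain, corrected state and covariance.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (acc - h)
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new
```

The gravity vector constrains roll and pitch but not heading, which is why a second observation (here, the chassis normal vector) is needed to bound the heading error the abstract reports.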
