Extrinsic LiDAR-Robot and LiDAR-Camera Calibration for Multi-Sensor Fusion
Abstract
In mobile robotics, an essential requirement for fusing data from different sensors is that all measurements are expressed with respect to a common reference frame. Knowing the transformations between the sensors and the robot is therefore necessary for accurate sensor fusion. This article proposes an extrinsic sensor calibration method based on markers associated with three orthogonal planes. The technique is applied to two calibration problems: LiDAR-Robot and LiDAR-Camera. The first computes the transformation between a 3D LiDAR sensor and the robot, while the second computes the transformation between the 3D LiDAR and an onboard RGB camera. To demonstrate the effectiveness of the method, we performed simulations in the CoppeliaSim simulator and experiments in the laboratory. The results show that both sensor pairs can be calibrated with the proposed methodologies.
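As a rough illustration of the kind of rigid-transform estimation such an extrinsic calibration involves (not the paper's specific algorithm), the sketch below recovers a rotation and translation between two sensor frames from corresponding 3D marker points using the SVD-based Kabsch/Umeyama solution; all function and variable names are hypothetical.

```python
import numpy as np

def estimate_rigid_transform(points_src, points_dst):
    """Estimate R, t such that points_dst ~= R @ points_src + t.

    points_src, points_dst: (N, 3) arrays of corresponding 3D points,
    e.g. marker points expressed in the LiDAR and robot frames.
    Hypothetical helper using the SVD-based Kabsch/Umeyama method.
    """
    centroid_src = points_src.mean(axis=0)
    centroid_dst = points_dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (points_src - centroid_src).T @ (points_dst - centroid_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    # Reflection correction to keep a proper rotation (det = +1)
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = centroid_dst - R @ centroid_src
    return R, t

if __name__ == "__main__":
    # Synthetic check: points seen in a "LiDAR" frame and the same
    # points expressed in a "robot" frame related by a known R, t.
    rng = np.random.default_rng(0)
    pts_lidar = rng.uniform(-1.0, 1.0, size=(10, 3))
    angle = np.deg2rad(30.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    t_true = np.array([0.5, -0.2, 0.1])
    pts_robot = pts_lidar @ R_true.T + t_true
    R_est, t_est = estimate_rigid_transform(pts_lidar, pts_robot)
    print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

The same least-squares formulation applies to both calibration problems, with the destination frame being either the robot base or the RGB camera.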
