Joint calibration of intrinsic and extrinsic parameters for LiDAR-camera systems in targetless environments. This work aims to calibrate camera intrinsic and LiDAR-camera extrinsic parameters without a chessboard, while maintaining accuracy comparable to target-based methods. Our method requires only several textured planes in the scene. Since textured planes are ubiquitous in urban environments, the method enjoys broad applicability across diverse calibration scenes. A detailed description of the method can be found in our paper. The calibration pipeline is summarized in the following figure.
- Ubuntu
- ROS
- OpenCV
- PCL
- Eigen
- Ceres Solver
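
One possible way to install these dependencies is sketched below for Ubuntu 20.04 with ROS Noetic; the package names are assumptions rather than requirements stated by this repository, and the versions it expects may differ.

```bash
# Assumed install sketch for Ubuntu 20.04; package names are not
# prescribed by this repository, and required versions may differ.
sudo apt update
sudo apt install ros-noetic-desktop-full     # ROS (ships OpenCV and PCL)
sudo apt install libopencv-dev libpcl-dev    # standalone OpenCV / PCL
sudo apt install libeigen3-dev libceres-dev  # Eigen and Ceres Solver
```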
Clone the repository and build with catkin_make:

```bash
# Replace $A_ROS_DIR$ with the name of your catkin workspace.
cd ~/$A_ROS_DIR$/src
git clone https://github.com/hku-mars/joint-lidar-camera-calib.git
cd ..
catkin_make
source devel/setup.bash
```
This calibration method works for both solid-state and mechanically spinning LiDARs. As for cameras, the current version supports the pinhole model with radial and tangential distortion.
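
For reference, the sketch below (not the repository's own code) shows this camera model in action: OpenCV's `cv::projectPoints` applies the same pinhole projection with radial (k1, k2, k3) and tangential (p1, p2) distortion, so it can be used to check how LiDAR points map into the image once intrinsics and extrinsics are known. All numeric values are made-up placeholders.

```cpp
// Minimal sketch, not the repository's code: OpenCV's pinhole model with
// radial (k1, k2, k3) and tangential (p1, p2) distortion.
// All numeric values are made-up placeholders.
#include <cstdio>
#include <vector>
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>

int main() {
  // Intrinsic matrix K with placeholder fx, fy, cx, cy.
  cv::Mat K = (cv::Mat_<double>(3, 3) << 800.0, 0.0, 640.0,
                                         0.0, 800.0, 360.0,
                                         0.0, 0.0, 1.0);
  // Distortion coefficients in OpenCV order [k1, k2, p1, p2, k3].
  cv::Mat dist = (cv::Mat_<double>(1, 5) << -0.10, 0.01, 1e-4, -1e-4, 0.0);

  // Placeholder LiDAR-to-camera extrinsics: rotation (Rodrigues vector)
  // and translation in meters.
  cv::Mat rvec = (cv::Mat_<double>(3, 1) << 0.0, 0.0, 0.0);
  cv::Mat tvec = (cv::Mat_<double>(3, 1) << 0.05, -0.02, 0.10);

  // A few 3D points expressed in the LiDAR frame.
  std::vector<cv::Point3d> lidar_pts = {{1.0, 0.2, 5.0}, {-0.5, 0.1, 3.0}};

  // x_cam = R * x_lidar + t, then pinhole projection plus distortion.
  std::vector<cv::Point2d> pixels;
  cv::projectPoints(lidar_pts, rvec, tvec, K, dist, pixels);

  for (const auto& p : pixels) std::printf("(%.1f, %.1f)\n", p.x, p.y);
  return 0;
}
```

In the actual calibration pipeline, the intrinsics and extrinsics are the quantities being estimated rather than known inputs; this snippet only illustrates the projection model being fitted.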
Data can be downloaded from this link.