VIW-Fusion is an optimization-based visual-inertial-wheel fusion odometry, developed as part of my master's thesis. First of all, I would like to thank the HKUST Aerial Robotics Group led by Prof. Shaojie Shen for their outstanding work on VINS-Fusion, on which VIW-Fusion is built. Features:
- multiple sensors support (stereo cameras+IMU+wheel / mono camera+IMU+wheel)
- wheel-enhanced visual-inertial initialization
- online spatial calibration (transformation between camera, IMU and wheel)
- online wheel intrinsic calibration
- online temporal calibration (time offset between camera, IMU and wheel)
- plane constraint
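To give a rough picture of how a wheel measurement can enter a Ceres-based sliding-window optimization, here is a minimal, illustrative residual that constrains the planar relative motion (x, y, yaw) between two frames to match a wheel-odometry prediction. This is only a sketch under simplified planar assumptions; it is not the factor implemented in this repository.

```cpp
// Illustrative sketch only -- NOT the factor implemented in VIW-Fusion.
// A planar wheel-odometry residual: the wheel odometer predicts a 2D motion
// (dx, dy, dyaw) between frames i and j; the residual is the difference
// between that prediction and the relative motion of the two optimized poses.
#include <ceres/ceres.h>
#include <cmath>

struct WheelOdomResidual {
  WheelOdomResidual(double dx, double dy, double dyaw)
      : dx_(dx), dy_(dy), dyaw_(dyaw) {}

  // Each pose is parameterized as (x, y, yaw). Angle wrapping is ignored
  // here for brevity.
  template <typename T>
  bool operator()(const T* const pose_i, const T* const pose_j,
                  T* residual) const {
    using std::cos;
    using std::sin;
    const T c = cos(pose_i[2]);
    const T s = sin(pose_i[2]);
    // Translation of frame j expressed in frame i.
    const T rel_x =  c * (pose_j[0] - pose_i[0]) + s * (pose_j[1] - pose_i[1]);
    const T rel_y = -s * (pose_j[0] - pose_i[0]) + c * (pose_j[1] - pose_i[1]);
    residual[0] = rel_x - T(dx_);
    residual[1] = rel_y - T(dy_);
    residual[2] = (pose_j[2] - pose_i[2]) - T(dyaw_);
    return true;
  }

  static ceres::CostFunction* Create(double dx, double dy, double dyaw) {
    return new ceres::AutoDiffCostFunction<WheelOdomResidual, 3, 3, 3>(
        new WheelOdomResidual(dx, dy, dyaw));
  }

  double dx_, dy_, dyaw_;
};

// Usage sketch:
//   problem.AddResidualBlock(WheelOdomResidual::Create(dx, dy, dyaw),
//                            nullptr, pose_i_param, pose_j_param);
```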
We tested Mono VIWO in scenes with drastic lighting changes, keeping the parameters unchanged across scenes. The video is below (if you cannot access YouTube, please try Bilibili). We compare Mono VIWO with Mono VIO: the trajectory estimated by Mono VIO deviates completely from the real trajectory, while the trajectory estimated by Mono VIWO agrees well with it.
To further compare VIWO with Stereo VIO, we align the trajectories with the entrance and exit doors; the trajectory estimated by Mono VIWO is also the more reasonable one.
Ubuntu 64-bit 16.04 or 18.04. ROS Kinetic or Melodic. ROS Installation
Follow Ceres Installation.
Version: 1.4.0
git clone https://github.com/strasdat/Sophus.git
cd ./Sophus/
git checkout a0fe89a323e20c42d3cecb590937eb7a06b8343a
mkdir build
cd ./build
cmake ..
make
sudo make install
Clone the repository and catkin_make:
cd ~/catkin_ws/src
git clone https://github.com/TouchDeeper/VIW-Fusion.git
cd ../
catkin_make
source ~/catkin_ws/devel/setup.bash
(if this step fails, try another computer with a clean system, or reinstall Ubuntu and ROS)
Download the dataset here.
roslaunch vins vins_rviz.launch
rosrun vins viwo_node ~/catkin_ws/src/VIW-Fusion/config/realsense_d435i/realsense_stereo_imu_config_ridgeback.yaml
(optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VIW-Fusion/config/realsense_d435i/realsense_stereo_imu_config_ridgeback.yaml
rosbag play YOUR_DATASET_FOLDER/ridgeback_dark.bag
If you use this package for your research, a footnote with a link to this repository is appreciated: github.com/TouchDeeper/VIW-Fusion, or cite it with the following BibTeX:
@misc{ztd2021viwo,
title={VIW-Fusion: visual-inertial-wheel fusion odometry.},
author={Tingda Zhuang},
howpublished={\url{https://github.com/TouchDeeper/VIW-Fusion}},
year={2021}
}
-------------------- Below is the original VINS-Fusion README --------------------
VINS-Fusion is an optimization-based multi-sensor state estimator, which achieves accurate self-localization for autonomous applications (drones, cars, and AR/VR). VINS-Fusion is an extension of VINS-Mono, which supports multiple visual-inertial sensor types (mono camera + IMU, stereo cameras + IMU, even stereo cameras only). We also show a toy example of fusing VINS with GPS. Features:
- multiple sensors support (stereo cameras / mono camera+IMU / stereo cameras+IMU)
- online spatial calibration (transformation between camera and IMU)
- online temporal calibration (time offset between camera and IMU; see the note after this list)
- visual loop closure
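As background for the online temporal calibration feature above, the rough idea from the IROS 2018 temporal-calibration paper listed under Related Papers below (paraphrased here; the notation is illustrative and not taken from the code) is that a feature l observed in image k is assumed to move at an approximately constant velocity on the image plane over a short window, so an unknown camera-IMU time offset simply shifts its observation and can be estimated jointly with the other states:

```latex
% Paraphrased from the temporal-calibration paper; illustrative notation only.
\[
  z_l^k(t_d) \;\approx\; z_l^k + t_d \, V_l^k
\]
% z_l^k : observed image coordinates of feature l in frame k
% V_l^k : estimated velocity of the feature on the image plane
% t_d   : camera-IMU time offset, optimized jointly with poses, velocities and biases
```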
We are the top open-source stereo algorithm on the KITTI Odometry Benchmark (as of 12 Jan 2019).
Authors: Tong Qin, Shaozu Cao, Jie Pan, Peiliang Li, and Shaojie Shen from the Aerial Robotics Group, HKUST
Videos:
Related Papers: (the papers are not exactly the same as the code)
- Online Temporal Calibration for Monocular Visual-Inertial Systems, Tong Qin, Shaojie Shen, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018), best student paper award. pdf
- VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator, Tong Qin, Peiliang Li, Shaojie Shen, IEEE Transactions on Robotics. pdf
If you use VINS-Fusion for your academic research, please cite our related papers. bib
Ubuntu 64-bit 16.04 or 18.04. ROS Kinetic or Melodic. ROS Installation
Follow Ceres Installation.
Clone the repository and catkin_make:
cd ~/catkin_ws/src
git clone https://github.com/HKUST-Aerial-Robotics/VINS-Fusion.git
cd ../
catkin_make
source ~/catkin_ws/devel/setup.bash
(if this step fails, try another computer with a clean system, or reinstall Ubuntu and ROS)
Download the EuRoC MAV Dataset to YOUR_DATASET_FOLDER. Take MH_01 for example; you can run VINS-Fusion with three sensor types (monocular camera + IMU, stereo cameras + IMU, and stereo cameras only). Open four terminals and run the vins odometry, visual loop closure (optional), rviz, and the bag file playback respectively. The green path is the VIO odometry; the red path is the odometry under visual loop closure.
Monocular camera + IMU:
roslaunch vins vins_rviz.launch
rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_mono_imu_config.yaml
(optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_mono_imu_config.yaml
rosbag play YOUR_DATASET_FOLDER/MH_01_easy.bag
Stereo cameras + IMU:
roslaunch vins vins_rviz.launch
rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_imu_config.yaml
(optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_imu_config.yaml
rosbag play YOUR_DATASET_FOLDER/MH_01_easy.bag
Stereo cameras only:
roslaunch vins vins_rviz.launch
rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_config.yaml
(optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_config.yaml
rosbag play YOUR_DATASET_FOLDER/MH_01_easy.bag
Download the KITTI Odometry dataset to YOUR_DATASET_FOLDER. Take sequence 00 for example. Open two terminals and run vins and rviz respectively. (We evaluated odometry on the KITTI benchmark without the loop closure function.)
roslaunch vins vins_rviz.launch
(optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml
rosrun vins kitti_odom_test ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml YOUR_DATASET_FOLDER/sequences/00/
Download the KITTI raw dataset to YOUR_DATASET_FOLDER. Take 2011_10_03_drive_0027_sync for example. Open three terminals and run vins, global fusion, and rviz respectively. The green path is the VIO odometry; the blue path is the odometry under GPS global fusion.
roslaunch vins vins_rviz.launch
rosrun vins kitti_gps_test ~/catkin_ws/src/VINS-Fusion/config/kitti_raw/kitti_10_03_config.yaml YOUR_DATASET_FOLDER/2011_10_03_drive_0027_sync/
rosrun global_fusion global_fusion_node
Download the car bag to YOUR_DATASET_FOLDER. Open four terminals and run the vins odometry, visual loop closure (optional), rviz, and the bag file playback respectively. The green path is the VIO odometry; the red path is the odometry under visual loop closure.
roslaunch vins vins_rviz.launch
rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/vi_car/vi_car.yaml
(optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/vi_car/vi_car.yaml
rosbag play YOUR_DATASET_FOLDER/car.bag
VIO is not only a software algorithm; it also heavily depends on hardware quality. For beginners, we recommend running VIO with professional equipment that has global-shutter cameras and hardware synchronization.
Write a config file for your device. You can take the EuRoC and KITTI config files as examples.
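As a hint when writing your own config: the settings are read with OpenCV's cv::FileStorage, so keep the %YAML:1.0 header used by the example configs and reuse their key names. Below is a minimal, illustrative checker (not part of this repository); the keys imu_topic and image0_topic are taken from the EuRoC example config, so verify them against the config you copy.

```cpp
// Minimal, illustrative config checker -- not part of VINS-Fusion.
// It only demonstrates reading a VINS-style YAML config with cv::FileStorage;
// the key names below follow the EuRoC example config.
#include <opencv2/core.hpp>
#include <iostream>
#include <string>

int main(int argc, char** argv) {
  if (argc < 2) {
    std::cerr << "usage: check_config <config.yaml>" << std::endl;
    return 1;
  }
  cv::FileStorage fs(argv[1], cv::FileStorage::READ);
  if (!fs.isOpened()) {
    std::cerr << "failed to open " << argv[1] << std::endl;
    return 1;
  }
  std::string imu_topic, image0_topic;
  fs["imu_topic"] >> imu_topic;        // IMU measurement topic
  fs["image0_topic"] >> image0_topic;  // left (or only) camera image topic
  std::cout << "imu_topic:    " << imu_topic << "\n"
            << "image0_topic: " << image0_topic << std::endl;
  fs.release();
  return 0;
}
```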
VINS-Fusion supports several camera models (pinhole, mei, equidistant). You can use the camera_models package to calibrate your cameras. We put some example data under /camera_models/calibrationdata to show how to calibrate.
cd ~/catkin_ws/src/VINS-Fusion/camera_models/camera_calib_example/
rosrun camera_models Calibrations -w 12 -h 8 -s 80 -i calibrationdata --camera-model pinhole
To further facilitate the building process, we add Docker support to our code. A Docker environment is like a sandbox, which makes our code environment-independent. To run with Docker, first make sure ROS and Docker are installed on your machine. Then add your account to the docker group with sudo usermod -aG docker $YOUR_USER_NAME. Relaunch the terminal, or log out and log back in if you get a Permission denied error. Then type:
cd ~/catkin_ws/src/VINS-Fusion/docker
make build
Note that the Docker build may take a while, depending on your network and machine. After VINS-Fusion is successfully built, you can run the vins estimator with the run.sh script.
The run.sh script takes several flags and arguments: -k means KITTI, -l enables loop fusion, and -g enables global fusion. You can get the usage details with ./run.sh -h. Here are some examples with this script:
# EuRoC monocular camera + IMU
./run.sh ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_mono_imu_config.yaml
# EuRoC stereo cameras + IMU with loop fusion
./run.sh -l ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_imu_config.yaml
# KITTI Odometry (Stereo)
./run.sh -k ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml YOUR_DATASET_FOLDER/sequences/00/
# KITTI Odometry (Stereo) with loop fusion
./run.sh -kl ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml YOUR_DATASET_FOLDER/sequences/00/
# KITTI GPS Fusion (Stereo + GPS)
./run.sh -kg ~/catkin_ws/src/VINS-Fusion/config/kitti_raw/kitti_10_03_config.yaml YOUR_DATASET_FOLDER/2011_10_03_drive_0027_sync/
In the EuRoC cases, you need to open another terminal and play your bag file. If you modify the code, simply re-run ./run.sh with the proper arguments after your changes.
We use Ceres Solver for non-linear optimization, DBoW2 for loop detection, a generic camera model, and GeographicLib.
The source code is released under GPLv3 license.
We are still working on improving the code reliability. For any technical issues, please contact Tong Qin <qintonguavATgmail.com>.
For commercial inquiries, please contact Shaojie Shen <eeshaojieATust.hk>.