
Voxfield Panmap

Multi-resolution panoptic volumetric mapping based on panoptic_mapping, using Voxfield as its mapping backbone. Please check our Voxfield paper for more details.

Paper and Video

1-min demo video | 7-min presentation | paper

Installation

The installation is the same as for the original panmap. Please check the instructions from here.

Build the repository by:

catkin build panoptic_mapping_utils
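
For reference, a typical setup might look as follows (a minimal sketch, assuming a catkin workspace at ~/catkin_ws and ROS dependencies installed according to the linked instructions; the repository URL is a placeholder):

cd ~/catkin_ws/src
git clone <this-repository-url> voxfield-panmap   # placeholder: URL of this repository
cd ~/catkin_ws
catkin build panoptic_mapping_utils               # builds the mapper packages and their dependencies
source devel/setup.bash                           # make the built packages visible to roslaunch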

Example Usage

Configure the parameters in the .yaml files in ./panoptic_mapping_ros/config/mapper/.

Then launch the mapper by running:

roslaunch panoptic_mapping_ros run_[xxx_dataset].launch
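
For example, running the mapper on the Semantic KITTI data might look like this (a sketch: run_kitti.launch is an assumed file name, so check the launch files shipped with panoptic_mapping_ros for the exact names available):

roslaunch panoptic_mapping_ros run_kitti.launch   # assumed launch file name for the (Semantic) KITTI example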

Our Voxfield Panmap supports both RGB-D and LiDAR input and provides example experiments on datasets including ASL Flat, Cow and Lady, (Semantic) KITTI, Semantic USL, MaiCity, Newer College, etc.

Here, we provide links to example rosbags for the Semantic KITTI (2.9 GB) and Semantic USL (0.9 GB) datasets for testing.
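
With the mapper running, the recorded data can be fed in from a second terminal by playing back the downloaded bag, roughly as follows (the bag file name is a placeholder for whatever the download provides; --clock is only needed if the launch file enables simulated time):

rosbag play semantic_kitti_demo.bag --clock   # placeholder bag name; --clock publishes /clock for use_sim_time setups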

Demo on outdoor LiDAR datasets

Citation

If you find this code useful for your work or use it in your project, please consider citing the following papers:

@inproceedings{pan2022iros,
  title={Voxfield: Non-Projective Signed Distance Fields for Online Planning and 3D Reconstruction},
  author={Pan, Yue and Kompis, Yves and Bartolomei, Luca and Mascaro, Ruben and Stachniss, Cyrill and Chli, Margarita},
  booktitle={Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS)},
  year={2022}
}

@inproceedings{schmid2022icra,
  title={Panoptic Multi-TSDFs: a Flexible Representation for Online Multi-resolution Volumetric Mapping and Long-term Dynamic Scene Consistency},
  author={Schmid, Lukas and Delmerico, Jeffrey and Sch{\"o}nberger, Johannes and Nieto, Juan and Pollefeys, Marc and Siegwart, Roland and Cadena, Cesar},
  booktitle={Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
  year={2022}
}

Acknowledgments

We greatly thank the authors of the following open-source projects: