Lidar-SCU

Primary language: C++ | License: GNU General Public License v3.0 (GPL-3.0)

Point Cloud Registration and Change Detection in Urban Environment Using an Onboard Lidar Sensor and MLS Reference Data

Supplementary material to our paper in the International Journal of Applied Earth Observation and Geoinformation, 2022

Left: MLS point cloud obtained in Kálvin Square, Budapest, Hungary.
Right: Point cloud frame captured by a Velodyne HDL-64E sensor in Kálvin Square, Budapest, Hungary.

Dataset

The full dataset used in the paper can be downloaded from this link.
The description of the dataset and its usage is available here.

Citation

If you find this work helpful for your research, or use some part of the code, please cite our paper:

@article{lidar-scu,
	title = {Point cloud registration and change detection in urban environment using an onboard Lidar sensor and MLS reference data},
	journal = {International Journal of Applied Earth Observation and Geoinformation},
	volume = {110},
	pages = {102767},
	year = {2022},
	issn = {1569-8432},
	doi = {https://doi.org/10.1016/j.jag.2022.102767},
	url = {https://www.sciencedirect.com/science/article/pii/S0303243422000939},
	author = {Örkény Zováthi and Balázs Nagy and Csaba Benedek},
}

Installation guide

The code was written in C++ and tested on a desktop computer with Ubuntu 18.04. Please carefully follow our installation guide.

Dependencies:

PCL-1.8.1
Eigen3
OpenCV
CMake

Install PCL and Eigen3:

$ sudo apt install libeigen3-dev libpcl-dev

Build OpenCV from source:

Please follow the instructions in this tutorial.

Setting up the project:

  1. Clone this repository
  2. Build the project:
$ mkdir build && cd build
$ cmake ..
$ make -j10
  3. Run the project on sample demo data:
$ ./Lidar-SCU

Required data-structure:

|--project_root:
   |--Data:
      |--Samples:
         |--Velo1.pcd
         |--MLS1.pcd
         |--Velo2.pcd
         |--MLS2.pcd
         |--Velo3.pcd
         |--MLS3.pcd
         |--Velo4.pcd
         |--MLS4.pcd
         |--Velo5.pcd
         |--MLS5.pcd
      |--Output:
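
For reference, the sample pairs above (e.g. Velo1.pcd and MLS1.pcd) can be read with standard PCL I/O calls. The following is only a minimal sketch assuming plain XYZ points, not the loader used in the repository:

#include <pcl/io/pcd_io.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <iostream>

int main()
{
  // Load one onboard (Velodyne) frame and the corresponding MLS reference cloud.
  pcl::PointCloud<pcl::PointXYZ>::Ptr velo(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr mls(new pcl::PointCloud<pcl::PointXYZ>);

  if (pcl::io::loadPCDFile<pcl::PointXYZ>("Data/Samples/Velo1.pcd", *velo) < 0 ||
      pcl::io::loadPCDFile<pcl::PointXYZ>("Data/Samples/MLS1.pcd", *mls) < 0)
  {
    std::cerr << "Could not read the sample PCD files." << std::endl;
    return 1;
  }

  std::cout << "Velodyne frame: " << velo->size() << " points, "
            << "MLS reference: " << mls->size() << " points." << std::endl;
  return 0;
}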

Simple usage:

Run the algorithms on your own pair of an onboard Lidar frame and MLS reference data:

$ ./Lidar-SCU path_to_rmb_lidar_frame.pcd path_to_mls_data.pcd

By default, the execution follows the steps below. At each step, you should see outputs similar to those listed here. Please close each figure to proceed to the next step.

1. Landmark object extraction from dense MLS point clouds

Expected output:

Color code: red = pillar-like columns, orange = other static objects
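
The extraction itself is implemented in the repository; as a rough, simplified illustration of the idea only (ground removal followed by Euclidean clustering, with hypothetical thresholds, not the paper's exact algorithm), object candidates could be obtained as follows:

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/passthrough.h>
#include <pcl/segmentation/extract_clusters.h>
#include <pcl/search/kdtree.h>
#include <vector>

// Cluster an MLS cloud into landmark object candidates (simplified sketch;
// thresholds are hypothetical, not the values used in the repository).
std::vector<pcl::PointCloud<pcl::PointXYZ>::Ptr> extractObjectCandidates(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& mls_cloud)
{
  // Crude ground removal: keep points above an assumed ground level (z in metres).
  pcl::PointCloud<pcl::PointXYZ>::Ptr above_ground(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PassThrough<pcl::PointXYZ> pass;
  pass.setInputCloud(mls_cloud);
  pass.setFilterFieldName("z");
  pass.setFilterLimits(0.3f, 10.0f);
  pass.filter(*above_ground);

  // Euclidean clustering groups the remaining points into object candidates.
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
  tree->setInputCloud(above_ground);

  std::vector<pcl::PointIndices> cluster_indices;
  pcl::EuclideanClusterExtraction<pcl::PointXYZ> ec;
  ec.setClusterTolerance(0.5);   // metres
  ec.setMinClusterSize(100);
  ec.setSearchMethod(tree);
  ec.setInputCloud(above_ground);
  ec.extract(cluster_indices);

  // Copy each cluster into its own cloud.
  std::vector<pcl::PointCloud<pcl::PointXYZ>::Ptr> objects;
  for (const pcl::PointIndices& indices : cluster_indices)
  {
    pcl::PointCloud<pcl::PointXYZ>::Ptr object(new pcl::PointCloud<pcl::PointXYZ>);
    for (int idx : indices.indices)
      object->push_back(above_ground->points[idx]);
    objects.push_back(object);
  }
  return objects;
}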

2. Object detection in sparse Lidar scans

Expected output:

Color code: red = pillar-like object candidates
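
In the sparse onboard scans, pillar-like candidates can be illustrated by a simple geometric test on each detected cluster. The thresholds below are hypothetical and only sketch the idea of keeping tall, thin objects; they are not the criteria used in the paper:

#include <pcl/common/common.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

// Heuristic test for pillar-like clusters: a tall, thin axis-aligned bounding box
// (hypothetical thresholds, in metres).
bool isPillarLike(const pcl::PointCloud<pcl::PointXYZ>& cluster)
{
  Eigen::Vector4f min_pt, max_pt;
  pcl::getMinMax3D(cluster, min_pt, max_pt);

  const float height = max_pt.z() - min_pt.z();
  const float width  = max_pt.x() - min_pt.x();
  const float depth  = max_pt.y() - min_pt.y();

  // A pillar candidate should be much taller than it is wide.
  return height > 1.5f && width < 0.8f && depth < 0.8f;
}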

3. Bounding-box-based coarse registration

Expected output:

Color code: red = RMB Lidar, blue = MLS map
Left: before alignment
Right: after alignment
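
Conceptually, the coarse step estimates a rigid transform from matched object positions in the two clouds. A minimal sketch, assuming index-wise correspondences between the object centres have already been established, could rely on PCL's SVD-based transform estimation:

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/transformation_estimation_svd.h>

// Estimate a rigid transform from matched object centres (sketch only; the
// correspondence search between onboard and MLS objects is assumed done, and
// point i of the first cloud matches point i of the second).
Eigen::Matrix4f coarseRegistration(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& lidar_object_centres,
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& mls_object_centres)
{
  pcl::registration::TransformationEstimationSVD<pcl::PointXYZ, pcl::PointXYZ> est;
  Eigen::Matrix4f transform = Eigen::Matrix4f::Identity();
  est.estimateRigidTransformation(*lidar_object_centres, *mls_object_centres, transform);
  return transform;
}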

4. Standard ICP alignment

Expected output:

Color code: red = RMB Lidar, blue = MLS map
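
This refinement uses standard ICP, which in PCL typically looks like the sketch below; the parameter values are illustrative, not those used in the repository:

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/icp.h>

// Refine the coarse alignment with standard ICP (illustrative parameters).
Eigen::Matrix4f refineWithICP(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& coarse_aligned_lidar,
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& mls_map)
{
  pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
  icp.setInputSource(coarse_aligned_lidar);
  icp.setInputTarget(mls_map);
  icp.setMaxCorrespondenceDistance(1.0);  // metres
  icp.setMaximumIterations(50);

  pcl::PointCloud<pcl::PointXYZ> aligned;
  icp.align(aligned);
  return icp.getFinalTransformation();
}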

5. Range-image generation and MRF-segmentation

Expected output:
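
The MRF segmentation is part of the repository code; the preceding range-image generation can be sketched with PCL's spherical range image. The angular resolution and field-of-view values below are only example figures, roughly in the range of a Velodyne HDL-64E:

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/common/angles.h>
#include <pcl/range_image/range_image_spherical.h>

// Project a registered Lidar frame into a spherical range image (sketch).
pcl::RangeImageSpherical::Ptr makeRangeImage(
    const pcl::PointCloud<pcl::PointXYZ>& lidar_frame)
{
  pcl::RangeImageSpherical::Ptr range_image(new pcl::RangeImageSpherical);
  const float angular_resolution = pcl::deg2rad(0.4f);  // example value
  range_image->createFromPointCloud(
      lidar_frame,
      angular_resolution,
      pcl::deg2rad(360.0f),          // full horizontal field of view
      pcl::deg2rad(30.0f),           // approximate vertical field of view
      Eigen::Affine3f::Identity(),   // sensor pose
      pcl::RangeImage::LASER_FRAME,
      0.0f,                          // noise level
      0.0f,                          // minimum range
      0);                            // border size
  return range_image;
}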

6. Change detection in 3D

Expected output:

Color code: blue = static, red = dynamic change, green = vegetation change
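
As a simplified illustration of the 3D change detection idea (not the paper's full classification into dynamic and vegetation changes), PCL's octree-based change detector can flag onboard points that fall into voxels unoccupied in the MLS map; the voxel size below is illustrative:

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/octree/octree_pointcloud_changedetector.h>
#include <vector>

// Return indices of onboard Lidar points falling into voxels that are empty in
// the MLS map, i.e. candidate changes (illustrative voxel size).
std::vector<int> detectChanges(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& mls_map,
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& registered_lidar_frame)
{
  const float voxel_size = 0.3f;  // metres
  pcl::octree::OctreePointCloudChangeDetector<pcl::PointXYZ> octree(voxel_size);

  // Reference buffer: the static MLS map.
  octree.setInputCloud(mls_map);
  octree.addPointsFromInputCloud();
  octree.switchBuffers();

  // Current buffer: the registered onboard frame.
  octree.setInputCloud(registered_lidar_frame);
  octree.addPointsFromInputCloud();

  std::vector<int> changed_point_indices;
  octree.getPointIndicesFromNewVoxels(changed_point_indices);
  return changed_point_indices;
}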

For more details, please refer to our paper.

Authorship declaration

The code in this repository was implemented at the Machine Perception Research Laboratory, Institute for Computer Science and Control (SZTAKI), Budapest.