Oxford Spires Dataset

This repository contains scripts used to evaluate localisation, 3D reconstruction, and radiance field methods on the Oxford Spires Dataset.

This is a pre-release of the software; the codebase will be refactored in the near future. Please feel free to ask questions about the dataset and report bugs in the GitHub Issues.

Localisation Benchmark

The localisation benchmark runs LiDAR SLAM methods (Fast-LIO-SLAM, SC-LIO-SAM, ImMesh) and a LiDAR bundle adjustment method (HBA). The resulting trajectories are evaluated against the ground-truth trajectory using evo.

Build the docker container and run the methods:

cd oxford_spires_dataset
docker compose -f .docker_loc/docker-compose.yml run --build spires

# in the docker
python scripts/localisation_benchmark/colmap.py
python scripts/localisation_benchmark/fast_lio_slam.py
python scripts/localisation_benchmark/immesh.py
python scripts/localisation_benchmark/vilens_hba.py
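The core number evo reports for a trajectory is the absolute trajectory error (ATE). A minimal sketch of that computation, assuming the two trajectories are already aligned and time-synchronised (the positions below are illustrative, not from the dataset):

```python
import numpy as np

def ate_rmse(estimated: np.ndarray, ground_truth: np.ndarray) -> float:
    """Root-mean-square of per-pose translation errors (metres)."""
    errors = np.linalg.norm(estimated - ground_truth, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

# Toy 3-pose trajectories (N x 3 positions).
est = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [2.0, 0.1, 0.0]])
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")
```

In practice evo also handles trajectory alignment (e.g. Umeyama) and timestamp association; this sketch only shows the error metric itself.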

Reconstruction Benchmark

The reconstruction benchmark runs Structure-from-Motion (COLMAP), Multi-View Stereo (OpenMVS), and radiance field methods (Nerfstudio's Nerfacto and Splatfacto), and generates 3D point-cloud reconstructions, which are evaluated against the TLS-captured ground-truth 3D point cloud.

Build the docker container and run the methods:

cd oxford_spires_dataset
docker compose -f .docker/docker-compose.yml run --build spires

# inside the docker
python scripts/reconstruction_benchmark/main.py --config-file config/recon_benchmark.yaml
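Point-cloud evaluation of this kind is commonly expressed as accuracy (reconstruction to ground truth) and completeness (ground truth to reconstruction) via nearest-neighbour distances. A hedged sketch of that idea with toy points; the benchmark's actual metrics and implementation may differ:

```python
import numpy as np

def nn_distances(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    """For each source point, distance to its nearest target point (brute force)."""
    diffs = source[:, None, :] - target[None, :, :]
    return np.linalg.norm(diffs, axis=2).min(axis=1)

recon = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])          # reconstructed cloud
gt = np.array([[0.0, 0.0, 0.1], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])  # TLS ground truth

accuracy = nn_distances(recon, gt).mean()      # how close recon points are to GT
completeness = nn_distances(gt, recon).mean()  # how much of GT the recon covers
print(f"accuracy: {accuracy:.3f} m, completeness: {completeness:.3f} m")
```

Real clouds have millions of points, so a KD-tree (e.g. `scipy.spatial.cKDTree`) replaces the brute-force distance matrix.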

Novel-view Synthesis Benchmark

Currently, the NVS benchmark is included in the reconstruction benchmark script, since it builds upon the output of COLMAP.
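A standard metric for novel-view synthesis is PSNR between a rendered view and the held-out reference image. A minimal sketch, with illustrative names not taken from the benchmark code:

```python
import numpy as np

def psnr(rendered: np.ndarray, reference: np.ndarray, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB for images scaled to [0, max_val]."""
    mse = np.mean((rendered - reference) ** 2)
    return float(10.0 * np.log10(max_val ** 2 / mse))

reference = np.zeros((4, 4, 3))           # toy ground-truth image
rendered = np.full((4, 4, 3), 0.1)        # toy render, uniform 0.1 error
print(f"PSNR: {psnr(rendered, reference):.1f} dB")
```

NVS benchmarks typically report SSIM and LPIPS alongside PSNR; those require dedicated libraries and are omitted here.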

Contributing

Please refer to Angular's contributing guide (https://github.com/angular/angular/blob/22b96b96902e1a42ee8c5e807720424abad3082a/CONTRIBUTING.md).

Formatting

We use Ruff as the formatter and linter for Python, and clang-format for C++. Installing the pre-commit hooks will format and lint your code before each commit:

$ pip install pre-commit
$ pre-commit install
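The hooks are declared in a .pre-commit-config.yaml at the repository root. A hypothetical sketch of what such a config looks like; check the repository's own file for the exact hooks and pinned versions:

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.0            # illustrative version pin
    hooks:
      - id: ruff           # lint Python
      - id: ruff-format    # format Python
  - repo: https://github.com/pre-commit/mirrors-clang-format
    rev: v18.1.3           # illustrative version pin
    hooks:
      - id: clang-format   # format C++
```

Once installed, the hooks run automatically on `git commit`; `pre-commit run --all-files` applies them to the whole repository.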