
HPointLoc: open dataset and framework for indoor visual localization based on synthetic RGB-D images

License: MIT

This repository provides PNTR, a novel framework for exploring the capabilities of the new indoor dataset HPointLoc, which is specially designed for studying place detection and loop closure in Simultaneous Localization and Mapping (SLAM).

HPointLoc is built with the popular Habitat simulator using 49 photorealistic indoor scenes from the Matterport3D dataset and contains 76,000 frames.
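For illustration, the following minimal sketch shows how RGB-D frames can be rendered with the habitat-sim Python API from a Matterport3D scene; the scene path is a placeholder and exact attribute names vary slightly between habitat-sim versions:

import habitat_sim

# Simulator configuration pointing at a Matterport3D scene (placeholder path)
sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "/path/to/matterport3d/scene.glb"

# RGB and depth camera sensors
rgb_spec = habitat_sim.CameraSensorSpec()
rgb_spec.uuid = "color_sensor"
rgb_spec.sensor_type = habitat_sim.SensorType.COLOR
rgb_spec.resolution = [480, 640]

depth_spec = habitat_sim.CameraSensorSpec()
depth_spec.uuid = "depth_sensor"
depth_spec.sensor_type = habitat_sim.SensorType.DEPTH
depth_spec.resolution = [480, 640]

agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_spec, depth_spec]

sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Render one observation at the agent's current pose
obs = sim.get_sensor_observations()
rgb = obs["color_sensor"]    # H x W x 4, RGBA, uint8
depth = obs["depth_sensor"]  # H x W, float32, metres

sim.close()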

When forming the dataset, considerable attention was paid to providing instance segmentation of scene objects, which allows the dataset to be used with emerging semantic methods for place recognition and localization.
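As a toy illustration of how such per-pixel instance labels could be used, the hypothetical sketch below builds a simple semantic descriptor from an instance mask; the mask file name and the descriptor itself are assumptions for illustration, not part of the dataset tooling:

import numpy as np
from PIL import Image

# Hypothetical path to an instance segmentation mask stored as a PNG of instance IDs
mask = np.array(Image.open("/path/to/dataset/scene/frame_000001_instances.png"))

# Count how many pixels belong to each object instance
ids, counts = np.unique(mask, return_counts=True)

# A toy semantic descriptor: normalized histogram of instance labels in the frame,
# which could be compared between frames for semantic place recognition
descriptor = counts / counts.sum()
print(dict(zip(ids.tolist(), descriptor.round(3).tolist())))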

The dataset is split into two parts: the validation part HPointLoc-Val, which contains only one scene, and the complete HPointLoc-All dataset, which contains all 49 scenes, including HPointLoc-Val.

Experimental results

The experiments were conducted on the HPointLoc-Val and HPointLoc-All datasets.

Quick start to evaluate the PNTR pipeline

# Clone the repository with its submodules and enter it
git clone --recurse-submodules https://github.com/cds-mipt/HPointLoc
cd HPointLoc

# Create the conda environment from environment.yml
conda env create -f environment.yml

# Download the dataset into a directory of your choice
cd /path/to/dataset
bash download_HPointloc.sh

# Run the PNTR evaluation pipeline from the repository root
cd /path/to/HPointLoc
python pipelines/pipeline_evaluate.py --dataset_root /path/to/dataset --image-retrieval patchnetvlad --keypoints-matching superpoint_superglue --optimizer-cloud teaser -f --topk 1
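In this configuration, image retrieval is performed with Patch-NetVLAD, keypoint matching with SuperPoint + SuperGlue, and point cloud registration with TEASER++, which together form the PNTR pipeline. The --topk 1 option keeps only the best retrieved place candidate, and --dataset_root should point to the directory where download_HPointloc.sh placed the data.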