RobustPathFollow

A PyTorch implementation, by Nuri Kim, Obin Kwon, and Hwiyeon Yoo, of "Visual Memory for Robust Path Following" (NeurIPS 2018).

Results

  • Our implementation:
    • Success rate: 86.0%
    • SPL: 77.3%
  • In paper:
    • Success rate: 86.6%
    • SPL: 72.6%

We attribute the performance difference to the dataset, which we collected on our own. The RPF (Robust Path Following) model is trained on over 40,000 episodes and tested on over 1,000 episodes of the habitat-sim path-following dataset.
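
For reference, SPL is Success weighted by (normalized inverse) Path Length, as defined in Anderson et al., "On Evaluation of Embodied Navigation Agents" (2018). The minimal sketch below shows how it is computed; the function and variable names are illustrative and not taken from this repository.

import numpy as np

def spl(successes, shortest_lengths, taken_lengths):
    """SPL: Success weighted by (normalized inverse) Path Length.

    successes        : 1.0 if the episode reached the goal, else 0.0
    shortest_lengths : geodesic shortest-path length for each episode
    taken_lengths    : length of the path the agent actually took
    """
    s = np.asarray(successes, dtype=float)
    l = np.asarray(shortest_lengths, dtype=float)
    p = np.asarray(taken_lengths, dtype=float)
    # Each episode contributes s_i * l_i / max(p_i, l_i); SPL is the mean.
    return float(np.mean(s * l / np.maximum(p, l)))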

Prerequisites

  • A basic PyTorch installation. We used PyTorch 1.3.1.
  • A tensorboardX installation.
  • (For testing) The Habitat simulator (Habitat-Sim).

Installation

  1. Clone the repository
git clone https://github.com/bareblackfoot/RobustPathFollow.git

Setup data

  1. Download the habitat-sim pathfollow test dataset

    • Google Drive here
    • Email me to get the training dataset
  2. Create a folder and a soft link to use the dataset

mkdir data
cd data
ln -s path/to/downloaded/dataset/pathfollow .
cd ..
  3. Download the Habitat-Sim scene dataset

    • The full Matterport3D (MP3D) dataset for use with Habitat can be downloaded using the official Matterport3D download script:

python download_mp.py --task habitat -o path/to/download/

    • You only need the habitat zip archive, not the entire Matterport3D dataset. Note that the download script requires Python 2.7 to run.
  4. Create a folder and a soft link to use the scene dataset

cd data
ln -s path/to/downloaded/dataset/scene_datasets .
cd ..
  5. Make sure your directory structure looks like the tree below (a quick sanity-check sketch follows it):
  • RobustPathFollow
    • data
      • pathfollow
        • train
          • demo
          • follower
        • test
          • demo
          • follower
        • valid
          • demo
          • follower
      • scene_datasets
        • mp3d
    • outputs
      • rpf_nuri
        • best.pth
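
As a quick sanity check before running anything, a short script like the following can confirm the layout. This is a generic sketch that uses only the paths listed above:

import os

# Directories expected by the layout above.
EXPECTED = [
    "data/pathfollow/train/demo",
    "data/pathfollow/train/follower",
    "data/pathfollow/test/demo",
    "data/pathfollow/test/follower",
    "data/pathfollow/valid/demo",
    "data/pathfollow/valid/follower",
    "data/scene_datasets/mp3d",
    "outputs/rpf_nuri",
]

for path in EXPECTED:
    print(("ok     " if os.path.isdir(path) else "MISSING"), path)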

Test with pre-trained models

  1. Download the pre-trained model
  • Google Drive here.
  2. Place the model inside outputs/rpf_nuri
mv path/to/best.pth ./outputs/rpf_nuri
  3. Test with the pre-trained RPF model (a quick checkpoint check is sketched after these steps)
GPU_ID=0
CUDA_VISIBLE_DEVICES=${GPU_ID} python eval.py
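
If you just want to confirm that the checkpoint downloaded correctly, a standard PyTorch load is enough. This is a generic sketch, not the repository's loading code, and it assumes best.pth is an ordinary torch checkpoint:

import torch

# Load on the CPU so no GPU is needed just to inspect the file.
ckpt = torch.load("outputs/rpf_nuri/best.pth", map_location="cpu")

# Checkpoints are typically a state dict, or a dict wrapping one;
# the top-level keys show what was saved.
if isinstance(ckpt, dict):
    print(list(ckpt.keys())[:10])
else:
    print(type(ckpt))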

Train your own model

  1. You can train your own model with the habitat-sim training dataset (email me to get it!):
GPU_ID=0
CUDA_VISIBLE_DEVICES=${GPU_ID} python train.py

By default, trained networks are saved under:

outputs/default/

Test outputs are saved under:

outputs/default/

TensorBoard information for training and validation is saved under:

experiments/tb_logs/default/
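
Assuming TensorBoard is installed, you can monitor these logs with:

tensorboard --logdir experiments/tb_logs/default/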