cou_sfm

This is the source code for continual, online, unsupervised Structure from Motion-based depth estimation.

Towards Continual, Online, Unsupervised Depth

Introduction

This is the source code for the paper Towards Continual, Online, Unsupervised Depth. This repository contains the Structure from Motion (SfM)-based depth estimation code.

The manuscript is available here.

The stereo-based depth estimation code is also available here.
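
For orientation, this line of work trains a depth network (together with a pose network) without ground-truth depth, using a view-synthesis photometric loss between a target frame and a warped source frame. The snippet below is a minimal sketch of that idea under standard assumptions, not the repository's actual code; the paper's exact loss (its SSIM weighting, masking, and smoothness terms) may differ.

    # Minimal sketch of the self-supervised SfM signal (illustrative, not the repo's code).
    import torch
    import torch.nn.functional as F

    def backproject(depth, inv_K):
        """Lift pixels to 3D camera points. depth: (B, 1, H, W), inv_K: (B, 3, 3)."""
        b, _, h, w = depth.shape
        ys, xs = torch.meshgrid(
            torch.arange(h, dtype=depth.dtype, device=depth.device),
            torch.arange(w, dtype=depth.dtype, device=depth.device),
            indexing="ij",
        )
        pix = torch.stack([xs, ys, torch.ones_like(xs)], dim=0).view(1, 3, -1)  # (1, 3, H*W)
        return (inv_K @ pix) * depth.view(b, 1, -1)                              # (B, 3, H*W)

    def project(cam, K, T):
        """Apply relative pose T (B, 4, 4) and intrinsics K (B, 3, 3); return pixel coords."""
        b, _, n = cam.shape
        cam_h = torch.cat([cam, torch.ones(b, 1, n, dtype=cam.dtype, device=cam.device)], dim=1)
        cam_t = (T @ cam_h)[:, :3]
        pix = K @ cam_t
        return pix[:, :2] / (pix[:, 2:3] + 1e-7)                                 # (B, 2, H*W)

    def photometric_loss(target, source, depth, T, K, inv_K):
        """Warp `source` into the target view with predicted depth/pose and compare."""
        b, _, h, w = target.shape
        pix = project(backproject(depth, inv_K), K, T).view(b, 2, h, w)
        grid_x = 2.0 * pix[:, 0] / (w - 1) - 1.0       # normalize to [-1, 1] for grid_sample
        grid_y = 2.0 * pix[:, 1] / (h - 1) - 1.0
        grid = torch.stack([grid_x, grid_y], dim=-1)   # (B, H, W, 2)
        warped = F.grid_sample(source, grid, padding_mode="border", align_corners=True)
        return (warped - target).abs().mean()          # L1 here; SSIM + L1 is common in practice

In practice this loss is averaged over several source frames and is usually combined with an edge-aware depth smoothness term.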

Requirements

  • PyTorch
  • Torchvision
  • NumPy
  • Matplotlib
  • OpenCV
  • Pandas
  • Tensorboard
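
Assuming a standard Python environment, the dependencies can typically be installed with pip (the OpenCV package is published on PyPI as opencv-python):

pip install torch torchvision numpy matplotlib opencv-python pandas tensorboard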

KITTI-NYU Experiments

Data Preparation

Download the KITTI dataset, the rectified NYU dataset, the KITTI test dataset, and the NYU test dataset. Extract the data to appropriate locations. Saving to an SSD is encouraged but not required. Virtual KITTI is not needed for the KITTI-NYU experiments; similarly, NYU is not needed for the KITTI-vKITTI experiments. Note that the NYU directory names are slightly changed (an underscore is added) so that the code can categorize the datasets.

Pre-Training

Set paths in the dir_options/pretrain_options.py file. Then run

python pretrain.py

The pre-trained models are saved in the trained_models/pretrained_models/ directory.
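
The option names inside dir_options/pretrain_options.py are defined by the repository; purely as a hypothetical illustration, the kind of paths to fill in looks like this (the attribute names here are invented):

    # Hypothetical illustration only; the real attribute names in
    # dir_options/pretrain_options.py may differ.
    class PretrainOptions:
        def __init__(self):
            self.kitti_dir = "/data/kitti_raw"        # extracted KITTI data (assumed name)
            self.nyu_dir = "/data/nyu_rectified"      # rectified NYU data (assumed name)
            self.save_dir = "trained_models/pretrained_models"  # matches the output path above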

Testing

Set paths in the dir_options/test_options.py file. Then run

python script_evaluate.py

to see the results in the console. To test the online models, run

python script_test_directory.py
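
For reference, the Absolute Relative error quoted in the Results section below is the standard depth-evaluation metric; a minimal sketch is given here (the function name is illustrative, and the actual script may apply cropping and median scaling first):

    # Minimal sketch of the Absolute Relative error (illustrative).
    import numpy as np

    def abs_rel(pred, gt):
        """Mean of |pred - gt| / gt over pixels with valid ground truth."""
        valid = gt > 0
        return float(np.mean(np.abs(pred[valid] - gt[valid]) / gt[valid]))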

Online Training

Set paths in the dir_options/online_train_options.py file. Then run

python script_online_train.py

The online-trained models (trained for a single epoch only) are saved in the trained_models directory, and intermediate results are saved in the qual_dmaps directory.
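
Conceptually, online training makes a single pass (one epoch) over the incoming video stream and updates the pre-trained model as frames arrive, periodically dumping depth maps for inspection. The loop below is only an illustrative sketch of that idea; the model interface, batch keys, and file names are invented, and the repository's actual loop (including any continual-learning machinery) lives in script_online_train.py.

    # Illustrative single-epoch online adaptation loop (not the repository's code).
    import os
    import torch

    def online_train(model, stream_loader, loss_fn, lr=1e-4,
                     save_dir="trained_models", qual_dir="qual_dmaps", save_every=500):
        os.makedirs(save_dir, exist_ok=True)
        os.makedirs(qual_dir, exist_ok=True)
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        model.train()
        for step, batch in enumerate(stream_loader):   # a single pass over the stream
            loss = loss_fn(model, batch)               # e.g. a photometric SfM loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            if step % save_every == 0:
                with torch.no_grad():
                    depth = model(batch["target"])     # "target" key is an assumption
                torch.save(depth.cpu(), os.path.join(qual_dir, f"dmap_{step:06d}.pt"))
        torch.save(model.state_dict(), os.path.join(save_dir, "online_model.pth"))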

Results

See the following video for qualitative results.

The Absolute Relative metric is shown in the following table.

Training Dataset   Approach      Current Dataset   Other Dataset
KITTI              Fine-tuning   0.1895            0.3504
KITTI              Proposed      0.1543            0.1952
NYU                Fine-tuning   0.2430            0.3336
NYU                Proposed      0.1872            0.1624

See the comparison figure in the figs directory.

KITTI-vKITTI Experiments

Data Preparation

Download the KITTI dataset, the Virtual KITTI RGB images, and the KITTI test dataset. Extract the data to appropriate locations. Saving to an SSD is encouraged but not required. NYU is not required for the KITTI-vKITTI experiments.

Evaluation

Set the paths in dir_options/test_options.py. Then run

python script_test_vkitti_exp.py

Online Training

Set the paths in dir_options/online_train_options.py. Then run the following

python script_vkitti_exp.py

Pre-Training

Set the paths in dir_options/pretrain_options.py. Then run the following

python script_kitti_pretrain.py