
PAPR: Proximity Attention Point Rendering (NeurIPS 2023 Spotlight 🤩)

Yanshu Zhang*, Shichong Peng*, Alireza Moazeni, Ke Li (* denotes equal contribution)

Project Sites | Paper | Primary contact: Yanshu Zhang

Proximity Attention Point Rendering (PAPR) is a new method for joint novel view synthesis and 3D reconstruction. It simultaneously learns from scratch both an accurate point cloud representation of the scene surface and an attention-based neural network that renders the point cloud from novel views.
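
To give a rough intuition for the idea (this is only a toy sketch, not our implementation; all names below are hypothetical), proximity attention can be pictured as softmax attention over the points nearest to a query ray:

import numpy as np

def toy_proximity_attention(query_feat, point_pos, point_feat, ray_origin, ray_dir, k=8, tau=1.0):
    # Proximity: distance from each point to the query ray (ray_dir is assumed unit length).
    to_pts = point_pos - ray_origin                       # (N, 3)
    proj = to_pts @ ray_dir                               # (N,) projection onto the ray direction
    closest = ray_origin + proj[:, None] * ray_dir        # (N, 3) closest point on the ray
    dist = np.linalg.norm(point_pos - closest, axis=-1)   # (N,)

    # Attend only over the k points nearest to the ray.
    idx = np.argsort(dist)[:k]
    scores = point_feat[idx] @ query_feat / tau            # (k,) similarity scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                               # softmax over the k neighbours

    # Blend the neighbouring point features; a decoder would turn this into a pixel colour.
    return weights @ point_feat[idx]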

NeurIPS 2023 Presentation

BibTeX


@inproceedings{zhang2023papr,
    title={PAPR: Proximity Attention Point Rendering},
    author={Yanshu Zhang and Shichong Peng and Seyed Alireza Moazenipourasil and Ke Li},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023}
}

Installation

git clone git@github.com:zvict/papr.git   # or 'git clone https://github.com/zvict/papr'
cd papr
conda env create -f papr.yml
conda activate papr
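
After activating the environment, a quick sanity check (assuming the environment provides PyTorch) is:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"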

Data Preparation

Expected dataset structure under the repository root:

papr
├── data
│   ├── nerf_synthetic
│   │   ├── chair
│   │   │   ├── train
│   │   │   ├── val
│   │   │   ├── test
│   │   │   ├── transforms_train.json
│   │   │   ├── transforms_val.json
│   │   │   ├── transforms_test.json
│   │   ├── ...
│   ├── tanks_temples
│   │   ├── Barn
│   │   │   ├── pose
│   │   │   ├── rgb
│   │   │   ├── intrinsics.txt
│   │   ├── ...

NeRF Synthetic

Download the NeRF Synthetic dataset from here and put it under data/nerf_synthetic/.

Tanks & Temples

Download the Tanks & Temples dataset from here and put it under data/tanks_temples/.
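
To verify the files are in place, a small check like the one below can help (this helper is hypothetical and not part of the repository; it only mirrors the layout shown above):

import os

# Check that a NeRF Synthetic scene follows the expected layout.
scene = "data/nerf_synthetic/chair"
expected = ["train", "val", "test",
            "transforms_train.json", "transforms_val.json", "transforms_test.json"]
missing = [name for name in expected if not os.path.exists(os.path.join(scene, name))]
print("missing entries:", missing or "none")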

Overview

The codebase has two main components: the data loading code in dataset/ and the models in models/. The PAPR class in models/model.py defines our main model. All configurations are in configs/; configs/demo.yml is a demo configuration with comments explaining the important arguments.
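
Since the configs are plain YAML, they can be inspected directly before training; the snippet below assumes PyYAML is available in the environment and only prints the demo configuration (the training and test scripts handle the actual loading):

import yaml  # assuming PyYAML is available in the papr environment

with open("configs/demo.yml") as f:
    cfg = yaml.safe_load(f)
print(cfg.keys())  # top-level option groups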

Training

python train.py --opt configs/nerfsyn/chair.yml
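
To train several NeRF Synthetic scenes back to back, a small launcher like the following can be used; it assumes each scene has a config at configs/nerfsyn/<scene>.yml, which holds for chair but should be checked for the other scenes:

import subprocess

# Hypothetical launcher: train one scene after another (config names assumed).
for scene in ["chair", "lego", "ship"]:
    subprocess.run(["python", "train.py", "--opt", f"configs/nerfsyn/{scene}.yml"], check=True)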

Evaluation

python test.py --opt configs/nerfsyn/chair.yml

Pretrained Models

We provide pretrained models on the NeRF Synthetic and Tanks & Temples datasets here: Google Drive. To load a pretrained model, put it under checkpoints/ and change test.load_path in the corresponding config file.
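
For example, test.load_path can be updated programmatically; the snippet below assumes a checkpoint file name and the config nesting, so adjust both to match the downloaded files:

import yaml

with open("configs/nerfsyn/chair.yml") as f:
    cfg = yaml.safe_load(f)
cfg.setdefault("test", {})["load_path"] = "checkpoints/chair.pth"  # hypothetical file name
with open("configs/nerfsyn/chair.yml", "w") as f:
    yaml.safe_dump(cfg, f)

Editing the YAML by hand keeps the file's comments intact; the snippet is only a convenience.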

Acknowledgement

This research was enabled in part by support provided by NSERC, the BC DRI Group and the Digital Research Alliance of Canada.