Pre-NeRF 360: Enriching Unbounded Appearances for Neural Radiance Fields

This repository contains the code release for the Pre-NeRF paper.

How to run it?

We highly recommend using the Docker image in /docker. Please make sure that your Docker installation has GPU support.
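
If the image is not already available locally, it can likely be built from the Dockerfile in /docker (a sketch; the tag is assumed to match the one used in the run command below):

# Build the processing image (assumed tag)
docker build -t prenerf/prenerf:latest docker/

With the image in place, point DATA_DIR at a scene directory and launch single_scene_processing.sh inside the container: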

DATA_DIR=path/to/the/scene/directory
docker run -v $(pwd):/workspace \
           -v /path/to/360_v2_nk:/workspace/360_v2_nk \
           --gpus all --shm-size 24G --name prenerf --rm -it \
           --entrypoint bash -d prenerf/prenerf:latest \
           single_scene_processing.sh $DATA_DIR

Do it yourself?

If you want to run the scripts yourself instead of downloading the data, you can do the following:

bash rsync.sh 

Then, run our multi_scene_processing.sh:

bash multi_scene_processing.sh n5k360

Or, for a single scene:

bash single_scene_processing.sh n5k360/dish_1550705786

Run on your custom data?

Each scene should be stored as one or more videos in its own directory. For example:

n5k360/
├── dish_1550705786 (scene 1)
│   ├── camera_A.h264
│   ├── camera_B.h264
│   ├── camera_C.h264
│   └── camera_D.h264
├── dish_1550705888 (scene 2)
│   ├── camera_A.h264
│   ├── camera_B.h264
│   ├── camera_C.h264
│   └── camera_D.h264
└── dish_1550705939 (scene 3)
    ├── camera_A.h264
    ├── camera_B.h264
    ├── camera_C.h264
    └── camera_D.h264

Or, if you have the scene as images, each scene folder should contain an images folder:

n5k360/dish_1550704903/
└── images
   ├── 0001.png
   ├── 0002.png
   ├── 0003.png
   ├── 0004.png
   ├── ...
   ├── 0063.png
   └── 0064.png
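
The processing scripts consume these layouts directly. If your own footage is in a different container, one way to produce the images layout is to extract frames with ffmpeg (a sketch; the scene name, source file, and frame rate here are assumptions, not part of our pipeline):

# Hypothetical scene: extract one frame per second into the expected images/ folder
mkdir -p n5k360/my_scene/images
ffmpeg -i my_scene.mp4 -vf fps=1 n5k360/my_scene/images/%04d.png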

How fast is it?

We used GNU Parallel, which also gives better utilisation when multiprocessing on the GPU. Even so, extending the dataset still takes 1-2 hours of data preparation.
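
As a rough sketch of how the scenes can be fanned out (the exact invocation lives in multi_scene_processing.sh and may differ; the job count here is an assumption driven by GPU memory):

# Run single_scene_processing.sh over every scene directory, two jobs at a time (assumed limit)
parallel -j 2 bash single_scene_processing.sh {} ::: n5k360/*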

License & Contact

We release all Pre-NeRF data under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 license. You are free to share this data for non-commercial purposes, provided you give appropriate credit and do not distribute modified versions. If you find this dataset useful, please consider citing our paper:


@misc{almughrabi2023prenerf,
      title={Pre-NeRF 360: Enriching Unbounded Appearances for Neural Radiance Fields}, 
      author={Ahmad AlMughrabi and Umair Haroon and Ricardo Marques and Petia Radeva},
      year={2023},
      eprint={2303.12234},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

If you have any questions about the Pre-NeRF dataset or paper, please email the authors, or feel free to file a ticket.