This is the official implementation of our paper *Physics-based Indirect Illumination for Inverse Rendering*, accepted to 3DV 2024.
- Set up the Python environment:

```shell
conda create -n dip python=3.7
conda activate dip
pip install -r requirement.txt
pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 torchaudio==0.8.1 -f https://download.pytorch.org/whl/torch_stable.html
```
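Before launching training, it can save time to confirm that the pinned PyTorch build installed correctly and can see a GPU. A minimal check (this helper is illustrative, not part of the repo):

```python
def check_torch():
    """Report the installed torch version and CUDA availability.

    Returns a short status string instead of raising, so it is safe
    to run even in an environment where torch is missing.
    """
    try:
        import torch
    except ImportError:
        return "torch not installed"
    return f"torch {torch.__version__}, cuda available: {torch.cuda.is_available()}"

if __name__ == "__main__":
    print(check_torch())
```

If CUDA is not available here, the `CUDA_VISIBLE_DEVICES=0` training commands below will fail or fall back to CPU.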
- Download the MII synthetic dataset from Google Drive.
Taking the scene `hotdog` as an example, the training process is as follows.
- Pre-train the geometry:

```shell
CUDA_VISIBLE_DEVICES=0 python -m torch.distributed.launch --master_port 10000 --nproc_per_node=1 training/exp_runner.py --conf confs_sg/default.conf --data_split_dir [dataset_dir/hotdog] --expname hotdog --trainstage geometry --exp_dir [exp_dir]
```
- Jointly optimize geometry, material, and illumination:

```shell
CUDA_VISIBLE_DEVICES=0 python -m torch.distributed.launch --master_port 10000 --nproc_per_node=1 training/exp_runner.py --conf confs_sg/default.conf --data_split_dir [dataset_dir/hotdog] --expname hotdog --trainstage DIP --exp_dir [exp_dir] --if_indirect --if_silhouette --unet
```
- Relight the trained model:

```shell
CUDA_VISIBLE_DEVICES=0 python -m torch.distributed.launch --master_port 10000 --nproc_per_node=1 scripts/relight.py --conf confs_sg/default.conf --data_split_dir [dataset_dir/hotdog] --expname hotdog --timestamp latest --exp_dir [exp_dir] --trainstage DIP --if_indirect --unet
```
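As in PhySG and MII, which this code builds on, the `confs_sg` configs suggest the environment illumination is represented as a mixture of spherical Gaussians (SGs). A hedged sketch of evaluating such an environment map in a set of directions (the function name and array shapes are illustrative, not the repo's API):

```python
import numpy as np

def eval_sg_envmap(dirs, lobe_axes, sharpness, amplitudes):
    """Evaluate a spherical-Gaussian environment map.

    Each SG lobe contributes amplitude * exp(sharpness * (dot(d, axis) - 1)),
    and the radiance along direction d is the sum over all lobes.

    dirs:       (N, 3) unit query directions
    lobe_axes:  (K, 3) unit lobe axes
    sharpness:  (K,)   lobe sharpness (larger = narrower lobe)
    amplitudes: (K, 3) RGB lobe amplitudes
    returns:    (N, 3) RGB radiance
    """
    cos = dirs @ lobe_axes.T                            # (N, K) cosines
    weights = np.exp(sharpness[None, :] * (cos - 1.0))  # (N, K) lobe weights
    return weights @ amplitudes                         # (N, 3) summed radiance
```

At a direction aligned with a lobe's axis the weight is exp(0) = 1, so that lobe contributes exactly its amplitude; away from the axis the contribution decays exponentially with the sharpness.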
Acknowledgements: part of our code is inherited from IDR, PhySG, and MII. We are grateful to the authors for releasing their code.