
DepthLight is a novel method to represent estimated spatial lighting with an emissive texture mesh, producing a simple and lightweight 3D representation for photorealistic object relighting.


DepthLight

Raphaël Manus¹ ² · Marc Christie¹ · Samuel Boivin¹ · Pascal Guehl²

¹Inria, IRISA, CNRS, Univ. Rennes   ²LIX, Ecole Polytechnique, IP Paris

Paper PDF · Project Page

Installation

If needed, update depthlight.yml with the correct CUDA version for PyTorch before creating the environments.

git clone --recurse-submodules https://github.com/RaphaelManus/DepthLight
cd DepthLight

conda env create -f depthlight.yml
conda env create -f LANet.yml

conda activate LANet
pip install numpy --upgrade

conda activate depthlight
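
As a quick sanity check (a minimal sketch, assuming PyTorch was installed into the depthlight environment with CUDA support as configured in depthlight.yml), you can confirm that the GPU is visible before running the pipeline:

import torch

print(torch.__version__)          # PyTorch version installed from depthlight.yml
print(torch.cuda.is_available())  # should print True if the CUDA version matches your driver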

Usage

We encourage users to structure their data directories in the following way:

data
├── input
│   ├── img1.jpg
│   ├── img2.png
│   └── ...
├── ldr_pano
│   └── ...
├── hdr_pano
│   └── ...
└── usd
    └── ...
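
A small helper can create this layout before you add your images (a sketch only; the data root path below is an example, and the subfolder names simply mirror the layout above):

from pathlib import Path

root = Path("./data")  # example data root; adjust to your setup
for sub in ("input", "ldr_pano", "hdr_pano", "usd"):
    (root / sub).mkdir(parents=True, exist_ok=True)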

Running the script

python run.py \
  --input <path> \
  --type <ldr_lfov | ldr_pano | hdr_pano> \
  --output <path> \
  --fov <fov> \
  --prompt <"optional prompt">

Options:

  • --input or -i: Path to an image directory containing all images of interest.
  • --type or -t (optional): Input type; by default, LDR LFOV images (ldr_lfov) are expected.
  • --output or -o (optional): Output directory; point it to a different directory than the input if needed.
  • --fov or -f (optional): Field of view of the inputs; the default is 90.0°.
  • --prompt or -p (optional): Prompt to guide the generation; the default is "indoor".

For example:

python run.py -i ./data/input -t ldr_lfov -f 90 -p "indoor"
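
To process several input directories in one go, you can wrap run.py in a small driver script (a sketch only; the directories and prompts below are placeholders, and only the documented flags are used):

import subprocess

# Each job: (input directory, input type, fov, prompt) - placeholder values
jobs = [
    ("./data/input",    "ldr_lfov", "90", "indoor"),
    ("./data/ldr_pano", "ldr_pano", "90", "indoor"),
]

for inp, typ, fov, prompt in jobs:
    subprocess.run(
        ["python", "run.py", "-i", inp, "-t", typ, "-f", fov, "-p", prompt],
        check=True,  # stop if run.py exits with an error
    )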

Unreal Engine integration

See the Unreal Engine folder of this repository.