
StableNormal: Reducing Diffusion Variance for Stable and Sharp Normal

Chongjie Ye*, Lingteng Qiu*, Xiaodong Gu, Qi Zuo, Yushuang Wu, Zilong Dong, Liefeng Bo, Yuliang Xiu#, Xiaoguang Han#

* Equal contribution
# Corresponding Author

SIGGRAPH Asia 2024 (Journal Track)


We propose StableNormal, which tailors the diffusion priors for monocular normal estimation. Unlike prior diffusion-based works, we focus on enhancing estimation stability by reducing the inherent stochasticity of diffusion models (i.e., Stable Diffusion). This enables "Stable-and-Sharp" normal estimation, which outperforms multiple baselines (try Compare), and improves various real-world applications (try Demo).

(Teaser figure)

News

🎉 New Release: StableDelight 🎉

We're excited to announce the release of StableDelight, our latest open-source project focusing on real-time reflection removal from textured surfaces. Check out StableDelight for more details!

Installation:

Please run the following commands to install the package:

git clone https://github.com/Stable-X/StableNormal.git
cd StableNormal
pip install -r requirements.txt

or install the package directly:

pip install git+https://github.com/Stable-X/StableNormal.git

Usage

To use the StableNormal pipeline, you can instantiate the model and apply it to an image as follows:

import torch
from PIL import Image

# Load an image
input_image = Image.open("path/to/your/image.jpg")

# Create predictor instance
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True)

# Apply the model to the image
normal_image = predictor(input_image)

# Save or display the result
normal_image.save("output/normal_map.png")
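
If you have many images, the same predictor instance can be reused in a loop. The snippet below is only a minimal sketch: the directory names and file pattern are illustrative placeholders, and only the predictor call itself comes from the example above.

from pathlib import Path

import torch
from PIL import Image

# Load the predictor once and reuse it for every image
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True)

input_dir = Path("path/to/your/images")   # placeholder directory
output_dir = Path("output")
output_dir.mkdir(parents=True, exist_ok=True)

for image_path in sorted(input_dir.glob("*.jpg")):
    image = Image.open(image_path).convert("RGB")  # ensure a 3-channel input
    normal_map = predictor(image)                  # returns a PIL image, as above
    normal_map.save(output_dir / f"{image_path.stem}_normal.png")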

Additional Options:

  • If you need faster inference (about 10 times faster), use StableNormal_turbo (see the timing sketch after this list):
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal_turbo", trust_repo=True)
  • If the Hugging Face Hub is not reachable from your terminal, you can download the pretrained weights to a local weights directory and load them from there:
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True, local_cache_dir='./weights')
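
To get a rough feel for the speed difference on your own hardware, you can time both variants on the same image. This is an informal sketch rather than an official benchmark; results depend on your GPU, and the warm-up call keeps one-time setup out of the measurement.

import time

import torch
from PIL import Image

image = Image.open("path/to/your/image.jpg").convert("RGB")  # placeholder path

for variant in ["StableNormal", "StableNormal_turbo"]:
    model = torch.hub.load("Stable-X/StableNormal", variant, trust_repo=True)
    model(image)  # warm-up run, excluded from the timing
    start = time.perf_counter()
    model(image)
    print(f"{variant}: {time.perf_counter() - start:.2f} s per image")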

Compute Metrics:

This section provides guidance on evaluating your normal predictor using the DIODE dataset.

Step 1: Prepare Your Results Folder

First, make sure you have generated your normal maps and structured your results folder as shown below:

├── YOUR-FOLDER-NAME
│   ├── scan_00183_00019_00183_indoors_000_010_gt.png
│   ├── scan_00183_00019_00183_indoors_000_010_init.png
│   ├── scan_00183_00019_00183_indoors_000_010_ref.png
│   ├── scan_00183_00019_00183_indoors_000_010_step0.png
│   ├── scan_00183_00019_00183_indoors_000_010_step1.png
│   ├── scan_00183_00019_00183_indoors_000_010_step2.png
│   └── scan_00183_00019_00183_indoors_000_010_step3.png

Step 2: Compute Metric Values

Once your results folder is set up, you can compute the metrics for your normal predictions by running the following scripts:

# compute metrics
python ./stablenormal/metrics/compute_metric.py -i ${YOUR-FOLDER-NAME}

# compute variance
python ./stablenormal/metrics/compute_variance.py -i ${YOUR-FOLDER-NAME}

Replace ${YOUR-FOLDER-NAME} with the actual name of your results folder. Following these steps will allow you to evaluate your normal predictor's performance on the DIODE dataset.
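
For reference, normal-map accuracy is usually reported as the per-pixel angular error between predicted and ground-truth normals. The sketch below only illustrates that computation on one hypothetical *_gt.png / *_ref.png pair, assuming the maps are stored as 8-bit RGB images mapped from [-1, 1]; the exact masking and averaging protocol is the one implemented in compute_metric.py.

import numpy as np
from PIL import Image

def load_normals(path):
    # Decode an 8-bit RGB normal map into unit vectors (assumes the [0, 255] -> [-1, 1] convention)
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    normals = rgb * 2.0 - 1.0
    return normals / (np.linalg.norm(normals, axis=-1, keepdims=True) + 1e-8)

# Hypothetical file pair following the folder layout above
gt = load_normals("scan_00183_00019_00183_indoors_000_010_gt.png")
pred = load_normals("scan_00183_00019_00183_indoors_000_010_ref.png")

# Per-pixel angular error in degrees, averaged over the image
cos_sim = np.clip(np.sum(gt * pred, axis=-1), -1.0, 1.0)
angular_error = np.degrees(np.arccos(cos_sim))
print(f"mean angular error: {angular_error.mean():.2f} deg")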

Citation

@article{ye2024stablenormal,
  title={StableNormal: Reducing Diffusion Variance for Stable and Sharp Normal},
  author={Ye, Chongjie and Qiu, Lingteng and Gu, Xiaodong and Zuo, Qi and Wu, Yushuang and Dong, Zilong and Bo, Liefeng and Xiu, Yuliang and Han, Xiaoguang},
  journal={ACM Transactions on Graphics (TOG)},
  year={2024},
  publisher={ACM New York, NY, USA}
}