Desigen: A Pipeline for Controllable Design Template Generation [CVPR'24]

Official code for the CVPR'24 paper.
Requirements

  1. Create a conda environment:

conda create -n desigen python=3.8
conda activate desigen
pip install -r requirements.txt

  2. Download the BASNet checkpoint into the saliency folder.

  3. Download the metadata from here.

  4. Download and process the background images.

  5. Preprocess saliency maps for the background images (see the sketch after this list).

  6. Move the corresponding background images to the right directory.
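
A minimal sketch of the saliency preprocessing in step 5, assuming the model class and checkpoint conventions of the original BASNet repository; the file names, directory layout, and 0.5 threshold here are assumptions, not this repo's exact interface:

# Hedged sketch: batch-precompute binary saliency maps with BASNet.
# The class name, checkpoint path, and threshold are assumptions
# based on the original BASNet repo, not this repo's exact API.
import os
import torch
import numpy as np
from PIL import Image
from torchvision import transforms

from model import BASNet  # from the BASNet repository

def precompute_saliency(image_dir, out_dir, ckpt="saliency/basnet.pth"):
    os.makedirs(out_dir, exist_ok=True)
    net = BASNet(3, 1)
    net.load_state_dict(torch.load(ckpt, map_location="cpu"))
    net.eval()

    to_tensor = transforms.Compose([
        transforms.Resize((256, 256)),
        transforms.ToTensor(),
    ])
    for name in os.listdir(image_dir):
        img = Image.open(os.path.join(image_dir, name)).convert("RGB")
        x = to_tensor(img).unsqueeze(0)
        with torch.no_grad():
            d1, *_ = net(x)  # d1 is the finest-scale prediction
        sal = d1[0, 0].numpy()
        sal = (sal - sal.min()) / (sal.max() - sal.min() + 1e-8)
        mask = (sal > 0.5).astype(np.uint8) * 255  # binarize
        Image.fromarray(mask).resize(img.size).save(
            os.path.join(out_dir, name))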

Background Generation

Training

cd background/
# configure accelerate first
accelerate config
# train the background generator
sh train.sh

For more training options, refer to the Diffusers documentation.
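
Once training finishes, the generator can be sanity-checked by sampling from it directly. A minimal sketch, assuming train.sh saves the model in the standard Diffusers pipeline format under logs/background-ours (the output path is taken from the inference example below):

# Hedged sketch: sample a background from the trained generator,
# assuming it is saved in the standard Diffusers pipeline format.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "logs/background-ours", torch_dtype=torch.float16
).to("cuda")
image = pipe("Rose Valentines' Day").images[0]
image.save("background_sample.png")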

Evaluation

Generate background images from the prompts in the validation set and evaluate them with the proposed metrics: FID, Salient Ratio, and CLIP Score.

The pretrained saliency detection model can be downloaded from BASNet and placed in the saliency directory.

sh test.sh
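
FID and CLIP Score are standard metrics; Salient Ratio measures how much of a generated background is occupied by salient content, so lower values generally indicate more clean space for overlaying layout elements. A minimal sketch of the idea, assuming binarized saliency maps like those produced above (the exact threshold and averaging in test.sh may differ):

# Hedged sketch of Salient Ratio: the fraction of pixels marked
# salient, averaged over generated images. The threshold and
# averaging used by test.sh may differ.
import numpy as np
from PIL import Image

def salient_ratio(mask_paths, threshold=128):
    ratios = []
    for p in mask_paths:
        mask = np.array(Image.open(p).convert("L"))
        ratios.append(float((mask >= threshold).mean()))
    return float(np.mean(ratios))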

Layout Generation

Training

cd layout/
sh train.sh

Evaluation

Compute layouts for the ground-truth images and save them for further evaluation.

sh test.sh
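
For visual inspection, a saved layout can be drawn over its background. A minimal sketch, assuming each layout element is a (label, x, y, w, h) tuple with coordinates normalized to [0, 1]; the actual saved format may differ:

# Hedged sketch: overlay a predicted layout on a background image.
# The (label, x, y, w, h) element format and normalized coordinates
# are assumptions about the saved files.
from PIL import Image, ImageDraw

def render_layout(background_path, layout, out_path):
    img = Image.open(background_path).convert("RGB")
    draw = ImageDraw.Draw(img)
    W, H = img.size
    for label, x, y, w, h in layout:
        box = (x * W, y * H, (x + w) * W, (y + h) * H)
        draw.rectangle(box, outline="red", width=3)
        draw.text((box[0], box[1]), str(label), fill="red")
    img.save(out_path)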

Pipeline Inference

Designs can be generated with the following command:

python pipeline.py \
--prompt "Rose Valentines' Day" \
--mode "background" \
--encoder_path /path/to/encoder \
--decoder_path /path/to/decoder \
--generator_path logs/background-ours

The mode parameter can be switched among background (background-only generation), design (design generation), and iteration (iterative refinement). A user-provided attention reduction mask can also be supplied via mask_image_path, as in the example below.
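
For instance, an iterative refinement run with a user mask might look like the following (the mask path is a placeholder):

python pipeline.py \
--prompt "Rose Valentines' Day" \
--mode "iteration" \
--mask_image_path /path/to/mask.png \
--encoder_path /path/to/encoder \
--decoder_path /path/to/decoder \
--generator_path logs/background-ours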

Acknowledgements

Part of our code is borrowed from the following repositories:

  1. Hugging Face Diffusers
  2. Layout Transformer
  3. BASNet

Citation

@misc{weng2024desigen,
      title={Desigen: A Pipeline for Controllable Design Template Generation}, 
      author={Haohan Weng and Danqing Huang and Yu Qiao and Zheng Hu and Chin-Yew Lin and Tong Zhang and C. L. Philip Chen},
      year={2024},
      eprint={2403.09093},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}