

GFPGAN (CVPR 2021)


  1. Colab Demo for GFPGAN; (Another Colab Demo for the original paper model)
  2. We provide a clean version of GFPGAN, which does not require CUDA extensions, so it can run on Windows or in CPU mode.

GFPGAN aims at developing a Practical Algorithm for Real-world Face Restoration.
It leverages rich and diverse priors encapsulated in a pretrained face GAN (e.g., StyleGAN2) for blind face restoration.

🚩 Updates

  • βœ… Support enhancing non-face regions (background) with Real-ESRGAN.
  • βœ… We provide a clean version of GFPGAN, which does not require CUDA extensions.
  • βœ… We provide an updated model without colorizing faces.

If GFPGAN is helpful in your photos/projects, please help to ⭐ this repo. Thanks😊
Other recommended projects:   ▢️ Real-ESRGAN   ▢️ BasicSR   ▢️ facexlib


πŸ“– GFP-GAN: Towards Real-World Blind Face Restoration with Generative Facial Prior

[Paper]   [Project Page]   [Demo]
Xintao Wang, Yu Li, Honglun Zhang, Ying Shan
Applied Research Center (ARC), Tencent PCG


πŸ”§ Dependencies and Installation

Installation

We now provide a clean version of GFPGAN, which does not require customized CUDA extensions.
If you want to use the original model in our paper, please see PaperModel.md for installation.

  1. Clone repo

    git clone https://github.com/TencentARC/GFPGAN.git
    cd GFPGAN
  2. Install dependent packages

    # Install basicsr - https://github.com/xinntao/BasicSR
    # We use BasicSR for both training and inference
    pip install basicsr
    
    # Install facexlib - https://github.com/xinntao/facexlib
    # We use face detection and face restoration helper in the facexlib package
    pip install facexlib
    
    pip install -r requirements.txt
    python setup.py develop
    
    # If you want to enhance the background (non-face) regions with Real-ESRGAN,
    # you also need to install the realesrgan package
    pip install realesrgan
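
A quick import check can confirm the installation before running inference. This is only a minimal sketch; it assumes nothing beyond the package names installed above, and the version attributes are read defensively:

    # Confirm that the installed packages import; realesrgan is optional and
    # only needed for background (non-face) enhancement with Real-ESRGAN.
    import basicsr
    import facexlib
    import gfpgan

    for module in (basicsr, facexlib, gfpgan):
        print(module.__name__, getattr(module, '__version__', 'installed'))

    try:
        import realesrgan
        print('realesrgan', getattr(realesrgan, '__version__', 'installed'))
    except ImportError:
        print('realesrgan not installed (background enhancement disabled)')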

⚑ Quick Inference

Download pre-trained models: GFPGANCleanv1-NoCE-C2.pth

wget https://github.com/TencentARC/GFPGAN/releases/download/v0.2.0/GFPGANCleanv1-NoCE-C2.pth -P experiments/pretrained_models
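
If wget is not available (for example on Windows), the same file can be fetched with the Python standard library. A minimal sketch:

    # Download the pretrained weights into experiments/pretrained_models,
    # using only the Python standard library.
    import os
    import urllib.request

    url = ('https://github.com/TencentARC/GFPGAN/releases/download/'
           'v0.2.0/GFPGANCleanv1-NoCE-C2.pth')
    save_dir = 'experiments/pretrained_models'
    os.makedirs(save_dir, exist_ok=True)
    urllib.request.urlretrieve(url, os.path.join(save_dir, 'GFPGANCleanv1-NoCE-C2.pth'))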

Inference!

python inference_gfpgan.py --upscale_factor 2 --test_path inputs/whole_imgs --save_root results
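
The same restoration can also be driven from Python. The sketch below assumes the GFPGANer helper class exported by the gfpgan package; argument names and defaults may differ between versions, and the input path is only an example:

    import cv2
    from gfpgan import GFPGANer

    # Build the restorer with the clean (no CUDA extension) architecture;
    # channel_multiplier=2 matches the "C2" in the model file name.
    restorer = GFPGANer(
        model_path='experiments/pretrained_models/GFPGANCleanv1-NoCE-C2.pth',
        upscale=2,           # overall upscale factor for the output image
        arch='clean',
        channel_multiplier=2,
        bg_upsampler=None)   # plug in a Real-ESRGAN upsampler to enhance backgrounds

    img = cv2.imread('inputs/whole_imgs/00.jpg', cv2.IMREAD_COLOR)  # example input
    cropped_faces, restored_faces, restored_img = restorer.enhance(
        img, has_aligned=False, only_center_face=False, paste_back=True)
    cv2.imwrite('results/restored.png', restored_img)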

🏰 Model Zoo

  • GFPGANCleanv1-NoCE-C2.pth: No colorization and no CUDA extensions are required. It is still being trained, with more data and pre-processing.
  • GFPGANv1.pth: The paper model, with colorization.

πŸ’» Training

We provide the training code for GFPGAN (used in our paper).
You can adapt it to your own needs.

Tips

  1. More high-quality faces can improve the restoration quality.
  2. You may need to perform some pre-processing, such as beauty makeup.

Procedures

(You can try a simple version (options/train_gfpgan_v1_simple.yml) that does not require face component landmarks.)

  1. Dataset preparation: FFHQ

  2. Download pre-trained models and other data. Put them in the experiments/pretrained_models folder.

    1. Pretrained StyleGAN2 model: StyleGAN2_512_Cmul1_FFHQ_B12G4_scratch_800k.pth
    2. Component locations of FFHQ: FFHQ_eye_mouth_landmarks_512.pth
    3. A simple ArcFace model: arcface_resnet18.pth
  3. Modify the configuration file options/train_gfpgan_v1.yml accordingly.

  4. Training

python -m torch.distributed.launch --nproc_per_node=4 --master_port=22021 gfpgan/train.py -opt options/train_gfpgan_v1.yml --launcher pytorch
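
The launch command above reads options/train_gfpgan_v1.yml, so a quick way to sanity-check the edits from step 3 is to load the file and print its top-level sections. A minimal sketch, assuming PyYAML (already required by BasicSR); the exact key names follow the BasicSR config format and may vary:

    # Print the top-level sections of the training config before launching,
    # e.g. to verify that dataset and pretrained-model paths point to your data.
    import yaml

    with open('options/train_gfpgan_v1.yml', 'r') as f:
        opt = yaml.safe_load(f)

    for key, value in opt.items():
        print(key, ':', type(value).__name__)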

πŸ“œ License and Acknowledgement

GFPGAN is released under Apache License Version 2.0.

BibTeX

@InProceedings{wang2021gfpgan,
    author    = {Xintao Wang and Yu Li and Honglun Zhang and Ying Shan},
    title     = {Towards Real-World Blind Face Restoration with Generative Facial Prior},
    booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
    year      = {2021}
}

πŸ“§ Contact

If you have any questions, please email xintao.wang@outlook.com or xintaowang@tencent.com.