Clone this repository:
git clone https://github.com/facialGAN/facialGAN
We have developed an interactive facial toolbox that allows easy manipulation of both styles and attributes. The user only needs to select a source image and the reference image they want to apply; our model then produces the desired combination. On top of that, the user can also alter the default mask, and the effects are displayed in the output.
Video tutorial -> https://www.youtube.com/watch?v=N4jRSNKPB0s 🎬 🎬
To set up the toolbox, go to the Toolbox folder and create a checkpoints folder:
cd FacialGAN/Toolbox
mkdir checkpoints
Download the pretrained model (facial_checkpoint.ckpt) and save it in the checkpoints folder.
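After this step, the Toolbox folder should roughly look like the sketch below (only the location of the checkpoint matters):

Toolbox/
├── checkpoints/
│   └── facial_checkpoint.ckpt
└── demo.py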
Create the environment with the dependencies:
On Windows 10:
conda env create -f facialGAN_env_windows.yml
conda activate facialGAN
On Ubuntu 19.04:
conda env create -f facialGAN_env_ubuntu.yml
conda activate facialGAN
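To confirm the environment was created correctly, a quick sanity check (assuming PyTorch is among the pinned dependencies, as is typical for this kind of GAN code) is:

conda env list                                        # facialGAN should appear in the list
python -c "import torch; print(torch.__version__)"    # should print the installed PyTorch version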
Run the toolbox:
python demo.py
To train the model, go to the Training folder:
cd FacialGAN/Training
mkdir data
bash download.sh celeba-hq-dataset
Download CelebAMask-HQ and save it in the data folder:
bash download.sh celebaMask-hq-dataset
Note: the script may take ~30 minutes.
Download the pretrained segmentation model (weights_seg.pth) and save it in the core folder.
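Before launching training, the Training folder should roughly match the layout below (a sketch based on the steps above; the exact dataset subfolders created by download.sh may differ):

Training/
├── core/
│   └── weights_seg.pth
├── data/
│   └── (CelebA-HQ / CelebAMask-HQ files downloaded above)
├── download.sh
└── train.sh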
bash train.sh
We acknowledge the official code of StarGANv2 & MaskGAN.
If you use this work, please cite:

@misc{durall2021facialgan,
  title={FacialGAN: Style Transfer and Attribute Manipulation on Synthetic Faces},
  author={Ricard Durall and Jireh Jam and Dominik Strassel and Moi Hoon Yap and Janis Keuper},
  year={2021},
  eprint={2110.09425},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}