# AttentionGAN Training/Testing
Download a dataset using the previous script (e.g., horse2zebra).
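With `--dataset_mode unaligned` (used in the test command below), the dataset root is expected to contain `trainA`, `trainB`, `testA`, and `testB` sub-folders, one per domain and split. The helper below is a small sketch for checking that layout after download; the function name is illustrative, not part of the repository.

```python
import os
import tempfile

# Sub-folders expected by the unaligned dataset mode (CycleGAN-style layout).
REQUIRED_SUBDIRS = ("trainA", "trainB", "testA", "testB")

def check_unaligned_layout(dataroot):
    """Return the list of required sub-folders missing from dataroot."""
    return [d for d in REQUIRED_SUBDIRS
            if not os.path.isdir(os.path.join(dataroot, d))]

# Demo against a freshly created dummy dataset root.
root = tempfile.mkdtemp()
print(check_unaligned_layout(root))  # ['trainA', 'trainB', 'testA', 'testB']
for d in REQUIRED_SUBDIRS:
    os.makedirs(os.path.join(root, d))
print(check_unaligned_layout(root))  # []
```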
| Setting | Value |
|---|---|
| Hardware | CPU: Intel i7-10875H<br>RAM: 16 GB<br>GPU: NVIDIA RTX 2070 Super (8 GB) |
| Epochs | 49 |
| Learning rate | 2e-4 |
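CycleGAN-style training typically holds the learning rate constant for the first part of training and then decays it linearly to zero. The sketch below illustrates that schedule starting from the 2e-4 base rate in the table; the 25/25 epoch split is an illustrative assumption, not a value taken from this training run.

```python
# Linear learning-rate decay commonly used in CycleGAN-style training:
# constant for `niter` epochs, then linear decay to zero over
# `niter_decay` epochs. The 25/25 split here is an assumed example.
def lr_at_epoch(epoch, base_lr=2e-4, niter=25, niter_decay=25):
    decay = max(0, epoch + 1 - niter) / float(niter_decay + 1)
    return base_lr * (1.0 - decay)

print(lr_at_epoch(0))   # base rate during the constant phase
print(lr_at_epoch(49))  # close to zero by the final epoch
```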
## Test the Pretrained Model

Download the pretrained weight file and place it in the checkpoints directory, then run the test script as shown below.
## Test the Trained Model

```bash
python test.py --dataroot ./datasets/selfie2anime/ --name selfie2anime_attentiongan \
  --model attention_gan --dataset_mode unaligned --norm instance --phase test \
  --no_dropout --load_size 256 --crop_size 256 --batch_size 1 --gpu_ids 0 \
  --num_test 5000 --epoch latest
```
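The `attention_gan` model composes its output from attention masks: the generator predicts content images plus masks, and the final image mixes generated content with the untouched input according to those masks. The NumPy sketch below illustrates that fusion; shapes and variable names are illustrative, not the repository's actual API.

```python
import numpy as np

# Illustrative attention-guided fusion: a foreground/background pair of
# masks (softmax-normalised per pixel) blends generated content with
# the original input image.
rng = np.random.default_rng(0)
H, W, C = 4, 4, 3

x = rng.random((H, W, C))        # input image
content = rng.random((H, W, C))  # generated foreground content
logits = rng.random((H, W, 2))   # raw mask logits: [foreground, background]

# Softmax over the mask channel so the two masks sum to 1 at every pixel.
masks = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
fg, bg = masks[..., :1], masks[..., 1:]

out = fg * content + bg * x      # attention-weighted composition
print(np.allclose(fg + bg, 1.0))  # True: masks form a convex combination
```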
## Citation

```bibtex
@article{tang2021attentiongan,
  title={AttentionGAN: Unpaired Image-to-Image Translation using Attention-Guided Generative Adversarial Networks},
  author={Tang, Hao and Liu, Hong and Xu, Dan and Torr, Philip HS and Sebe, Nicu},
  journal={IEEE Transactions on Neural Networks and Learning Systems (TNNLS)},
  year={2021}
}

@inproceedings{tang2019attention,
  title={Attention-Guided Generative Adversarial Networks for Unsupervised Image-to-Image Translation},
  author={Tang, Hao and Xu, Dan and Sebe, Nicu and Yan, Yan},
  booktitle={International Joint Conference on Neural Networks (IJCNN)},
  year={2019}
}
```
Some sample outputs are shown below.
Reference: