Official PyTorch implementation of MixerGANsformer, a novel GAN model built from Transformers and MLP-Mixer. A preprint will be published soon.
In this model, the generator has the same structure as TransGAN's generator, and the discriminator is based on MLP-Mixer. The goal is to create a strong GAN model without convolutions and to show that combining MLP-Mixer and Transformers may produce a stronger GAN than using pure Transformers or pure MLP-Mixer.
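To illustrate the convolution-free discriminator side of this design, here is a minimal sketch of an MLP-Mixer-style discriminator in PyTorch. This is not the repository's actual implementation; the class names, layer widths, and patch size are illustrative assumptions. Patches are extracted with plain tensor reshaping (no convolutions), then passed through token-mixing and channel-mixing MLPs as in MLP-Mixer, and pooled into a single real/fake logit.

```python
import torch
import torch.nn as nn


class MixerBlock(nn.Module):
    """One MLP-Mixer block: token-mixing MLP followed by channel-mixing MLP."""

    def __init__(self, num_tokens, dim, token_hidden=64, channel_hidden=256):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.token_mlp = nn.Sequential(          # mixes information across patches
            nn.Linear(num_tokens, token_hidden), nn.GELU(),
            nn.Linear(token_hidden, num_tokens))
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(        # mixes information across channels
            nn.Linear(dim, channel_hidden), nn.GELU(),
            nn.Linear(channel_hidden, dim))

    def forward(self, x):                        # x: (B, tokens, dim)
        y = self.norm1(x).transpose(1, 2)        # (B, dim, tokens) for token mixing
        x = x + self.token_mlp(y).transpose(1, 2)
        x = x + self.channel_mlp(self.norm2(x))
        return x


class MixerDiscriminator(nn.Module):
    """Hypothetical MLP-Mixer discriminator for 32x32 RGB images (e.g. CIFAR10)."""

    def __init__(self, img_size=32, patch=4, dim=128, depth=4):
        super().__init__()
        self.patch = patch
        num_tokens = (img_size // patch) ** 2
        # Patch embedding via reshape + Linear, keeping the model convolution-free.
        self.embed = nn.Linear(3 * patch * patch, dim)
        self.blocks = nn.Sequential(
            *[MixerBlock(num_tokens, dim) for _ in range(depth)])
        self.norm = nn.LayerNorm(dim)
        self.head = nn.Linear(dim, 1)            # real/fake logit

    def forward(self, img):                      # img: (B, 3, H, W)
        b, c, h, w = img.shape
        p = self.patch
        # Split the image into non-overlapping p x p patches, flatten each patch.
        x = img.view(b, c, h // p, p, w // p, p)
        x = x.permute(0, 2, 4, 1, 3, 5).reshape(b, (h // p) * (w // p), c * p * p)
        x = self.embed(x)                        # (B, tokens, dim)
        x = self.blocks(x)
        x = self.norm(x).mean(dim=1)             # global average over tokens
        return self.head(x)                      # (B, 1)
```

A forward pass on a batch of fake CIFAR10-shaped images, `MixerDiscriminator()(torch.randn(2, 3, 32, 32))`, returns a `(2, 1)` tensor of logits.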
Before running `train.py`, check that you have the libraries listed in `requirements.txt` installed. To save your model during training, create a `./checkpoint` folder using `mkdir checkpoint`.
To train on CIFAR10:

```
python train.py
```
License: MIT