# MASSRL

This repo is the official TensorFlow implementation of MASSRL.

[Blog Post] (coming soon)

This repo contains the source code for MASSRL, which makes configuring multi-augmentation strategies in TensorFlow models effortless and less error-prone.

- Installation
- Visualization
- Multi-Augmentation Strategies - Configure Self-Supervised Pretraining
- Contributing
## Installation

Install the following dependencies on your local machine with pip or conda:

- tensorflow==2.7.0, tensorflow-addons==0.15.0, tensorflow-datasets==4.4.0, tensorflow-estimator==2.7.0
- tqdm
- wandb
- imgaug
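For example, with pip (conda works as well; versions exactly as listed above):

```bash
pip install tensorflow==2.7.0 tensorflow-addons==0.15.0 \
    tensorflow-datasets==4.4.0 tensorflow-estimator==2.7.0 \
    tqdm wandb imgaug
```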
## Visualization

Visualize the multi-augmentation strategies in this Google Colab notebook: https://colab.research.google.com/drive/1fquGOr_psJfDXxOmdFVkfrbedGfi1t-X?usp=sharing

Note that visualizing the augmentations does not require any training --- we only visualize images after applying the different augmentation transformations. However, you need to make sure that the dataset is appropriately passed down to the constructor of all submodules. If you want to see this happen, please upvote [this Repo issue].
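If you prefer to try something locally instead of the Colab notebook, the sketch below applies a SimCLR-style pipeline and RandAugment to a single image with imgaug and plots the results. The pipelines, parameters, and file path here are illustrative only, not the repo's exact augmentation code.

```python
# Minimal sketch (not the repo's exact API): visualize two augmentation
# strategies on one image, similar in spirit to the Colab notebook.
import matplotlib.pyplot as plt
import tensorflow as tf
import imgaug.augmenters as iaa

# Load any RGB image as a NumPy array (the path is a placeholder).
image = tf.io.decode_image(tf.io.read_file("example.jpg"), channels=3).numpy()

# A SimCLR-like pipeline and a RandAugment pipeline built with imgaug.
simclr_like = iaa.Sequential([
    iaa.Resize(224),
    iaa.Fliplr(0.5),
    iaa.MultiplyBrightness((0.6, 1.4)),
    iaa.GaussianBlur(sigma=(0.0, 2.0)),
])
rand_aug = iaa.RandAugment(n=2, m=9)

fig, axes = plt.subplots(1, 3, figsize=(12, 4))
for ax, (name, img) in zip(
    axes,
    [("original", image),
     ("SimCLR-like", simclr_like(image=image)),
     ("RandAugment", rand_aug(image=image))],
):
    ax.imshow(img)
    ax.set_title(name)
    ax.axis("off")
plt.show()
```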
## Multi-Augmentation Strategies - Configure Self-Supervised Pretraining

This implementation supports single-GPU and multi-GPU training. To run self-supervised pre-training of a ResNet-50 model on ImageNet with 1-8 GPUs, follow the steps below.

**1. Configure the training hyperparameters**:
- You can change the training hyperparameter settings (dataset paths and all other training hyperparameters) based on config/non_contrast_config_v1.py, which serves as the reference configuration (a sketch of typical fields follows this step).
- Make sure each GPU has at least 12 GB of memory for ResNet-50; we recommend training on 4 to 8 GPUs.
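The exact schema is defined in config/non_contrast_config_v1.py; the snippet below only illustrates the kind of values you typically adjust. The field names are hypothetical, while the values follow the recipes quoted in the note below.

```python
# Hypothetical configuration values -- adapt the real field names in
# config/non_contrast_config_v1.py; these names are illustrative only.
train_config = {
    "train_path": "/path/to/imagenet/train",  # dataset location
    "val_path": "/path/to/imagenet/val",
    "img_size": 224,                          # ResNet-50 input resolution
    "batch_size": 128,                        # 8-GPU recipe (see the note below)
    "lr": 0.2,                                # linear lr scaling recipe for 8 GPUs
    "num_gpus": 8,                            # 1-8 GPUs are supported
    "wandb_project": "MASSRL",                # optional logging via wandb
}
```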
**2. Execute MASSRL with the three augmentation strategies (the SimCLR augmentation pipeline, RandAugment, and AutoAugment)**:
- Navigate to the directory containing self_supervised_learning_frameworks/none_contrastive_framework/run_MASSRL.py.
- Execute the 🏃♀️ file: `python run_MASSRL.py`

Note: for 8-GPU training, we recommend following the linear lr scaling recipe: `--lr 0.2 --batch-size 128`. For 1-GPU training, we recommend `--lr 0.3 --batch-size 256`. All other hyperparameters can be left at their defaults.
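For example (only the two flags quoted above are shown; everything else keeps its default):

```bash
# 8-GPU training: linear lr scaling recipe
python run_MASSRL.py --lr 0.2 --batch-size 128

# 1-GPU training
python run_MASSRL.py --lr 0.3 --batch-size 256
```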
Note: this work uses the public ImageNet dataset; if you have your own dataset, you can change the path accordingly. ImageNet-1K can be downloaded from https://www.image-net.org/download.php.
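The sketch below is not the repo's training loop; it only illustrates, under the stated assumptions, how ImageNet-1K prepared with tensorflow-datasets can be fed to a ResNet-50 backbone replicated across 1-8 GPUs with tf.distribute.MirroredStrategy. run_MASSRL.py may read the raw files directly and distribute training differently.

```python
# Illustrative scaffold only: tfds input pipeline + multi-GPU replication.
import tensorflow as tf
import tensorflow_datasets as tfds

# ImageNet-1K must be downloaded manually from image-net.org; tfds only
# builds records from the downloaded tar files placed in manual_dir.
builder = tfds.builder("imagenet2012")
builder.download_and_prepare(
    download_config=tfds.download.DownloadConfig(manual_dir="/path/to/imagenet_tars")
)

def preprocess(example):
    # Resize to the ResNet-50 input resolution and scale to [0, 1].
    return tf.image.resize(tf.cast(example["image"], tf.float32) / 255.0, (224, 224))

train_ds = (
    builder.as_dataset(split="train", shuffle_files=True)
    .map(preprocess, num_parallel_calls=tf.data.AUTOTUNE)
    .batch(128)                       # matches the 8-GPU recipe above
    .prefetch(tf.data.AUTOTUNE)
)

# MirroredStrategy replicates the model on every visible GPU (1-8).
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    backbone = tf.keras.applications.ResNet50(
        include_top=False, weights=None, pooling="avg"
    )
```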
## Contributing

Awesome! Thank you for being a part of this project. Before you start contributing to this repository, please quickly go through the guidelines (update soon).
- MASSRL.Pytorch-lightning: the official PyTorch Lightning implementation.
## Citation

@Article{TranMASSRL,
  author  = {Van-Nhiem Tran and Chi-En Huang and Shen-Hsuan Liu and Kai-Lin Yang and Timothy Ko and Yung-Hui Li},
  title   = {Multi-Augmentation Strategies Disentangle Self-Supervised Representation Learning},
  journal = {arXiv preprint arXiv:2205.11772},
  year    = {2022},
}