This repo reproduces the experimental results in our paper [JPEG Inspired Deep Learning], submitted to ICLR 2025.
For CIFAR-100 and fine-grained tasks using Transformer-based models, please use this local directory.
For ImageNet training, please use this local directory.
The repo is tested with Python 3.8 and CUDA 11.7; dependencies are listed in the provided `requirements.txt`.
To train a model on CIFAR-100 with JPEG-DL:
```
python3.8 train_teacher_cifar_JPEG.py \
    --model ${mode} --JEPG_learning_rate 0.003 --JEPG_alpha 5.0 \
    --JPEG_enable --alpha_fixed --initial_Q_w_sensitivity
```
Fetch the pretrained models used for deriving the sensitivity by running:
```
sh scripts/fetch_pretrained_cifar100.sh
```
which will download and save the models to `save/models`.
- CIFAR-100 (resnet110)
- Flowers (resnet18)
- Flowers (efficientformer_l1)
Based on the derived sensitivity, we follow these steps to initialize the Q-tables for JPEG-DL:
```python
import numpy as np

def normalize(arr, factor):
    # If no factor is given, normalize by the array's maximum.
    if factor == 0:
        factor = np.max(arr)
    arr = arr / factor
    return arr, factor

# Invert the luma sensitivity and average the inverted chroma sensitivities.
Y_sens = 1 / Y_sens
CbCr_sens = 2 / (Cb_sens + Cr_sens)

# Scale both tables so that the largest luma entry maps to q_max.
_, factor = normalize(Y_sens, 0)
factor = factor / q_max
Y_sens, _ = normalize(Y_sens, factor)
CbCr_sens, _ = normalize(CbCr_sens, factor)
```
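As a minimal, self-contained sketch, the steps above can be exercised end-to-end with dummy 8×8 sensitivity tables (the values and `q_max` below are made up for illustration, not the ones used in the paper):

```python
import numpy as np

def normalize(arr, factor):
    # If no factor is given, normalize by the array's maximum.
    if factor == 0:
        factor = np.max(arr)
    return arr / factor, factor

# Hypothetical 8x8 DCT-frequency sensitivities (illustrative values only).
rng = np.random.default_rng(0)
Y_sens = rng.uniform(0.1, 1.0, size=(8, 8))
Cb_sens = rng.uniform(0.1, 1.0, size=(8, 8))
Cr_sens = rng.uniform(0.1, 1.0, size=(8, 8))
q_max = 255.0  # assumed maximum quantization step

# Invert luma sensitivity, average the inverted chroma sensitivities.
Y_sens = 1 / Y_sens
CbCr_sens = 2 / (Cb_sens + Cr_sens)

# Rescale so the largest luma entry maps to q_max.
_, factor = normalize(Y_sens, 0)
factor = factor / q_max
Y_sens, _ = normalize(Y_sens, factor)
CbCr_sens, _ = normalize(CbCr_sens, factor)

print(np.isclose(Y_sens.max(), q_max))  # True
```

Note that both tables are divided by the same luma-derived factor, so the relative scale between the luma and chroma Q-tables is preserved.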
- CIFAR-100 (resnet110)
- Flowers (resnet18)
- Flowers (efficientformer_l1)
- Test robustness of the learned model using PGD:
```
python3 robustness_JPEG.py --model ${mode} \
    --alpha_fixed --JPEG_enable \
    --model_dir ${model_dir}
```
- Test robustness of the standard model using PGD:
```
python3 robustness_JPEG.py --model ${mode} \
    --model_dir "./save/models/${mode}_vanilla/ckpt_epoch_240.pth"
```
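For reference, PGD crafts adversarial examples by repeatedly stepping in the sign of the input gradient and projecting back into an ε-ball around the clean input. The sketch below illustrates the idea on a toy logistic model in NumPy; it is not the implementation in `robustness_JPEG.py`, and all names and hyperparameter values are illustrative:

```python
import numpy as np

def pgd_attack(x, y, w, b, eps=0.1, alpha=0.02, steps=10):
    """L-inf PGD on a logistic model: maximize the loss within an eps-ball of x."""
    x_adv = x.copy()
    for _ in range(steps):
        # Gradient of the logistic loss w.r.t. the input: (sigmoid(z) - y) * w.
        z = x_adv @ w + b
        p = 1.0 / (1.0 + np.exp(-z))
        grad = (p - y) * w
        # Ascend the loss, then project back into the eps-ball and valid pixel range.
        x_adv = x_adv + alpha * np.sign(grad)
        x_adv = np.clip(x_adv, x - eps, x + eps)
        x_adv = np.clip(x_adv, 0.0, 1.0)
    return x_adv

x = np.array([0.5, 0.5])
w = np.array([2.0, -1.0])
x_adv = pgd_attack(x, y=1.0, w=w, b=0.0)
# The perturbation stays inside the eps-ball, and the true-class score drops.
assert np.all(np.abs(x_adv - x) <= 0.1 + 1e-9)
assert (x_adv @ w) < (x @ w)
```

The two `np.clip` calls are the projection step: the first enforces the ε-ball constraint, the second keeps inputs in the valid data range.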
This repo is based on the code from RepDistiller for CIFAR-100 and PyTorch for ImageNet. We also use Weight-Selection to produce our results for Transformer-based models.
```
@inproceedings{
  anonymous2024jpeg,
  title={JPEG Inspired Deep Learning},
  author={Anonymous},
  booktitle={Submitted to The Thirteenth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=te2IdORabL},
  note={under review}
}
```