LINs-lab/ttab

Requests for pretraining code and experimental settings for other datasets

JoonHo-Jang opened this issue · 2 comments

Dear authors,

I have a few requests, which I have also already sent to you by email.

  1. Could you release the pretraining code for datasets other than CIFAR10?
    • It would be a pleasure if you could provide the pretraining code and the corresponding experimental settings, such as epochs, architecture, and learning rate, for Office-Home and PACS.
  2. Could you release 'parameter.py' for each model/dataset under the standard settings?
    • By "standard settings" I mean the settings used in Table 2, including the pretraining code (and settings) for each dataset.

My requests are just for reproducing the results in Table 2 of your paper.

I hope these requests do not trouble you too much.

Thank you.
Best,

Hey JoonHo,

As per your request, I have released an improved pretraining script that supports pretraining on all of the benchmark datasets that appear in our paper, except ImageNet. You can find it in the pretrain/ directory.
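To give a rough idea of the shape of such a training loop, here is a minimal sketch. Everything in it (ResNet-18 as the architecture, CIFAR10 as the dataset, the optimizer settings, and the epoch count) is an illustrative assumption, not the actual configuration of the released script:

```python
# Minimal sketch of a supervised pretraining loop (illustrative only).
# The architecture, dataset, and hyperparameters below are assumptions,
# not the settings used by the actual script in pretrain/.
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as T

device = "cuda" if torch.cuda.is_available() else "cpu"

transform = T.Compose([T.RandomHorizontalFlip(), T.ToTensor()])
train_set = torchvision.datasets.CIFAR10(
    root="./data", train=True, download=True, transform=transform
)
loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

model = torchvision.models.resnet18(num_classes=10).to(device)
optimizer = torch.optim.SGD(
    model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4
)
criterion = nn.CrossEntropyLoss()

for epoch in range(100):  # epoch count is a placeholder
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")

torch.save(model.state_dict(), "pretrained_resnet18.pt")
```

For a dataset like Office-Home or PACS you would swap in the corresponding dataset class and adjust num_classes to match.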

Due to an unexpected VPN issue, I am currently unable to access the server where I stored the experiment logs, so I cannot release the parameters you requested right now. I will update them as soon as the VPN issue is resolved.

Hey JoonHo,

We have released the experimental setups we used to create Table 2 of the paper in exps/. The parts that matter most are the common hyperparameters relevant to each adaptation process, such as lr and n_train_steps, as illustrated in our paper.
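To give a feeling for what these setups look like, here is a hypothetical sketch; only the key names lr and n_train_steps come from the description above, while all other names and values are placeholders rather than the real contents of exps/:

```python
# Hypothetical sketch of a setup script in the spirit of exps/ (not the
# actual file contents). Only `lr` and `n_train_steps` are taken from
# the discussion above; the remaining keys and values are placeholders.
common_hparams = {
    "lr": 1e-3,          # adaptation learning rate
    "n_train_steps": 1,  # optimization steps per incoming test batch
}

# A grid of method/dataset combinations one might sweep over
# (method and dataset identifiers here are made up for illustration).
sweep = [
    {"method": method, "dataset": dataset, **common_hparams}
    for method in ["tent", "shot"]
    for dataset in ["cifar10_c", "officehome"]
]

for job in sweep:
    print(job)  # each entry would be dispatched as one experiment
```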

These setup scripts are designed to work with an experimental pipeline based on tmux. We have also provided this pipeline in our codebase; please check the updated README documentation to see how to use it.
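As a rough illustration of the tmux-based approach (a simplified sketch, not the actual pipeline code; the README describes the real usage):

```python
# Generic sketch of dispatching experiment commands into detached tmux
# sessions via subprocess. This is NOT the pipeline from the codebase,
# only an illustration of the tmux-based idea; the commands below are
# placeholders.
import subprocess

commands = [
    "python run.py --lr 1e-3 --n_train_steps 1",
    "python run.py --lr 5e-4 --n_train_steps 2",
]

for i, cmd in enumerate(commands):
    session = f"ttab_job_{i}"
    # `tmux new-session -d -s NAME CMD` starts CMD in a detached session,
    # so the job keeps running even if the SSH connection drops.
    subprocess.run(
        ["tmux", "new-session", "-d", "-s", session, cmd], check=True
    )
    print(f"started {session}: {cmd}")
```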

Hope this information helps!