BenchmarkPDA

BenchmarkPDA is a Pytorch framework to benchmark Partial Domain Adaptation methods with different model selection strategies.

Prerequisites

  • pytorch==1.8.0
  • torchvision==0.9.0
  • torchaudio==0.8.0
  • cudatoolkit=11.1
  • pyyaml
  • scikit-learn
  • jupyterlab
  • prettytable
  • ipywidgets
  • tqdm
  • pandas
  • opencv
  • pot
  • cvxpy
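
A minimal environment sketch that matches these pins (the environment name, Python version, and conda channels below are assumptions, not taken from the repository):

conda create -n benchmarkpda python=3.8
conda activate benchmarkpda
conda install pytorch==1.8.0 torchvision==0.9.0 torchaudio==0.8.0 cudatoolkit=11.1 -c pytorch -c conda-forge
conda install pyyaml scikit-learn jupyterlab prettytable ipywidgets tqdm pandas opencv -c conda-forge
pip install pot cvxpy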

Datasets

The framework currently supports the following datasets:

Place them in the datasets folder. Make sure to place the image path lists in the image_list subfolder of the respective dataset folder.
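
For illustration, an Office-Home layout compatible with the description above could look roughly like this (the individual folder and file names are assumptions; use the image lists provided with the repository):

datasets/office-home/Art/...
datasets/office-home/Clipart/...
datasets/office-home/image_list/Art.txt
datasets/office-home/image_list/Clipart.txt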

Methods

The currently available methods are:

Step 1 - Hyper-Parameter Grid Search

To run the hyper-parameter search for the Office-Home dataset use

python hp_search_train_val.py --method METHOD --dset office-home --source_domain Art --target_domain Clipart

where METHOD should be one of the following: source_only_plus, pada, ba3us, ar, jumbot, mpot.
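
For example, a small shell loop can sweep one method over every Office-Home domain pair (Product and Real_World are the standard Office-Home domain names and are an assumption here; adjust them to match the image list files):

for SRC in Art Clipart Product Real_World; do
  for TGT in Art Clipart Product Real_World; do
    if [ "$SRC" != "$TGT" ]; then
      python hp_search_train_val.py --method ba3us --dset office-home --source_domain $SRC --target_domain $TGT
    fi
  done
done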

To run the hyper-parameter search for the VisDA dataset use

python hp_search_train_val.py --method METHOD --dset visda --source_domain train --target_domain validation

where METHOD should be one of the following: source_only_plus, pada, ba3us, ar, jumbot, mpot. The train domain consists of the synthetic images, while the validation domain corresponds to the real images.
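
Since VisDA has only the train→validation pair, one way to queue the whole search is to loop over the methods listed above:

for METHOD in source_only_plus pada ba3us ar jumbot mpot; do
  python hp_search_train_val.py --method $METHOD --dset visda --source_domain train --target_domain validation
done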

Step 2 - Select Best Hyper-Parameters

We select the best hyper-parameters using the Jupyter Notebook Step 2 - Select Hyper-Parameter Grid Search.ipynb. It includes code to recreate Tables 4 and 10.

Step 3 - Train Best Hyper-Parameters

To train the models with the chosen hyper-parameters, use

python train_hp_chosen.py --dset DATASET

where DATASET should be office-home or visda.
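
For instance, once the hyper-parameters have been selected in Step 2, both benchmarks can be trained back to back:

python train_hp_chosen.py --dset office-home
python train_hp_chosen.py --dset visda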

Step 4 - Collect results and generate tables

We gather all the results using the Jupyter Notebook Step 4 - Collect Results.ipynb. It includes code to generate Tables 1, 5, 6, 7 and 12.

License

This source code is released under the MIT license, included in this repository.