Layer Variational Analysis for Transfer Learning

We introduce a novel fine-tuning method, Layer Variational Analysis (LVA), for transfer learning. Three domain adaptation experiments are demonstrated below, and a toy sketch of the single-layer update idea follows the experiment list:

Experiments

  • [Exp. 1] Time Series Regression

  • [Exp. 2] Speech Enhancement (denoising)

  • [Exp. 3] Super Resolution (image deblurring)
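
For intuition, here is a minimal sketch of the single-layer update idea in the spirit of the Exp. 1 toy setting: rather than fine-tuning a single linear layer by gradient descent, its target-domain weights are obtained in closed form by least squares on the layer's inputs and the target outputs. The shapes, synthetic data, and the `lstsq`-based update are illustrative assumptions, not the repository's implementation of LVA.

```python
# Illustrative sketch only: closed-form update of one linear layer on
# target-domain data via least squares, as a stand-in for gradient-descent
# fine-tuning of that layer. Shapes and data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, n_target = 16, 1, 256

# Pretrained weights of the layer to be adapted (source domain).
W_src = rng.normal(size=(d_in, d_out))

# Target-domain data: inputs to the layer (e.g. features produced by the
# frozen earlier layers) and the desired outputs on the shifted task.
H_tgt = rng.normal(size=(n_target, d_in))
y_tgt = H_tgt @ W_src + 0.1 * rng.normal(size=(n_target, d_out))

# Closed-form least-squares solution for the adapted layer.
W_new, *_ = np.linalg.lstsq(H_tgt, y_tgt, rcond=None)

# The layer "variation" induced by the domain shift.
delta_W = W_new - W_src
print("||delta W|| =", np.linalg.norm(delta_W))
print("target MSE  =", np.mean((H_tgt @ W_new - y_tgt) ** 2))
```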

Datasets & Preprocessing

  • [Exp. 1] Requires no dataset
  • [Exp. 2] Download DNS-Challenge (or here) and use /Exp2/data_preprocessing/ for preprocessing.
  • [Exp. 3] Download CUFED (or here) and use /Exp3/preprocessing_SR_images/ for preprocessing.
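
As a rough picture of what the speech-enhancement preprocessing step produces, the sketch below converts a noisy/clean wav pair into paired log-magnitude spectrograms. The file names, sample rate, and STFT parameters are assumptions for illustration; the repository's actual pipelines live in /Exp2/data_preprocessing/ and /Exp3/preprocessing_SR_images/.

```python
# Hypothetical preprocessing sketch: noisy/clean wav pair -> paired
# log-magnitude spectrograms. Paths and STFT parameters are assumptions;
# the real pipeline is in /Exp2/data_preprocessing/.
import numpy as np
import librosa

SR = 16000            # assumed sample rate
N_FFT, HOP = 512, 256  # assumed STFT settings

def wav_to_logmag(path: str) -> np.ndarray:
    """Load a wav file and return its log-magnitude spectrogram."""
    y, _ = librosa.load(path, sr=SR)
    spec = librosa.stft(y, n_fft=N_FFT, hop_length=HOP)
    return np.log1p(np.abs(spec))

if __name__ == "__main__":
    noisy = wav_to_logmag("noisy/sample_0001.wav")   # hypothetical paths
    clean = wav_to_logmag("clean/sample_0001.wav")
    np.savez("sample_0001_pair.npz", noisy=noisy, clean=clean)
    print("noisy:", noisy.shape, "clean:", clean.shape)
```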

Pretraining on the Source Domain

  • [Exp. 1] pretraining.py
  • [Exp. 2] SE_pretraining.py
  • [Exp. 3] SRCNN_pretraining.py
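
For orientation, the following is a minimal sketch of what source-domain pretraining looks like in a setting like Exp. 1: a small regressor fitted to synthetic source data and saved as a checkpoint for later fine-tuning. The architecture, data, and checkpoint name are assumptions, not the contents of pretraining.py.

```python
# Hypothetical pretraining sketch for an Exp. 1-style setting: a small MLP
# regressor trained on synthetic source-domain data. Architecture, data,
# and checkpoint name are assumptions, not pretraining.py itself.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic source-domain regression data: y = sin(sum(x)) + noise.
X = torch.randn(2048, 16)
y = torch.sin(X.sum(dim=1, keepdim=True)) + 0.05 * torch.randn(2048, 1)

model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Checkpoint reused later for target-domain fine-tuning.
torch.save(model.state_dict(), "source_pretrained.pt")
print(f"final source-domain MSE: {loss.item():.4f}")
```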

Transfer Learning to the Target Domain & Fine-Tuned Model Comparison

  • [Exp. 1] GD_finetune_1layer.py & GD_vs_LVA_1layer.py
  • [Exp. 2] SE_finetuning_and_comparison.py
  • [Exp. 3] SRCNN_GD_finetuning.py & SRCNN_LVA_comparisons.py
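
The comparison scripts pit ordinary gradient-descent fine-tuning against the LVA update. Below is a minimal sketch of the gradient-descent side: only the last layer of a pretrained model is updated on target-domain data while the rest stays frozen. The model, synthetic data, and hyperparameters are assumptions, not the repository's scripts.

```python
# Hypothetical sketch of gradient-descent fine-tuning of a single layer on
# target-domain data, with all other parameters frozen. Model, data, and
# hyperparameters are assumptions, not GD_finetune_1layer.py itself.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Pretrained source model (would normally be loaded from a checkpoint).
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))

# Synthetic target-domain data with a shifted input distribution.
X_tgt = torch.randn(512, 16) + 0.5
y_tgt = torch.sin(X_tgt.sum(dim=1, keepdim=True)) + 0.05 * torch.randn(512, 1)

# Freeze everything except the last layer.
for p in model.parameters():
    p.requires_grad = False
last_layer = model[-1]
for p in last_layer.parameters():
    p.requires_grad = True

opt = torch.optim.SGD(last_layer.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X_tgt), y_tgt)
    loss.backward()
    opt.step()

print(f"target-domain MSE after GD fine-tuning: {loss.item():.4f}")
```

Comparing the resulting target-domain error against a closed-form layer update, as sketched earlier, is roughly the kind of comparison these scripts report.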

Prerequisites

Hardware

  • NVIDIA GPU with CUDA 11.0+