
Transfer Learning

Everything about Transfer Learning.

Papers | Tutorials | Research areas | Theory | Survey | Code | Dataset & benchmark

Thesis | Scholars | Contests | Journal/conference | Applications | Others | Contributing

Widely used by top conferences and journals:

@Misc{transferlearning.xyz,
  howpublished = {\url{http://transferlearning.xyz}},
  title = {Everything about Transfer Learning and Domain Adaptation},
  author = {Wang, Jindong and others}
}


Related Codes:


NOTE: You can open the code directly in GitHub Codespaces on the web and run it without downloading anything! Also, try github.dev.

0.Papers

Awesome transfer learning papers

  • Paperweekly: A website for recommending papers and reading paper notes

Latest papers:

Updated at 2024-02-26:

  • Unsupervised Domain Adaptation within Deep Foundation Latent Spaces [arxiv]
    • Domain adaptation using foundation models

Updated at 2024-01-26:

  • Facing the Elephant in the Room: Visual Prompt Tuning or Full Finetuning? [arxiv]

    • A comparison between visual prompt tuning and full fine-tuning
  • Out-of-Distribution Detection & Applications With Ablated Learned Temperature Energy [arxiv]

    • OOD detection with an ablated learned temperature energy score (see the energy-score sketch after this list)
  • LanDA: Language-Guided Multi-Source Domain Adaptation [arxiv]

    • Language-guided multi-source domain adaptation
  • AdaEmbed: Semi-supervised Domain Adaptation in the Embedding Space [arxiv]

    • Semi-supervised domain adaptation in the embedding space
  • Inter-Domain Mixup for Semi-Supervised Domain Adaptation [arxiv]

    • Inter-domain mixup for semi-supervised domain adaptation
  • Source-Free and Image-Only Unsupervised Domain Adaptation for Category Level Object Pose Estimation [arxiv]

    • Source-free and image-only unsupervised domain adaptation
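
For readers new to energy-based OOD detection (the "Ablated Learned Temperature Energy" entry above), the baseline it builds on is the standard energy score of Liu et al. (NeurIPS'20). Below is a minimal sketch of that baseline only; the paper's ablated/learned-temperature variant changes how the temperature is obtained, which is not shown here.

```python
import torch

def energy_ood_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    # Standard energy score for OOD detection (Liu et al., NeurIPS'20):
    # E(x) = -T * logsumexp(f(x) / T). Higher energy => more likely OOD.
    # The "ablated learned temperature" variant changes how T is set.
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)

# Usage: score a batch of classifier logits and threshold them.
logits = torch.randn(4, 10)        # outputs of any classifier
scores = energy_ood_score(logits)
is_ood = scores > 0.0              # the threshold must be tuned on in-distribution data
```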

Updated at 2024-01-17:

  • ICLR'24 spotlight Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks [arxiv]

    • A new research direction of transfer learning in the era of foundation models: how noise in the pre-training data affects downstream tasks
  • ICLR'24 Supervised Knowledge Makes Large Language Models Better In-context Learners [arxiv]

    • Small models help large language models achieve better OOD performance

Updated at 2024-01-16:

  • NeurIPS'23 Geodesic Multi-Modal Mixup for Robust Fine-Tuning [paper]

    • Geodesic mixup for robust fine-tuning (see the spherical-interpolation sketch after this list)
  • NeurIPS'23 Parameter and Computation Efficient Transfer Learning for Vision-Language Pre-trained Models [paper]

    • Parameter and computation efficient transfer learning by reinforcement learning
  • NeurIPS'23 Test-Time Distribution Normalization for Contrastively Learned Visual-language Models [paper]

    • Test-time distribution normalization for contrastively learned VLM
  • NeurIPS'23 A Closer Look at the Robustness of Contrastive Language-Image Pre-Training (CLIP) [paper]

    • A fine-grained analysis of CLIP robustness
  • NeurIPS'23 When Visual Prompt Tuning Meets Source-Free Domain Adaptive Semantic Segmentation [paper]

    • Source-free domain adaptation using visual prompt tuning
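
The geodesic mixup entry above interpolates embeddings along the unit hypersphere rather than along a straight line. A minimal sketch of spherical interpolation (slerp) between L2-normalized embeddings follows; it illustrates the general idea only, not the paper's exact training objective.

```python
import torch
import torch.nn.functional as F

def geodesic_mixup(x: torch.Tensor, y: torch.Tensor, lam: float) -> torch.Tensor:
    # Spherical interpolation (slerp) between unit-norm embeddings:
    # the mixed vector stays on the hypersphere, unlike vanilla (linear) mixup.
    x = F.normalize(x, dim=-1)
    y = F.normalize(y, dim=-1)
    cos = (x * y).sum(dim=-1, keepdim=True).clamp(-1 + 1e-7, 1 - 1e-7)
    omega = torch.acos(cos)                       # angle between embeddings
    return (torch.sin((1 - lam) * omega) * x
            + torch.sin(lam * omega) * y) / torch.sin(omega)

# e.g., mix image and text embeddings from a CLIP-style encoder pair
img, txt = torch.randn(8, 512), torch.randn(8, 512)
mixed = geodesic_mixup(img, txt, lam=0.5)         # result is still unit-norm
```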

Updated at 2024-01-08:

  • NeurIPS'23 CODA: Generalizing to Open and Unseen Domains with Compaction and Disambiguation [arxiv]
    • Open set domain generalization using extra classes

Updated at 2024-01-05:

  • CPAL'24 FIXED: Frustratingly Easy Domain Generalization with Mixup [arxiv]

    • Easy domain generalization with mixup (see the mixup sketch after this list)
  • SDM'24 Towards Optimization and Model Selection for Domain Generalization: A Mixup-guided Solution [arxiv]

    • Optimization and model selection for domain generalization
  • Leveraging SAM for Single-Source Domain Generalization in Medical Image Segmentation [arxiv]

    • SAM for single-source domain generalization
  • Multi-Source Domain Adaptation with Transformer-based Feature Generation for Subject-Independent EEG-based Emotion Recognition [arxiv]

    • Multi-source DA with Transformer-based feature generation
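
As context for the two mixup entries above: vanilla mixup (Zhang et al., ICLR'18) interpolates both inputs and labels, and applying it across source domains is a common domain generalization baseline. A minimal sketch, assuming one-hot labels and illustrative tensor shapes:

```python
import torch

def domain_mixup(x_a, y_a, x_b, y_b, alpha: float = 0.2):
    # Vanilla mixup across two source domains: interpolate inputs and
    # one-hot labels with a Beta-distributed mixing coefficient.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    return lam * x_a + (1 - lam) * x_b, lam * y_a + (1 - lam) * y_b

# Usage: batches drawn from two different source domains.
x_a, x_b = torch.randn(16, 3, 224, 224), torch.randn(16, 3, 224, 224)
y_a = torch.eye(10)[torch.randint(0, 10, (16,))]  # one-hot labels
y_b = torch.eye(10)[torch.randint(0, 10, (16,))]
x_mix, y_mix = domain_mixup(x_a, y_a, x_b, y_b)
```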

1.Introduction and Tutorials

Want to quickly learn transfer learning? See the tutorials below.


2.Transfer Learning Areas and Papers


3.Theory and Survey

Here are some articles on transfer learning theory and surveys.

Survey:

Theory:


4.Code

Unified codebases for:

More: see HERE and HERE for an instant run using Google Colab.
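
As a taste of what these codebases implement, here is a minimal sketch of the simplest transfer learning baseline: fine-tuning only the head of a pre-trained backbone on a new task. It uses standard torchvision APIs (assuming torchvision ≥ 0.13 for the weights argument) and is an illustration, not one of the repo's unified implementations.

```python
import torch
import torch.nn as nn
from torchvision import models

# Minimal fine-tuning baseline: reuse a pre-trained backbone, retrain the head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():                       # freeze pre-trained features
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 10)     # new 10-class head

optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 3, 224, 224)                    # stand-in for a target-domain batch
y = torch.randint(0, 10, (8,))
loss = criterion(model(x), y)
loss.backward()                                    # gradients flow only into the head
optimizer.step()
```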


5.Transfer Learning Scholars

Here are some transfer learning scholars and labs.

The full list, along with representative works, is here.

Please note that this list is far from complete. A full list can be seen here. Transfer learning is an active field; if you know of other scholars, please add them here.


6.Transfer Learning Theses

Here are some popular theses on transfer learning.

Download here (extraction code: txyz).


7.Datasets and Benchmarks

Please see HERE for the popular transfer learning datasets and benchmark results.

Commonly used public datasets, along with published results on them, are collected there.


8.Transfer Learning Challenges


Journals and Conferences

See here for a full list of related journals and conferences.


Applications

See HERE for transfer learning applications.



Other Resources


Contributing

If you are interested in contributing, please refer to HERE for contribution instructions.


Copyright notice

[Note] This GitHub repo can be used under the corresponding licenses. I want to emphasize that it may contain some PDFs or theses which I downloaded and which can only be used for academic purposes. The copyrights of these materials are owned by the corresponding publishers or organizations. All of this is for better academic research. If any authors or publishers have concerns, please contact me to delete or replace the material.