Transfer Learning

Everything about Transfer Learning. 迁移学习.

Papers | Tutorials | Research areas | Theory | Survey | Code | Dataset & benchmark

Thesis | Scholars | Contests | Journal/conference | Applications | Others | Contributing

Widely used by top conferences and journals:

@Misc{transferlearning.xyz,
howpublished = {\url{http://transferlearning.xyz}},   
title = {Everything about Transfer Learning and Domain Adaptation},  
author = {Wang, Jindong and others}  
}  


Related Codes:


NOTE: You can open the code directly in GitHub Codespaces on the web and run it without downloading anything! Also, try github.dev.

0.Papers (论文)

Awesome transfer learning papers (迁移学习文章汇总)

  • Paperweekly: a website for recommending and reading paper notes

Latest papers:

Updated at 2023-11-21:

  • A2XP: Towards Private Domain Generalization [arxiv]

    • Privacy-preserving domain generalization
  • Layer-wise Auto-Weighting for Non-Stationary Test-Time Adaptation [arxiv]

    • Automatic layer-wise weighting for test-time adaptation (TTA)
  • Domain Generalization by Learning from Privileged Medical Imaging Information [arxiv]

    • Domain generalization by learning from privileged medical imaging information

Updated at 2023-11-08:

  • SSL-DG: Rethinking and Fusing Semi-supervised Learning and Domain Generalization in Medical Image Segmentation [arxiv]

    • Combines semi-supervised learning with domain generalization
  • WACV'24 Learning Class and Domain Augmentations for Single-Source Open-Domain Generalization [arxiv]

    • Class and domain augmentations for single-source open-domain DG
  • Proposal-Level Unsupervised Domain Adaptation for Open World Unbiased Detector [arxiv]

    • Proposal-level unsupervised domain adaptation
  • Robust Fine-Tuning of Vision-Language Models for Domain Generalization [arxiv]

    • Robust fine-tuning for domain generalization

Updated at 2023-11-06:

  • NeurIPS 2023 Distilling Out-of-Distribution Robustness from Vision-Language Foundation Models [arxiv]

    • Distills OOD robustness from vision-language foundation models
  • UbiComp 2024 Optimization-Free Test-Time Adaptation for Cross-Person Activity Recognition [arxiv]

    • Test-time adaptation for cross-person activity recognition

1.Introduction and Tutorials (简介与教程)

Want to quickly learn transfer learning? See the tutorials below.
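Before diving into the tutorials, here is a minimal, self-contained sketch of the most common transfer-learning recipe: fine-tuning an ImageNet-pretrained backbone on a small target dataset. This is only an illustration, not code from this repository; the dataset path, class count, and hyperparameters are placeholders, and it assumes PyTorch with torchvision >= 0.13.

```python
# Minimal transfer-learning sketch: fine-tune a pretrained backbone on a
# new target dataset. Paths and hyperparameters are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_TARGET_CLASSES = 31  # e.g. Office-31 has 31 classes (placeholder)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Target-domain images arranged as ImageFolder: root/class_x/img.png
target_data = datasets.ImageFolder("path/to/target_domain", transform=preprocess)
loader = DataLoader(target_data, batch_size=32, shuffle=True)

# Load a backbone pretrained on ImageNet (the "source" knowledge) and
# replace its classification head for the target task.
model = models.resnet50(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, NUM_TARGET_CLASSES)

# Fine-tune everything with a small learning rate; alternatively, freeze
# the backbone and train only the new head.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```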


2.Transfer Learning Areas and Papers (研究领域与相关论文)


3.Theory and Survey (理论与综述)

Here are some survey and theory articles on transfer learning.

Survey (综述文章):

Theory (理论文章):


4.Code (代码)

Unified codebases for:

More: see HERE and HERE for an instant run using Google's Colab.
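To give a flavor of what the unified deep domain-adaptation codebases implement, here is a small, illustrative sketch of a typical training objective: a supervised classification loss on the labeled source domain plus an MMD-style discrepancy penalty that aligns source and target features. This is not the repository's actual code; the function names, the simple linear-MMD estimator, and the trade-off weight `lam` are simplifying assumptions.

```python
# Illustrative sketch (not the repo's code): deep domain adaptation
# objective = source classification loss + lambda * feature-alignment loss.
# The alignment term here is a simple linear MMD (squared distance
# between the mean source and mean target features).
import torch
import torch.nn.functional as F


def linear_mmd(source_feats: torch.Tensor, target_feats: torch.Tensor) -> torch.Tensor:
    # Distance between the mean feature vectors of the two domains.
    delta = source_feats.mean(dim=0) - target_feats.mean(dim=0)
    return (delta * delta).sum()


def adaptation_loss(source_logits, source_labels, source_feats, target_feats, lam=0.5):
    # Supervised loss on the labeled source domain.
    cls_loss = F.cross_entropy(source_logits, source_labels)
    # Unsupervised alignment between source and target features.
    transfer_loss = linear_mmd(source_feats, target_feats)
    return cls_loss + lam * transfer_loss


if __name__ == "__main__":
    # Toy check with random tensors standing in for a real model's outputs.
    src_feats, tgt_feats = torch.randn(8, 256), torch.randn(8, 256)
    logits, labels = torch.randn(8, 31), torch.randint(0, 31, (8,))
    print(adaptation_loss(logits, labels, src_feats, tgt_feats))
```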


5.Transfer Learning Scholars (著名学者)

Here are some transfer learning scholars and labs.

See here for the full list of scholars and their representative works.

Please note that this list is far from complete. A full list can be seen here. Transfer learning is an active field; if you know of other scholars who should be included, please add them here.


6.Transfer Learning Thesis (硕博士论文)

Here are some popular theses on transfer learning.

Download here (access code: txyz).


7.Datasets and Benchmarks (数据集与评测结果)

Please see HERE for the popular transfer learning datasets and benchmark results.

That page collects commonly used public datasets and the results reported on them by published papers.


8.Transfer Learning Challenges (迁移学习比赛)


Journals and Conferences

See here for a full list of related journals and conferences.


Applications (迁移学习应用)

See HERE for transfer learning applications.



Other Resources (其他资源)


Contributing (欢迎参与贡献)

If you are interested in contributing, please refer to HERE for contribution instructions.


Copyright notice

[Notes] This GitHub repo may be used under the corresponding licenses. I want to emphasize that it may contain some PDFs or theses, which I downloaded and which may only be used for academic purposes. The copyrights of these materials are owned by the corresponding publishers or organizations. All of this is intended to support academic research. If any of the authors or publishers have concerns, please contact me to delete or replace the materials.