
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, and tutorials.



Transfer Learning

Everything about Transfer Learning.

Papers | Tutorials | Research areas | Theory | Survey | Code | Dataset & benchmark

Thesis | Scholars | Contests | Journal/conference | Applications | Others | Contributing

Widely cited by papers at top conferences and in journals. To cite this repository:

@Misc{transferlearning.xyz,
  howpublished = {\url{http://transferlearning.xyz}},
  title = {Everything about Transfer Learning and Domain Adaptation},
  author = {Wang, Jindong and others}
}


Related code: [USB: unified semi-supervised learning benchmark] | [TorchSSL: a unified SSL library] | [PersonalizedFL: library for personalized federated learning] | [Activity recognition] | [Machine learning]


NOTE: You can directly open the code in GitHub Codespaces on the web and run it without downloading anything! Also, try github.dev.

0. Papers

Awesome transfer learning papers

  • Paperweekly: a website for recommending and reading paper notes

Latest papers:

Updated at 2022-12-07:

  • TMLR'22 A Unified Survey on Anomaly, Novelty, Open-Set, and Out-of-Distribution Detection: Solutions and Future Challenges [openreview]

    • A recent survey on OOD/anomaly detection
  • NeurIPS'18 A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks [paper]

    • Using class-conditional distributions for OOD detection
  • ICLR'22 Discrete Representations Strengthen Vision Transformer Robustness [arxiv]

    • Embedding discrete representations in ViT to strengthen OOD robustness

Updated at 2022-12-02:

  • CONDA: Continual Unsupervised Domain Adaptation Learning in Visual Perception for Self-Driving Cars [arxiv]

    • Continual domain adaptation for self-driving cars
  • Finetune like you pretrain: Improved finetuning of zero-shot vision models [arxiv]

    • Improved fine-tuning of zero-shot models

Updated at 2022-11-25:

  • Robust Mean Teacher for Continual and Gradual Test-Time Adaptation [arxiv]

    • Mean teacher for test-time adaptation
  • Learning to Learn Domain-invariant Parameters for Domain Generalization [arxiv]

    • Meta-learning domain-invariant parameters for domain generalization
  • HMOE: Hypernetwork-based Mixture of Experts for Domain Generalization [arxiv]

    • Hypernetwork-based ensembling for domain generalization
  • The Evolution of Out-of-Distribution Robustness Throughout Fine-Tuning [arxiv]

    • A systematic study of OOD robustness during fine-tuning

Updated at 2022-11-21:

  • GLUE-X: Evaluating Natural Language Understanding Models from an Out-of-distribution Generalization Perspective [arxiv]

    • Proposes GLUE-X to evaluate OOD generalization on NLP data
  • CVPR'22 Delving Deep Into the Generalization of Vision Transformers Under Distribution Shifts [arxiv]

    • Evaluating vision transformers' generalization under distribution shifts
  • NeurIPS'22 Models Out of Line: A Fourier Lens on Distribution Shift Robustness [arxiv]

    • A Fourier lens on distribution shift robustness
  • CVPR'22 Does Robustness on ImageNet Transfer to Downstream Tasks? [arxiv]

    • Does robustness on ImageNet transfer to downstream tasks?

Updated at 2022-11-14:

  • Normalization Perturbation: A Simple Domain Generalization Method for Real-World Domain Shifts [arxiv]

    • Normalization perturbation for domain generalization
  • FIXED: Frustratingly Easy Domain Generalization Using Mixup [arxiv]

    • Domain generalization using Mixup
  • Learning to Learn Domain-invariant Parameters for Domain Generalization [arxiv]

    • Learning to learn domain-invariant parameters for domain generalization

Updated at 2022-11-07:

  • NeurIPS'22 Improved Fine-Tuning by Better Leveraging Pre-Training Data [openreview]

    • Using pre-training data for fine-tuning
  • NeurIPS'22 Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning [openreview]

    • Adaptive contrastive learning for source-free domain adaptation
  • NeurIPS'22 LOG: Active Model Adaptation for Label-Efficient OOD Generalization [openreview]

    • Model adaptation for label-efficient OOD generalization
  • NeurIPS'22 MetaTeacher: Coordinating Multi-Model Domain Adaptation for Medical Image Classification [openreview]

    • Multi-model domain adaptation for medical image classification
  • NeurIPS'22 Domain Adaptation under Open Set Label Shift [openreview]

    • Domain adaptation under open-set label shift

Updated at 2022-11-03:

  • NeurIPS'22 Domain Generalization without Excess Empirical Risk [openreview]

    • Domain generalization without excess empirical risk
  • NeurIPS'22 FedSR: A Simple and Effective Domain Generalization Method for Federated Learning [openreview]

    • FedSR for domain generalization in federated learning
  • NeurIPS'22 Probable Domain Generalization via Quantile Risk Minimization [openreview]

    • Domain generalization via quantile risk minimization
  • NeurIPS'22 Beyond Not-Forgetting: Continual Learning with Backward Knowledge Transfer [arxiv]

    • Continual learning with backward knowledge transfer
  • NeurIPS'22 Test Time Adaptation via Conjugate Pseudo-labels [openreview]

    • Test-time adaptation with conjugate pseudo-labels
  • NeurIPS'22 Your Out-of-Distribution Detection Method is Not Robust! [openreview]

    • OOD detection methods are not robust

1. Introduction and Tutorials

Want to quickly learn transfer learning? See the tutorials below.
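Before diving into the tutorials, it may help to see the most common transfer-learning recipe in code: reuse a frozen pretrained backbone and train only a new task-specific head on the target data. The sketch below uses a toy PyTorch MLP as a stand-in for a real pretrained network (in practice you would load, e.g., a torchvision ResNet with ImageNet weights); all names and sizes here are illustrative:

```python
import torch
import torch.nn as nn

# Stand-in "pretrained" backbone; in practice, load real pretrained weights.
backbone = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
)
for p in backbone.parameters():
    p.requires_grad = False  # freeze: reuse source-domain features as-is

head = nn.Linear(64, 10)  # new task-specific classifier, trained from scratch
model = nn.Sequential(backbone, head)

# Only the head's parameters receive gradients during fine-tuning.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=1e-2
)
out = model(torch.randn(4, 32))  # batch of 4 target-domain samples
```

When the target dataset is large enough, the usual next step is to unfreeze some backbone layers and train them with a smaller learning rate than the head.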


2. Transfer Learning Areas and Papers


3. Theory and Survey

Here are some articles on transfer learning theory, along with surveys of the field.

Surveys:

Theory:
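For orientation, much of this theory builds on the classic bound of Ben-David et al., which limits the target-domain error of a hypothesis $h$ by its source-domain error, the divergence between the two domains, and the error of the best joint hypothesis:

```latex
\epsilon_T(h) \;\le\; \epsilon_S(h)
  \;+\; \tfrac{1}{2}\, d_{\mathcal{H}\Delta\mathcal{H}}(\mathcal{D}_S, \mathcal{D}_T)
  \;+\; \lambda,
\qquad
\lambda = \min_{h' \in \mathcal{H}} \big[ \epsilon_S(h') + \epsilon_T(h') \big]
```

Intuitively, adaptation can only succeed when the two domains are close under the hypothesis class and some single hypothesis performs well on both.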


4. Code

Unified codebases for:

More: see HERE and HERE for an instant run using Google Colab.
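Many deep domain adaptation methods implemented in these codebases (e.g. DAN-style networks) align source and target features by adding a Maximum Mean Discrepancy (MMD) term to the training loss. A minimal sketch of a biased MMD estimate with an RBF kernel (the function name and the fixed bandwidth are illustrative choices, not taken from any particular codebase):

```python
import torch

def rbf_mmd(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Biased estimate of squared MMD between samples x and y (RBF kernel)."""
    def kernel(a, b):
        # Pairwise squared Euclidean distances -> Gaussian kernel values.
        d2 = torch.cdist(a, b) ** 2
        return torch.exp(-d2 / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

source = torch.randn(64, 8)        # source-domain features
target = torch.randn(64, 8) + 2.0  # shifted target-domain features
loss = rbf_mmd(source, target)     # added to the task loss during training
```

The estimate is zero when both batches come from the same distribution and grows with the shift, so minimizing it pulls the two feature distributions together; real implementations typically use multiple kernel bandwidths rather than a single fixed sigma.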


5. Transfer Learning Scholars

Here are some transfer learning scholars and labs.

See here for the full list and their representative works.

Please note that this list is far from complete. A full list can be seen here. Transfer learning is an active field; if you know of other scholars, please add them here.


6. Transfer Learning Theses

Here are some popular theses on transfer learning.

See here (extraction code: txyz).


7. Datasets and Benchmarks

Please see HERE for the popular transfer learning datasets and benchmark results.

Commonly used public datasets, along with experimental results reported by published papers on them, are collected there.


8. Transfer Learning Challenges


Journals and Conferences

See here for a full list of related journals and conferences.


Applications

See HERE for transfer learning applications.



Other Resources


Contributing

If you are interested in contributing, please refer to HERE for instructions.


Copyright notice

Note: this GitHub repo can be used under the corresponding licenses. I want to emphasize that it may contain some PDFs or theses that I downloaded, which may only be used for academic purposes. The copyrights of these materials are owned by the corresponding publishers or organizations, and everything here is intended to support academic research. If any authors or publishers have concerns, please contact me to delete or replace the materials.