Transfer Learning & Multi-task Learning Papers

2016

  • A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks. K.Hashimoto et al. NIPS 2016 Continual Learning and Deep Networks Workshop. [pdf] (Multi-task learning in NLP; tasks are stacked hierarchically, with higher-level tasks depending on the outputs of lower-level ones)
  • Cross-stitch Networks for Multi-task Learning. I.Misra et al. arXiv. [pdf] (Uses a shared cross-stitch unit to combine the activations of multiple task-specific networks)
  • Deep Neural Networks with Massive Learned Knowledge. Z.Hu et al. EMNLP. [pdf] (NLP; regularizes a ConvNet by jointly training a teacher-student model with an additional linguistic-knowledge expert)

2015

  • Distilling the Knowledge in a Neural Network. G.Hinton et al. arXiv. [pdf] (Transfer by distilling an ensemble of neural networks into a single smaller model)
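The distillation objective from the Hinton et al. paper above mixes a soft loss (cross-entropy against the teacher's temperature-softened outputs, scaled by T²) with the usual hard-label cross-entropy. A minimal NumPy sketch of that objective; the function names and the default values of T and alpha are illustrative, not from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled, numerically stable softmax.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft loss: cross-entropy between softened teacher and student
    # distributions, scaled by T^2 as recommended in the paper so that
    # gradient magnitudes stay comparable across temperatures.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -(p_teacher * np.log(p_student + 1e-12)).sum(axis=-1).mean() * T * T
    # Hard loss: standard cross-entropy against the true labels.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
    # alpha balances imitating the teacher against fitting the labels.
    return alpha * soft + (1 - alpha) * hard
```

In practice the teacher is a large model or an ensemble whose logits are precomputed, and only the student is trained on this combined loss.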

2014

  • How transferable are features in deep neural networks? J.Yosinski et al. NIPS. [pdf] (How to transfer features optimally in the deep ConvNet case)

2012

  • Deep Learning of Representations for Unsupervised and Transfer Learning. Y.Bengio. ICML Unsupervised and Transfer Learning. [pdf] (A tutorial focused specifically on deep learning)

2010

  • A Survey on Transfer Learning. SJ.Pan, Q.Yang. IEEE Transactions on Knowledge and Data Engineering. [pdf] (A good general overview of many kinds of transfer learning, not tied to any particular machine learning method)

2008

  • A unified architecture for natural language processing: Deep neural networks with multitask learning. R.Collobert, J.Weston. Proceedings of the 25th international conference on Machine learning. [pdf] (Multi-task learning in NLP; the tasks are trained independently but knowledge is transferred via shared embedding vectors)