Jointly train Question Answering and Multiple Choice
tmquan opened this issue · 1 comment
🚀 Feature: Multi-task NLP model
In an IELTS exam paper, there are several types of questions, such as Question Answering and Multiple Choice.
The current implementation of lightning-transformers works well for a single task, but I wonder whether there is a way to jointly train the two tasks at the same time. Because the context is shared between the two tasks, sharing the encoder would be beneficial.
Alternatives
I found a reference that does this directly with HuggingFace Transformers, but I don't know how to structure it to work with lightning-transformers:
https://colab.research.google.com/github/zphang/zphang.github.io/blob/master/files/notebooks/Multi_task_Training_with_Transformers_NLP.ipynb#scrollTo=xW8bnTgCsx5c
Hi @tmquan, from the question, I guess this is not directly available in lightning-transformers as of now, but based on the Colab notebook you shared and the lightning-transformers code base, I think the following approach could help:
- You can inherit from the `Seq2SeqTransformer` class, as mentioned here, and modify the arguments during initialization and the common step.
- Modifying this step with some pre-processing, model definition, metrics calculation, and post-processing would make it straightforward to jointly train Question Answering and Multiple Choice (a multi-task NLP model). A rough sketch of the shared-encoder idea follows this list.
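To make the idea concrete, here is a minimal sketch in plain PyTorch Lightning rather than the actual lightning-transformers task classes: one shared encoder feeding a QA head and a multiple-choice head. The class name `MultiTaskTransformer`, the `"task"` key in the batch, and the head shapes are assumptions for illustration, not part of the library.

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl
from transformers import AutoModel


class MultiTaskTransformer(pl.LightningModule):
    """Hypothetical sketch: one shared encoder, two task-specific heads."""

    def __init__(self, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)  # shared context encoder
        hidden = self.encoder.config.hidden_size
        self.qa_head = nn.Linear(hidden, 2)   # start/end logits per token
        self.mc_head = nn.Linear(hidden, 1)   # one score per answer choice

    def training_step(self, batch, batch_idx):
        # Assumption: the dataloader tags each batch with a "task" key.
        task = batch.pop("task")
        loss = self._qa_step(batch) if task == "qa" else self._mc_step(batch)
        self.log(f"{task}_loss", loss)
        return loss

    def _qa_step(self, batch):
        hidden = self.encoder(
            input_ids=batch["input_ids"],
            attention_mask=batch["attention_mask"],
        ).last_hidden_state                                 # (batch, seq_len, hidden)
        start_logits, end_logits = self.qa_head(hidden).split(1, dim=-1)
        loss_fn = nn.CrossEntropyLoss()
        return (
            loss_fn(start_logits.squeeze(-1), batch["start_positions"])
            + loss_fn(end_logits.squeeze(-1), batch["end_positions"])
        ) / 2

    def _mc_step(self, batch):
        # input_ids: (batch, num_choices, seq_len) -> flatten choices for the encoder
        bsz, num_choices, seq_len = batch["input_ids"].shape
        hidden = self.encoder(
            input_ids=batch["input_ids"].view(-1, seq_len),
            attention_mask=batch["attention_mask"].view(-1, seq_len),
        ).last_hidden_state[:, 0]                           # [CLS] per (example, choice)
        logits = self.mc_head(hidden).view(bsz, num_choices)
        return nn.CrossEntropyLoss()(logits, batch["labels"])

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=3e-5)
```

At training time you would interleave batches from a QA dataloader and a multiple-choice dataloader (for example via Lightning's support for multiple train dataloaders), each tagged with its task key, so both heads get updated while the encoder weights stay shared.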
I hope this helps.