SinclairCoder/Instruction-Tuning-Papers

Paper Suggestion for "Dynosaur: A Dynamic Growth Paradigm for Instruction-Tuning Data Curation"

Closed this issue · 2 comments

Thank you for maintaining such a comprehensive survey for instruction tuning!

Recently, we published a paper, Dynosaur: A Dynamic Growth Paradigm for Instruction-Tuning Data Curation (arXiv: https://arxiv.org/abs/2305.14327, Project page: https://dynosaur-it.github.io/). We introduce Dynosaur, a dynamic growth paradigm for instruction-tuning data curation. Building on the metadata of NLP datasets, we use LLMs to generate multiple task instructions applicable to various NLP datasets and to determine the relevant data fields for constructing instruction-tuning data.

Would you consider adding our Dynosaur paper to the list? Thank you so much!

Of course! It is solid work! I have added it.

Thank you so much!!