tinyroberta-distillation-qa-es

Code for "Language Model Knowledge Distillation for Efficient Question Answering in Spanish" (ICLR 2024 Tiny Papers)

Primary language: Python
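
Below is a minimal, illustrative sketch of a response-based knowledge distillation objective for extractive question answering, of the kind the paper title refers to. It is an assumption-laden example, not the repository's actual training code: the function name, the teacher/student start/end span logits, and the loss weighting are all hypothetical placeholders.

```python
# Hedged sketch: blended hard-label / soft-label distillation loss for a
# SQuAD-style QA head. Not taken from this repository's code.
import torch
import torch.nn.functional as F


def qa_distillation_loss(student_start, student_end,
                         teacher_start, teacher_end,
                         start_positions, end_positions,
                         temperature: float = 2.0,
                         alpha: float = 0.5) -> torch.Tensor:
    """Combine cross-entropy on gold spans with KL against the teacher."""
    # Hard-label loss on the gold answer span (start and end averaged).
    ce = 0.5 * (F.cross_entropy(student_start, start_positions)
                + F.cross_entropy(student_end, end_positions))

    # Soft-label loss: KL between temperature-softened teacher and student
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    kl_start = F.kl_div(F.log_softmax(student_start / temperature, dim=-1),
                        F.softmax(teacher_start / temperature, dim=-1),
                        reduction="batchmean")
    kl_end = F.kl_div(F.log_softmax(student_end / temperature, dim=-1),
                      F.softmax(teacher_end / temperature, dim=-1),
                      reduction="batchmean")
    kd = 0.5 * (kl_start + kl_end) * temperature ** 2

    return alpha * ce + (1.0 - alpha) * kd


if __name__ == "__main__":
    # Dummy tensors standing in for model outputs over a tokenized context.
    batch, seq_len = 4, 64
    s_start, s_end = torch.randn(batch, seq_len), torch.randn(batch, seq_len)
    t_start, t_end = torch.randn(batch, seq_len), torch.randn(batch, seq_len)
    gold_start = torch.randint(0, seq_len, (batch,))
    gold_end = torch.randint(0, seq_len, (batch,))
    print(qa_distillation_loss(s_start, s_end, t_start, t_end,
                               gold_start, gold_end))
```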
