
Hugging Face-based implementation of an open question answering model trained on the NewsQA dataset.


BERT vs T5 for question answering

Comparison of the performance of two NLP models on question answering: BERT (extractive) and T5 (generative).
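A minimal sketch of how the two approaches differ, assuming the Hugging Face `transformers` pipelines; the checkpoints named below (`deepset/bert-base-cased-squad2`, `t5-small`) are illustrative public models, not necessarily the ones trained in this repo:

```python
def t5_qa_prompt(question: str, context: str) -> str:
    """T5 casts QA as text-to-text generation: question and context are
    packed into one prompt and the answer is decoded freely (so it need
    not be a substring of the context)."""
    return f"question: {question} context: {context}"

if __name__ == "__main__":
    # Import here so the helper above stays dependency-free.
    from transformers import pipeline

    context = "NewsQA is a question answering dataset built from CNN news articles."
    question = "What is NewsQA built from?"

    # Extractive (BERT-style): the model predicts start/end positions of a
    # span inside the context, so the answer is always a literal substring.
    extractive = pipeline("question-answering",
                          model="deepset/bert-base-cased-squad2")
    print(extractive(question=question, context=context)["answer"])

    # Generative (T5-style): the model generates the answer token by token
    # conditioned on the prompt.
    generative = pipeline("text2text-generation", model="t5-small")
    print(generative(t5_qa_prompt(question, context))[0]["generated_text"])
```

The extractive model returns a span plus a confidence score, while the generative model can paraphrase or abstain in free text, which is the core trade-off this project measures on NewsQA.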

Installation

Requirements: Python 3.6.9

git clone https://github.com/nunziati/bert-vs-t5-for-question-answering/
cd bert-vs-t5-for-question-answering
python -m pip install -r requirements.txt