bert-fine-tuning-for-qa

There are 2 repositories under the bert-fine-tuning-for-qa topic.

  • Distil-Bert

    DistilBERT is a distilled version of BERT (Bidirectional Encoder Representations of Transformers), known for its efficiency and reduced computational requirements while retaining most of BERT's language understanding capability, which makes it a practical choice for question answering (a usage sketch follows this entry).

    Language: Python
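    A minimal sketch of running extractive question answering with DistilBERT via the Hugging Face transformers pipeline. It assumes the publicly available distilbert-base-cased-distilled-squad checkpoint rather than this repository's own fine-tuned weights, and is illustrative, not the repository's code.

      # Extractive QA with a DistilBERT checkpoint already fine-tuned on SQuAD.
      # Assumption: "distilbert-base-cased-distilled-squad" is used in place of
      # the repository's own model.
      from transformers import pipeline

      qa = pipeline(
          "question-answering",
          model="distilbert-base-cased-distilled-squad",
      )

      result = qa(
          question="What is DistilBERT distilled from?",
          context="DistilBERT is a smaller, faster model distilled from BERT.",
      )

      # The pipeline returns a dict with the predicted "answer", a confidence
      # "score", and the character-level "start"/"end" offsets in the context.
      print(result["answer"], result["score"])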
  • Q-A-Model-Bert-Base-Cased

    The question answering model in this repository is built on BERT (Bidirectional Encoder Representations from Transformers), a transformer-based architecture widely used for natural language processing tasks; here the bert-base-cased checkpoint is fine-tuned for extractive question answering (a fine-tuning sketch follows this entry).

    Language: Python
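    A minimal sketch of one fine-tuning step for bert-base-cased on extractive question answering, using Hugging Face transformers with PyTorch. This is not the repository's actual training code; the example question, context, learning rate, and answer-span token indices are hypothetical placeholders.

      # One fine-tuning step for extractive QA: the model predicts start and end
      # token positions of the answer span and computes cross-entropy loss
      # against the gold positions.
      import torch
      from transformers import AutoTokenizer, AutoModelForQuestionAnswering

      tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
      model = AutoModelForQuestionAnswering.from_pretrained("bert-base-cased")

      question = "Who wrote the novel?"
      context = "The novel was written by Jane Austen in 1813."

      # Encode the question and context as a single sequence pair.
      inputs = tokenizer(question, context, return_tensors="pt", truncation=True)

      # Hypothetical gold answer span, given as token indices. In a real
      # pipeline these come from mapping the dataset's character offsets to
      # token positions (e.g. via return_offsets_mapping).
      start_positions = torch.tensor([8])
      end_positions = torch.tensor([9])

      optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

      # Passing the gold positions makes the model return the QA loss.
      outputs = model(
          **inputs,
          start_positions=start_positions,
          end_positions=end_positions,
      )
      outputs.loss.backward()
      optimizer.step()
      optimizer.zero_grad()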