bert-fine-tuning-for-qa

There are 2 repositories under the bert-fine-tuning-for-qa topic.

  • amiruzzaman1/Q-A-Model-Bert-Base-Cased

    A question-answering model built on BERT (Bidirectional Encoder Representations from Transformers), a transformer-based architecture for natural language processing tasks. A minimal usage sketch appears after this list.

    Language: Python
  • amiruzzaman1/Distil-Bert

    A question-answering model built on DistilBERT, a distilled version of BERT (Bidirectional Encoder Representations from Transformers). DistilBERT is known for its efficiency and reduced computational requirements while retaining much of BERT's language-understanding capability.

    Language: Python
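
Below is a minimal sketch of extractive question answering with a BERT-family checkpoint, using the Hugging Face transformers pipeline. The model names shown are public SQuAD-fine-tuned checkpoints from the Hugging Face Hub, used here only for illustration; they are assumptions, not the weights from the repositories above, which you would substitute in.

    from transformers import pipeline

    # Load an extractive QA pipeline. The checkpoint here is a public
    # SQuAD-fine-tuned BERT model, assumed for illustration; swap in
    # "distilbert-base-cased-distilled-squad" for the DistilBERT variant,
    # or point `model` at a locally fine-tuned checkpoint.
    qa = pipeline(
        "question-answering",
        model="bert-large-uncased-whole-word-masking-finetuned-squad",
    )

    # The pipeline extracts the answer span from the given context.
    result = qa(
        question="What does BERT stand for?",
        context=(
            "BERT (Bidirectional Encoder Representations from Transformers) "
            "is a transformer-based architecture for natural language "
            "processing tasks."
        ),
    )
    print(result["answer"], result["score"])

The pipeline returns a dict with the extracted answer text, a confidence score, and the start/end character offsets of the span within the context.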