A PyTorch implementation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
The Transformer encoder is reused from my own Transformer implementation.
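Since this README doesn't show the encoder's interface, here is a minimal sketch of a BERT-style encoder for orientation. It uses `torch.nn`'s built-in Transformer layers rather than this repository's own encoder, and all class names, parameter names, and default sizes below are illustrative assumptions, not this project's API.

```python
# Minimal BERT-style encoder sketch (illustrative only; not this repo's API).
import torch
import torch.nn as nn


class BertEmbedding(nn.Module):
    """Token + position + segment embeddings, summed as in the BERT paper."""

    def __init__(self, vocab_size, d_model, max_len=512, n_segments=2, dropout=0.1):
        super().__init__()
        self.token = nn.Embedding(vocab_size, d_model)
        self.position = nn.Embedding(max_len, d_model)
        self.segment = nn.Embedding(n_segments, d_model)
        self.norm = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, token_ids, segment_ids):
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.token(token_ids) + self.position(positions) + self.segment(segment_ids)
        return self.dropout(self.norm(x))


class BertEncoder(nn.Module):
    """Embeddings followed by a stack of bidirectional Transformer encoder layers."""

    def __init__(self, vocab_size, d_model=768, n_layers=12, n_heads=12, dropout=0.1):
        super().__init__()
        self.embedding = BertEmbedding(vocab_size, d_model, dropout=dropout)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model,
            nhead=n_heads,
            dim_feedforward=4 * d_model,  # BERT uses a 4x feed-forward expansion
            dropout=dropout,
            activation="gelu",
            batch_first=True,
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, token_ids, segment_ids, pad_mask=None):
        # pad_mask: (batch, seq_len) bool tensor, True at padding positions.
        x = self.embedding(token_ids, segment_ids)
        return self.encoder(x, src_key_padding_mask=pad_mask)


if __name__ == "__main__":
    model = BertEncoder(vocab_size=30522)  # 30522 = BERT-base WordPiece vocab size
    tokens = torch.randint(0, 30522, (2, 16))
    segments = torch.zeros(2, 16, dtype=torch.long)
    print(model(tokens, segments).shape)  # torch.Size([2, 16, 768])
```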
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Devlin et al.
- Attention Is All You Need, Vaswani et al.
- BERT, Google Research
- BERT-pytorch, Junseong Kim
- Transformers, Hugging Face
- This repository is developed and maintained by Yonghee Cheon (yonghee.cheon@gmail.com).
- It can be found here: https://github.com/yonghee12/bert_torch
- LinkedIn profile: https://www.linkedin.com/in/yonghee-cheon-7b90b116a/