tcbert

BERT for Topcoder

This version of BERT is intended for use in Topcoder Marathon Matches.

The source code comes from the following repository: Autocoding Injury Narratives with BERT.

Some notes excerpted from the BERT README follow:

What license is this library released under?

All code and models are released under the Apache 2.0 license. See the LICENSE file for more information.

How do I cite BERT?

For now, cite the arXiv paper:

@article{devlin2018bert,
  title={BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding},
  author={Devlin, Jacob and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina},
  journal={arXiv preprint arXiv:1810.04805},
  year={2018}
}

If we submit the paper to a conference or journal, we will update the BibTeX.

Disclaimer

This is not an official Google product.

Contact information

For help or issues using BERT, please submit a GitHub issue.

For personal communication related to BERT, please contact Jacob Devlin (jacobdevlin@google.com), Ming-Wei Chang (mingweichang@google.com), or Kenton Lee (kentonl@google.com).