Law-OMNI-BERT-Project

Directly applying transfer learning with a general-domain pre-trained model such as BERT yields poor accuracy in domain-specific areas like law, because the word distribution shifts from general-domain corpora to domain-specific corpora. In this project, we demonstrate how the pre-trained language model BERT can be adapted to additional domains, such as contract law or court judgments.
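To make the word-distribution shift concrete, here is a minimal toy sketch (not part of the project itself): it measures what fraction of tokens in a legal-domain snippet also occur in a general-domain snippet. The two corpus strings and the `vocab_overlap` helper are illustrative assumptions; a real comparison would use full corpora and a subword tokenizer.

```python
from collections import Counter

def vocab_overlap(general: str, domain: str) -> float:
    """Fraction of domain-text tokens that also appear in the general text."""
    gen_vocab = set(general.lower().split())
    dom_tokens = domain.lower().split()
    hits = sum(1 for tok in dom_tokens if tok in gen_vocab)
    return hits / len(dom_tokens)

# Toy general-domain and legal-domain snippets (illustrative only).
general_corpus = "the cat sat on the mat and looked at the dog"
legal_corpus = "the lessee shall indemnify the lessor against any encumbrance"

# Low overlap hints that legal terms are rare or absent in general text,
# which is one reason continued pre-training on in-domain corpora helps.
print(f"{vocab_overlap(general_corpus, legal_corpus):.2f}")  # → 0.22
```

In practice this gap motivates continued pre-training of BERT on in-domain text (contracts, judgments) before fine-tuning on downstream legal tasks.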
