bert-cosine-sim

Fine-tune BERT to generate sentence embeddings for cosine similarity. Most of the code is copied from Hugging Face's BERT project.

Download data and pre-trained model for fine-tuning

python prerun.py downloads, extracts, and saves the pre-trained model and the training data (STS-B) into the relevant folders; after that, you can simply modify the hyperparameters in run.sh and start fine-tuning.
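For example (assuming run.sh is executed with bash; the hyperparameters themselves live in the script):

python prerun.py   # download and extract the pre-trained model and STS-B data
bash run.sh        # fine-tune with the hyperparameters set in run.sh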

Model and Fine-tuning

Add a fully connected layer with a tanh activation on top of the CLS token to generate the sentence embedding (don't add dropout on this layer). Fine-tune the model on the STS-B dataset by minimizing a cosine similarity loss (a sketch of this loss follows the model code below). With an embedding size of 1024 and 20 epochs of training, it achieves a Pearson correlation of 0.83 on the dev set.

# Imports assume the pytorch-pretrained-bert package this code is based on.
import torch
import torch.nn as nn
from pytorch_pretrained_bert.modeling import BertModel, BertPreTrainedModel

class BertPairSim(BertPreTrainedModel):
    def __init__(self, config, emb_size=1024):
        super(BertPairSim, self).__init__(config)
        self.emb_size = emb_size
        self.bert = BertModel(config)
        # FC layer + tanh on top of the pooled CLS output; no dropout here
        self.emb = nn.Linear(config.hidden_size, emb_size)
        self.activation = nn.Tanh()
        self.cos_fn = torch.nn.CosineSimilarity(dim=1, eps=1e-6)
        self.apply(self.init_bert_weights)

    def calcSim(self, emb1, emb2):
        # Cosine similarity between two batches of sentence embeddings
        return self.cos_fn(emb1, emb2)

    def forward(self, input_ids, attention_mask):
        # pooled_output is the CLS token representation after BERT's pooler
        _, pooled_output = self.bert(input_ids, None, attention_mask,
                                     output_all_encoded_layers=False)
        emb = self.activation(self.emb(pooled_output))
        return emb