Fine-tuned BERT-base-uncased (Bidirectional Encoder Representations from Transformers) language model using low-rank adaptation (LoRA) for detecting semantic equivalence between two sentences (paraphrase detection). The model was fine-tuned on the GLUE MRPC dataset from Hugging Face, training only an additional ~1% of the base model's parameters.
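
A minimal sketch of why low-rank adaptation keeps the trainable parameter budget this small: instead of updating a full weight matrix, LoRA trains two small low-rank factors and adds their product to the frozen weight. The rank and scaling values below are illustrative assumptions, not the exact training configuration of this model:

```python
import numpy as np

d = 768      # BERT-base hidden size
r = 8        # assumed LoRA rank (illustrative)
alpha = 16   # assumed LoRA scaling factor (illustrative)

rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))          # frozen pretrained weight (not trained)
A = rng.standard_normal((r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection, zero-init

# Effective weight used in the forward pass: W + (alpha / r) * B @ A.
# With B initialized to zero, training starts exactly at the pretrained weights.
W_eff = W + (alpha / r) * (B @ A)

full_params = W.size            # parameters in the full matrix (589,824)
lora_params = A.size + B.size   # trainable LoRA parameters (12,288, ~2% of this matrix)
print(f"trainable fraction for this matrix: {lora_params / full_params:.4f}")
```

Applied only to selected attention matrices across BERT's 12 layers, this is how the overall trainable fraction can land near 1% of the full model's parameters.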