In this project we address Word-in-Context (WiC) disambiguation as a binary classification task: given two sentences, we determine whether the indicated polysemous target words carry the same meaning or not. We experimented with a word-level approach (MLP + ReLU) and a sequence-encoding one (LSTMs), both built on top of GloVe static word embeddings.
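To make the word-level setup concrete, here is a minimal sketch of that approach: the GloVe vectors of the two target words are concatenated and passed through an MLP with a ReLU hidden layer and a sigmoid output for the binary decision. All names, layer sizes, and the random stand-in weights below are illustrative assumptions, not the project's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 50   # assumed GloVe dimensionality
HIDDEN = 64    # assumed hidden-layer size

# Stand-ins for trained parameters (random, for the sketch only).
W1 = rng.normal(scale=0.1, size=(2 * EMB_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, 1))
b2 = np.zeros(1)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def same_sense_prob(emb_a, emb_b):
    """P(same meaning) for the embeddings of the two target words."""
    x = np.concatenate([emb_a, emb_b])   # (2 * EMB_DIM,)
    h = relu(x @ W1 + b1)                # MLP hidden layer + ReLU
    return float(sigmoid(h @ W2 + b2)[0])  # score in (0, 1)

# Two fake "GloVe" vectors standing in for the target word in each sentence.
v1 = rng.normal(size=EMB_DIM)
v2 = rng.normal(size=EMB_DIM)
p = same_sense_prob(v1, v2)
print(0.0 < p < 1.0)  # True: threshold at 0.5 to get the binary label
```

The sequence-level variant replaces the raw word vectors with LSTM-encoded sentence representations, but the classification head stays the same.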
For further information, you can read the detailed report or take a look at the presentation slides (pages 2-9).
This project has been developed during the A.Y. 2020-2021 for the Natural Language Processing course @ Sapienza University of Rome.
- Word Sense Disambiguation (WSD) for WiC disambiguation, experimenting with BERT feature-based and fine-tuning approaches (GlossBERT)
- Aspect-Based Sentiment Analysis (ABSA) using different setups based on two stacked BiLSTMs and attention layers, leveraging PoS, GloVe, and BERT (frozen) embeddings