Word-in-Context (WiC) disambiguation

In this project we address Word-in-Context (WiC) disambiguation as a binary classification task: given two sentences, determine whether the indicated polysemous target words have the same meaning or not. We experiment with a word-level approach (MLP + ReLU) and a sequence-encoding one (LSTMs), both on top of GloVe static word embeddings.
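The word-level approach can be sketched as follows: the GloVe embeddings of the two target words are concatenated and fed to an MLP with a ReLU hidden layer, whose sigmoid output is read as the probability that the two occurrences share the same sense. This is a minimal NumPy sketch, not the project's actual implementation; the embedding lookup, dimensions, and weights are hypothetical stand-ins (a trained model would load real GloVe vectors and learned parameters).

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 50   # assumed GloVe dimensionality (e.g. glove.6B.50d)
HIDDEN = 64    # assumed hidden-layer size

# Hypothetical lookup standing in for pretrained GloVe vectors.
_table = {}
def embed(word):
    if word not in _table:
        _table[word] = rng.normal(size=EMB_DIM)
    return _table[word]

# Randomly initialised weights, stand-ins for trained parameters.
W1 = rng.normal(scale=0.1, size=(2 * EMB_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, 1))
b2 = np.zeros(1)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def same_sense_prob(target1, target2):
    """Concatenate the two target-word embeddings, pass them
    through the MLP + ReLU, and squash to a probability."""
    x = np.concatenate([embed(target1), embed(target2)])
    h = relu(x @ W1 + b1)
    return float(sigmoid(h @ W2 + b2))

p = same_sense_prob("bank", "bank")
print(0.0 < p < 1.0)  # a valid probability, untrained so not meaningful
```

The LSTM variant differs only in how the input representation is built: each full sentence is encoded by an LSTM over its GloVe embeddings, and the sentence (or target-position) states replace the raw word vectors before the classifier.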

For further information, you can read the detailed report or take a look at the presentation slides (pages 2-9).

This project was developed during the A.Y. 2020-2021 for the Natural Language Processing course @ Sapienza University of Rome.

Related projects

Authors