/nlp-text_summarization

Text Summarization using bi-directional LSTMs with Attention


Abstractive text summarization applied to transcribed speeches of Brazilian senators

In this project, Recurrent Neural Networks (RNNs) are implemented in an encoder-decoder architecture with the goal of capturing the context and meaning of a long text with an extensive vocabulary and summarizing it in its own words. Details of the text preprocessing strategies and model implementation can be found in the Google Colab notebook and in the Report.
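The core of the attention mechanism used in such encoder-decoder summarizers can be illustrated with a short NumPy sketch. This is not the project's actual implementation (which lives in the Colab notebook); it is a minimal, self-contained example of additive (Bahdanau-style) attention, with all weight matrices (`W_enc`, `W_dec`, `v`) assumed to be learned parameters and shown here as placeholders:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_attention(decoder_state, encoder_states, W_enc, W_dec, v):
    """Bahdanau-style additive attention (illustrative sketch).

    decoder_state:  (hidden,)      current decoder hidden state
    encoder_states: (steps, hidden) all encoder hidden states
    W_enc, W_dec:   (hidden, attn) learned projection matrices (placeholders here)
    v:              (attn,)        learned scoring vector (placeholder here)
    """
    # Score each encoder time step against the current decoder state
    scores = np.tanh(encoder_states @ W_enc + decoder_state @ W_dec) @ v
    # Normalize scores into attention weights that sum to 1
    weights = softmax(scores)
    # Context vector: weighted sum of encoder states
    context = weights @ encoder_states
    return context, weights
```

At each decoding step the context vector is concatenated with the decoder state before predicting the next summary token, which lets the model focus on different parts of the source speech as the summary is generated.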

The code was developed in the second semester of 2020 as the final project for the Text Mining course at the University of Brasília.