Next word prediction using LSTM and BERT
Aparna-Sakshi opened this issue · 3 comments
Aparna-Sakshi commented
Description
Given a sequence of words, the next k words will be predicted using:
- character-based LSTM models
- word-based LSTM models
- a Transformer-based language model (i.e., BERT)
I will then compare the performance of these models.
I am a GSSoC '21 participant; kindly assign this issue to me.
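The word-based LSTM approach above can be sketched as a minimal PyTorch model with greedy decoding of the next k word ids. This is an illustrative assumption of how the issue's plan might look, not the code from PR #94; the class and function names (`NextWordLSTM`, `predict_next_k`) are hypothetical, and the model here is untrained, so its predictions are meaningless until fitted on a corpus.

```python
import torch
import torch.nn as nn

class NextWordLSTM(nn.Module):
    """Minimal word-level LSTM language model (illustrative sketch)."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        emb = self.embed(x)                 # (batch, seq, embed_dim)
        out, state = self.lstm(emb, state)  # (batch, seq, hidden_dim)
        return self.fc(out), state          # logits over the vocabulary

def predict_next_k(model, seed_ids, k):
    """Greedily predict the next k word ids after a seed sequence."""
    model.eval()
    preds = []
    with torch.no_grad():
        # Run the seed through the model to build up the hidden state.
        logits, state = model(torch.tensor([seed_ids]))
        for _ in range(k):
            next_id = int(logits[0, -1].argmax())
            preds.append(next_id)
            # Feed the predicted word back in, reusing the hidden state.
            logits, state = model(torch.tensor([[next_id]]), state)
    return preds

vocab_size = 10
model = NextWordLSTM(vocab_size)
preds = predict_next_k(model, [1, 2, 3], k=4)
print(preds)  # four word ids, each in [0, vocab_size)
```

A character-based variant would use the same structure with a character vocabulary, while the BERT approach would instead fill masked positions with a pretrained masked language model.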
welcome commented
Hello there!👋 Welcome to the project!🚀⚡
Thank you, and congrats🎉 on opening your very first issue in this project. Please wait for some time and make sure not to start working on the issue until you are assigned to it.😄
Rishikeshrajrxl commented
@Aparna-Sakshi go ahead.
Aparna-Sakshi commented
I have made a pull request. Here is the link to it: #94
@Rishikeshrajrxl