Bharath-K3/Next-Word-Prediction-with-NLP-and-Deep-Learning

Thanks for this ... but ....


The model fails to come up with multiple suggestions when the input is "to" (it always returns "cover"). This is because, before tokenization, the string was processed to ensure that only one next word exists per input. As a side effect, it captured only the pair "to cover", which is incorrect: "to" has many possible next words.
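A minimal sketch of the failure mode (this is illustrative toy code with an invented corpus, not the repository's actual preprocessing): if each input word is mapped to a single next word, later pairs overwrite earlier ones, so only one continuation survives; collecting every observed continuation preserves all candidates.

```python
from collections import defaultdict

corpus = "to cover to run to cover to see"  # toy text, not the repo's dataset
words = corpus.split()

# Buggy approach: a plain dict keeps only ONE next word per input word,
# so every later pair overwrites the earlier ones.
single_next = {}
for prev, nxt in zip(words, words[1:]):
    single_next[prev] = nxt

# Fixed approach: collect every observed continuation so the model
# (or a sampler) can offer multiple suggestions.
multi_next = defaultdict(list)
for prev, nxt in zip(words, words[1:]):
    multi_next[prev].append(nxt)

print(single_next["to"])              # -> 'see' (only the last-seen pair survives)
print(sorted(set(multi_next["to"])))  # -> ['cover', 'run', 'see']
```

The same collapse happens whenever preprocessing deduplicates training pairs down to one next word per input before tokenization.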