Text-Generation-A-New-Episode-Of-Friends

This paper discusses the implementation of a Long Short-Term Memory (LSTM) neural network for text generation. Specifically, two distinct tokenization approaches are proposed: word-level and character-level encoding. In the word-level approach, the input is a sequence of word tokens and the model predicts the next word of the sequence, which results in a significantly large output layer (one unit per word in the vocabulary). In the character-level approach, the sequences consist of characters, so the model predicts the subsequent character of a sequence. The advantage of this is that the output layer is significantly smaller than in the first approach, since the character vocabulary is far smaller than the word vocabulary.
