Chinese Character-Level Language Model

Recurrent neural networks (LSTM, GRU, RWA) for character-level language modeling in TensorFlow. The task is to predict the next character given the history of previous characters in the sentence. NCE loss is used to speed up multi-class classification when the character vocabulary is large. The dataset was web-scraped from the Hong Kong Apple Daily.
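
Below is a minimal sketch of how such a model can be wired together in TensorFlow 1.x. It is illustrative only and is not the repository's actual code: the placeholder names, layer sizes, and the choice of an LSTM cell are assumptions. The key point is that `tf.nn.nce_loss` samples a small number of negative classes per step instead of normalizing over the full character vocabulary.

```python
# Illustrative sketch only (assumed hyperparameters, not the repo's code).
import tensorflow as tf

vocab_size, embed_dim, hidden_dim, num_sampled = 8000, 128, 256, 64
batch_size, seq_len = 32, 50

inputs = tf.placeholder(tf.int32, [batch_size, seq_len])    # character ids
targets = tf.placeholder(tf.int32, [batch_size, seq_len])   # next-character ids

# Embed input characters.
embedding = tf.get_variable("embedding", [vocab_size, embed_dim])
embedded = tf.nn.embedding_lookup(embedding, inputs)         # [batch, seq, embed]

# Recurrent encoder; GRUCell could be substituted for the LSTM cell.
cell = tf.contrib.rnn.BasicLSTMCell(hidden_dim)
outputs, _ = tf.nn.dynamic_rnn(cell, embedded, dtype=tf.float32)
outputs_flat = tf.reshape(outputs, [-1, hidden_dim])         # [batch*seq, hidden]

# NCE loss: sample `num_sampled` negative classes per time step rather than
# computing a full softmax, which is much cheaper for a large vocabulary.
nce_weights = tf.get_variable("nce_w", [vocab_size, hidden_dim])
nce_biases = tf.get_variable("nce_b", [vocab_size])
labels_flat = tf.reshape(targets, [-1, 1])                   # [batch*seq, 1]

loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights,
                   biases=nce_biases,
                   labels=labels_flat,
                   inputs=outputs_flat,
                   num_sampled=num_sampled,
                   num_classes=vocab_size))

train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
```

At inference time the sampled loss is not used; next-character probabilities are obtained from the full softmax over `nce_weights` and `nce_biases`.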

Results

Similarity

Requirements

tensorflow 1.1.0