character-level-language-model
There are 16 repositories under the character-level-language-model topic.
Anwarvic/Arabic-Tashkeela-Model
A diacritization model for the Arabic language, built and trained on Tashkeela, the Arabic diacritization corpus on Kaggle.
clovaai/group-transformer
Official code for Group-Transformer (Scale down Transformer by Grouping Features for a Lightweight Character-level Language Model, COLING-2020).
Sangarshanan/song-lyrics-generation-and-analysis
Lyrics generation using LSTM, word2vec analysis, and more.
susantabiswas/Article-Generator
Text article generator using a character-level LSTM network.
tejaslodaya/character-level-language-model
Build a character-level language model to generate new dinosaur names.
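The repository above trains an LSTM on a list of dinosaur names and samples new ones character by character. A minimal sketch of the same idea, substituting a simple bigram frequency model for the LSTM (the four sample names are hypothetical stand-ins for the actual dataset):

```python
import random

# Hypothetical stand-in for the dinosaur-name training set.
names = ["tyrannosaurus", "velociraptor", "triceratops", "stegosaurus"]

# Count character bigrams; "^" marks the start of a name, "$" the end.
counts = {}
for name in names:
    chars = ["^"] + list(name) + ["$"]
    for a, b in zip(chars, chars[1:]):
        row = counts.setdefault(a, {})
        row[b] = row.get(b, 0) + 1

def sample_name(rng):
    # Walk the bigram table, drawing each next character in proportion
    # to how often it followed the current one in the training names.
    out, ch = [], "^"
    while True:
        chars_, weights = zip(*counts[ch].items())
        ch = rng.choices(chars_, weights=weights)[0]
        if ch == "$" or len(out) > 20:
            break
        out.append(ch)
    return "".join(out)

new_name = sample_name(random.Random(0))
```

An LSTM replaces the bigram table with a learned distribution conditioned on the whole prefix, but the sampling loop is the same.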
razormin/Sequence-Models
Sequence Models coding assignments
MUHAMMADAKMAL137/IMDB-Dataset-Classification-using-Pre-trained-Word-Embedding-with-GloVec-6B
In this project, I worked with a small corpus of simple sentences. I tokenized the words using n-grams from the NLTK library and performed word-level and character-level one-hot encoding. I also used the Keras Tokenizer to tokenize the sentences and implemented word embedding with the Embedding layer for sentiment analysis.
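The character-level one-hot encoding mentioned above can be sketched without any library: build a character vocabulary from the corpus, then map each character of a sentence to a vector with a single 1 at its index (the two corpus sentences here are hypothetical examples):

```python
# Hypothetical sample corpus of simple sentences.
corpus = ["the cat sat", "the dog ran"]

# Character vocabulary and its index mapping.
alphabet = sorted({ch for sentence in corpus for ch in sentence})
index = {ch: i for i, ch in enumerate(alphabet)}

def one_hot(sentence):
    # Each character becomes a vector of vocabulary size with a single 1.
    return [[1 if i == index[ch] else 0 for i in range(len(alphabet))]
            for ch in sentence]

encoded = one_hot("the cat sat")
```

A trainable Embedding layer replaces these sparse vectors with dense learned ones, but this is the representation it starts from.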
sjmiller8182/character-convolutions-classification
An implementation of "Character-level Convolutional Networks for Text Classification" in TensorFlow. See https://arxiv.org/pdf/1509.01626.pdf.
Subangkar/Sequence-Models-Deeplearning.ai-Coursera-Assignments
Notebooks for the programming assignments of the Sequence Models course by deeplearning.ai on Coursera (May 2020).
explanare/char-iit
A causal intervention framework to learn robust and interpretable character representations inside subword-based language models
Kiminjo/Character-level-language-model
Aims to generate new sentences by learning character-level sequences with an RNN, using a collection of Shakespeare's writings as training data.
ksasso1028/vintage_vectors
Retro-style tokenization for language models.
surrey-nlp/PLODv2-CLM4AbbrDetection
This repository contains the code and the PLODv2 dataset for training character-level language models (CLMs) for abbreviation and long-form detection, released with our LREC-COLING 2024 publication.
jungsoh/rnn-character-level-language-model
Recurrent neural network for building a character-level language model, applied to generating new dinosaur names.
susantabiswas/Name-Gen-RNN
Name generation using an RNN. This model was trained to generate Indian names. Built with Keras.