sequence-to-sequence-models
There are 18 repositories under the sequence-to-sequence-models topic.
awslabs/sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
duyvuleo/Transformer-DyNet
An Implementation of Transformer (Attention Is All You Need) in DyNet
AIRGOLAB-CEFET-RJ/stconvs2s
Code for the paper "STConvS2S: Spatiotemporal Convolutional Sequence to Sequence Network for Weather Forecasting" (Neurocomputing, Elsevier)
Heidelberg-NLP/SRL-S2S
Encoder-Decoder model for Semantic Role Labeling
AyanKumarBhunia/Handwriting-Trajectory-Recovery
Handwriting Trajectory Recovery using End-to-End Deep Encoder-Decoder Network, ICPR 2018.
winston-lin-wei-cheng/Chunk-Level-Emotion-Retrieval
A framework to retrieve continuous chunk-level emotions via emo-rankers for sequence-to-sequence speech emotion recognition (SER).
ashokurlana/Neural-Machine-Translation
An implementation of the paper "Neural Machine Translation by Jointly Learning to Align and Translate".
robertocarlosmedina/attention-transformer-translator-1
A sequence-to-sequence Transformer implementation for training a model to translate from Cape Verdean Creole to English.
SayamAlt/Abstractive-Text-Summarization-of-News-Articles
Successfully developed an encoder-decoder based sequence-to-sequence (Seq2Seq) model which can summarize the full text of an Indian news article into a short paragraph with a limited number of words.
SayamAlt/English-to-French-Language-Translation-using-Seq2Seq-Modeling
Established a deep learning model which can translate English words/sentences into their corresponding French translations.
srvk/jsalt-2018-grounded-s2s
Grounded Sequence-to-Sequence Transduction Team at JSALT 2018
aakanshadalmia/SumItUp
A concise summary generator for Amazon product reviews, built using Transformers, which preserves the original semantic essence and user sentiment.
razamehar/IMDB-Sentiment-Analysis-BoW-S2S-Models
Sentiment analysis on the IMDB dataset using Bag of Words models (Unigram, Bigram, Trigram, Bigram with TF-IDF) and Sequence to Sequence models (one-hot vectors, word embeddings, pretrained embeddings like GloVe, and transformers with positional embeddings).
prashanthi-r/Eng-Hin-Neural-Machine-Translation
Sequence to sequence encoder-decoder model with Attention for Neural Machine Translation
SayamAlt/English-to-German-Translation-using-Seq2Seq
Successfully established a neural machine translation model using sequence-to-sequence modeling which translates English sentences into their corresponding German translations.
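Most repositories in this topic build on attention-based encoder-decoder models. As a quick orientation (not code taken from any repository above), here is a minimal pure-Python sketch of scaled dot-product attention, the core operation shared by the Transformer and attention-based Seq2Seq models; the toy query/key/value vectors are illustrative assumptions.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    """Dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def attention(query, keys, values):
    """Scaled dot-product attention:
    weights = softmax(q . k / sqrt(d)); output = weighted sum of values."""
    d = len(query)
    scores = [dot(query, k) / math.sqrt(d) for k in keys]
    weights = softmax(scores)
    dim = len(values[0])
    context = [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]
    return context, weights

# Toy example: one decoder query attends over three encoder states.
keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]]
query = [1.0, 0.0]
context, weights = attention(query, keys, values)
```

In a full encoder-decoder model, `keys` and `values` come from the encoder's hidden states and `query` from the current decoder state; the resulting `context` vector is what the decoder conditions on when predicting the next token.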