
dl-attention


Attention models at DL Paris workshop session 3

Learn about and implement attention mechanisms on top of RNNs.
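As a rough illustration of the core idea, here is a minimal NumPy sketch of dot-product attention over encoder hidden states: each encoder state is scored against a decoder query, the scores are softmaxed, and the context vector is the resulting weighted sum. This is a generic sketch, not the workshop implementation, and the function and variable names are made up for illustration.

```python
import numpy as np

def dot_product_attention(query, keys, values):
    """Score each encoder state (keys) against the decoder query,
    softmax the scores over time steps, and return the weighted
    sum of the values (the context vector) plus the weights."""
    scores = keys @ query                      # shape (T,): one score per time step
    weights = np.exp(scores - scores.max())    # subtract max for numerical stability
    weights /= weights.sum()                   # softmax over the T time steps
    context = weights @ values                 # shape (d,): weighted sum of values
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 3))      # encoder hidden states
values = keys                       # in basic RNN attention, keys == values
query = rng.normal(size=3)          # current decoder state

context, weights = dot_product_attention(query, keys, values)
```

In an encoder-decoder RNN, this computation is repeated at every decoder step, so the decoder can attend to a different part of the input sequence each time it emits a character.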

Tasks and Datasets

Character-level addition task, adapted from https://github.com/fchollet/keras/blob/master/examples/addition_rnn.py

Inputs are provided in data_numbers.csv, or can be generated with generate_data.py.

Each line is formatted as "123+89 |212": given the input sequence "123+89", the decoder must produce the characters "212".
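A line in that format could be generated along these lines (a hedged sketch only; the actual generate_data.py may differ in digit range, padding, or separator details, and make_example is a hypothetical name):

```python
import random

def make_example(max_digits=3):
    """Build one training line in the 'a+b |sum' format,
    e.g. '123+89 |212', matching the example above."""
    a = random.randint(0, 10**max_digits - 1)
    b = random.randint(0, 10**max_digits - 1)
    return f"{a}+{b} |{a + b}"

random.seed(0)
lines = [make_example() for _ in range(5)]
for line in lines:
    print(line)
```

Splitting each line on " |" then yields the encoder input ("123+89") and the target character sequence ("212").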

Other tasks to consider