NeuReduce_pytorch

https://aclanthology.org/2020.findings-emnlp.56/

See the repository's markdown files for:

  1. Data pre-processing
  2. Implementation of the Transformer
  3. Implementation of an attention-based LSTM
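As a sketch of the pre-processing step, MBA expressions have a tiny vocabulary (a few variables, arithmetic and Boolean operators, digits), so a character-level tokenization is natural for the seq2seq models. The repository's actual tokenizer may differ; this scheme is an assumption:

```python
def tokenize(expr: str) -> list[str]:
    # Character-level tokenization: every non-space character
    # (variable, operator, digit, parenthesis) is its own token.
    # This is an assumed scheme, not necessarily the repo's exact one.
    return [ch for ch in expr if not ch.isspace()]

print(tokenize("2*(x&(y|z))"))
```

The resulting token sequences would then be numericalized against a fixed vocabulary before being fed to the encoder.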

Comparative evaluation results

| Method              | Correct (of 10,000) | Ratio  | Result Length | Solving Time |
|---------------------|---------------------|--------|---------------|--------------|
| Transformer (Paper) | 7824                | 78.24% | 18.02         | 0.43 s       |
| Transformer (Ours)  | 7420                | 74.20% | 16.2          | 0.055 s      |
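The metrics in the table can be reproduced from a trained model with a small evaluation loop like the one below. This is a sketch: the callable/pair layout and the use of exact string match for "Correct" are assumptions, not the repository's actual API:

```python
import time

def evaluate(model_fn, pairs):
    """Compute correct count, ratio, mean result length, mean solving time.

    model_fn: callable mapping a source MBA string to a predicted string
              (assumed interface, not the repo's actual API).
    pairs:    list of (src, trg) reference pairs.
    "Correct" here means exact string match against the reference.
    """
    correct, total_len, total_time = 0, 0, 0.0
    for src, trg in pairs:
        start = time.perf_counter()
        pred = model_fn(src)
        total_time += time.perf_counter() - start
        total_len += len(pred)
        correct += pred == trg
    n = len(pairs)
    return {
        "correct": correct,
        "ratio": correct / n,
        "result_length": total_len / n,
        "solving_time": total_time / n,
    }

# Toy usage with a constant "model" on a single pair:
stats = evaluate(lambda s: "-(~x&(y^z))",
                 [("2*(x&(y|z))+(~x|~z)", "-(~x&(y^z))")])
print(stats["correct"], f"{stats['ratio']:.2%}")
```

Exact match is a conservative criterion; a prediction that is semantically equivalent but written differently would count as incorrect under it.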

Example Expression

MBA expression : 2*(x&(y|z))+(~x|~z)-2*(y^z)+(~y^z)-2*(y^~z)-(x&y)
trg : -(~x&(y^z))

Results

Transformer predicted trg : -(~x&(y^z))
LSTM-Attention predicted trg : -(~x&(y^z))
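Both models recover the reference simplification exactly. The equivalence of the obfuscated MBA expression and the predicted target can also be sanity-checked numerically, since MBA identities hold modulo 2^n under two's-complement semantics (a standalone check, not part of the repository):

```python
import random

def mba(x, y, z):
    # Original (obfuscated) mixed Boolean-arithmetic expression
    return 2*(x&(y|z))+(~x|~z)-2*(y^z)+(~y^z)-2*(y^~z)-(x&y)

def simplified(x, y, z):
    # Simplified target expression predicted by both models
    return -(~x&(y^z))

random.seed(0)
MASK = (1 << 8) - 1  # compare modulo 2^8, i.e. as 8-bit words
for _ in range(1000):
    x, y, z = (random.randrange(256) for _ in range(3))
    assert mba(x, y, z) & MASK == simplified(x, y, z) & MASK
print("equivalent on 1000 random 8-bit inputs")
```

Python's arbitrary-precision integers behave like infinite two's complement under `~`, `&`, `|`, `^`, so masking the final results to a fixed width is enough to check equivalence over n-bit words.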