Is there an Attention layer in the code?
Closed this issue · 2 comments
arsentiii commented
Is there an Attention layer in the code, running in parallel with the SkipGRU module?
arsentiii commented
Also, I'd really like to know which path I should append in main.py to make `import util` work.
Never mind. I had just overlooked the util file, which is in another repo.
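For anyone else hitting the same `import util` error: a common fix is to append the directory that contains `util.py` to `sys.path` before the import. A minimal, self-contained sketch (the throwaway directory and the `greet` function are hypothetical stand-ins for the real repo layout; point the path at your actual checkout instead):

```python
import os
import sys
import tempfile

# Stand-in for the directory that holds util.py in the other repo
# (hypothetical; replace with the real path on your machine).
util_dir = tempfile.mkdtemp()
with open(os.path.join(util_dir, "util.py"), "w") as f:
    f.write("def greet():\n    return 'hello from util'\n")

# Appending the directory before the import makes the module resolvable.
sys.path.append(util_dir)

import util  # noqa: E402  (import after sys.path manipulation is intentional)

print(util.greet())
```

The same `sys.path.append(...)` line placed near the top of main.py, pointing at wherever `util.py` lives, should resolve the import.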
arsentiii commented
I ran the code on a GPU, but the PyTorch version of the code ran faster.
Why is the TensorFlow version slower than the PyTorch one?