fbadine/LSTNet

Is there an attention layer in the code?

Closed this issue · 2 comments

Is there an attention layer in the code, running in parallel with the SkipGRU module?
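For context, the attention variant described in the LSTNet paper (LSTNet-Attn) scores all recurrent hidden states against the last one and takes a weighted sum as the context vector. A minimal NumPy sketch of that scoring step (the function name and shapes are illustrative, not taken from this repo's code):

```python
import numpy as np

def temporal_attention(hidden_states):
    """Dot-product attention over time, with the last hidden state as query.

    hidden_states: array of shape (time_steps, units), e.g. GRU outputs.
    Returns the context vector (units,) and the attention weights (time_steps,).
    """
    query = hidden_states[-1]             # last hidden state acts as the query
    scores = hidden_states @ query        # dot-product scores, (time_steps,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # softmax over the time axis
    context = weights @ hidden_states     # weighted sum of states, (units,)
    return context, weights

rng = np.random.default_rng(0)
h = rng.normal(size=(10, 4))              # 10 time steps, 4 hidden units
context, weights = temporal_attention(h)
```

In the paper this context vector is concatenated with the last hidden state before the output layer; whether this repo implements that variant is exactly what the question asks.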

Also, I'd really like to know which path I should append in main.py so that "import util" works.
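In case someone else hits the same ImportError: Python resolves `import util` against the directories listed in `sys.path`, so prepending the directory that contains util.py makes the import work. A self-contained sketch (the temporary directory here just stands in for wherever util.py actually lives):

```python
import sys
import tempfile
from pathlib import Path

# Stand-in for the real location of util.py: create a directory
# containing a tiny util.py so the example runs anywhere.
util_dir = Path(tempfile.mkdtemp())
(util_dir / "util.py").write_text("VALUE = 42\n")

# Prepend that directory to sys.path, then "import util" resolves.
sys.path.insert(0, str(util_dir))
import util

print(util.VALUE)
```

In main.py the equivalent would be a `sys.path.insert(0, ...)` pointing at the directory holding util.py, placed before the `import util` line.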

Alright, I had just overlooked the util file; it's in another repo.

I ran the code on a GPU, but the PyTorch version of this model ran faster.
Why is the TensorFlow implementation slower than the PyTorch one?