vlgiitr/ntm-pytorch

Batch size doesn't work

Opened this issue · 0 comments

Hi, I'm trying to use the NTM in my project.

It seems that the batch size option doesn't really work. I checked the sizes of the weights and the memory in the `read` function in `memory.py`, but they didn't change when I changed the batch size option.
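For context, here is a minimal sketch (plain NumPy, independent of this repo's code; the shape conventions are my assumptions, not necessarily the repo's actual API) of what I'd expect a batched NTM read to look like, with the batch dimension showing up in both the weights and the read vector:

```python
import numpy as np

# Assumed shapes: batch of B samples, N memory slots, M features per slot
B, N, M = 4, 128, 20

memory = np.random.rand(B, N, M)             # per-sample memory matrix
weights = np.random.rand(B, N)
weights /= weights.sum(axis=1, keepdims=True)  # normalize to a distribution per sample

# Batched read: r = sum_i w(i) * memory(i), done independently for each sample
read = np.einsum('bn,bnm->bm', weights, memory)

print(read.shape)  # expected (B, M) = (4, 20)
```

With this convention, changing `B` should change the leading dimension of both `weights` and `read`, which is what I expected to see when varying the batch size option.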

Can you please check this?
If I'm wrong, could you please let me know how to evaluate the model with mini-batches?