loudinthecloud/pytorch-ntm

What's the meaning of using memory?

Closed this issue · 1 comment

Dear sir:
I have read your code and I really appreciate your work, but I have some questions.

  1. self.register_buffer('mem_bias', torch.Tensor(N, M))  # mem_bias is registered as a buffer, which means it will not be updated by the optimizer
  2. self.memory = self.mem_bias.clone().repeat(batch_size, 1, 1)  # self.memory is created from mem_bias to match the batch size
  3. for each sequence, we run init_sequence(), which resets the memory; the reset function,
    self.batch_size = batch_size
    self.memory = self.mem_bias.clone().repeat(batch_size, 1, 1)

just clears all the content in the memory and reinitializes it with mem_bias.
So what's the point of writing to and reading from memory? It is the same as mem_bias at the start of each batch, and mem_bias is not updated, which means it never changes.
I just can't figure this out, and I would really appreciate it if you could answer my question.
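To check my understanding of the intended lifecycle, here is a toy sketch (plain Python with hypothetical names, not this repo's actual classes). It shows the pattern I believe is intended: memory is reset from the bias once per sequence, but writes still accumulate across timesteps *within* that sequence, so a read at step t sees everything written at steps 0..t-1.

```python
class ToyMemory:
    """Toy stand-in for the NTM memory (hypothetical, for illustration only)."""

    def __init__(self, n_slots, width):
        # mem_bias plays the role of the registered buffer: a fixed
        # initial state that the optimizer never updates.
        self.mem_bias = [[0.0] * width for _ in range(n_slots)]
        self.memory = None

    def reset(self):
        # Called once per sequence (like init_sequence): copy the bias,
        # so earlier sequences cannot leak into this one.
        self.memory = [row[:] for row in self.mem_bias]

    def write(self, slot, vec):
        # A write at one timestep mutates self.memory, not mem_bias.
        self.memory[slot] = list(vec)

    def read(self, slot):
        return self.memory[slot]


mem = ToyMemory(n_slots=4, width=2)
mem.reset()                 # sequence starts: memory equals mem_bias
mem.write(0, [1.0, 2.0])    # timestep 0 writes
print(mem.read(0))          # → [1.0, 2.0]: timestep 1 sees step 0's write
mem.reset()                 # next sequence: back to the bias
print(mem.read(0))          # → [0.0, 0.0]
```

So if I understand correctly, the reads and writes matter because the content of memory diverges from mem_bias over the timesteps of a single sequence, even though it is reset between sequences.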

I have also found in the code that X has a fixed length per batch, with several inputs in each batch.
It seems that the batch structure is very important. But what should I do if the sequence lengths differ, as when dealing with NLP or audio sequences?
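One workaround I have been considering (an assumption on my part, not something this repo implements) is to pad every sequence in a batch to the longest length and keep a mask so the padded steps can be ignored later, e.g. when computing the loss. The helper name `pad_batch` below is hypothetical:

```python
def pad_batch(sequences, pad_value=0):
    """Pad a list of variable-length sequences to equal length.

    Returns the padded batch and a mask (1 = real step, 0 = padding).
    """
    max_len = max(len(seq) for seq in sequences)
    padded = [seq + [pad_value] * (max_len - len(seq)) for seq in sequences]
    mask = [[1] * len(seq) + [0] * (max_len - len(seq)) for seq in sequences]
    return padded, mask


batch, mask = pad_batch([[5, 6, 7], [8], [9, 10]])
print(batch)  # [[5, 6, 7], [8, 0, 0], [9, 10, 0]]
print(mask)   # [[1, 1, 1], [1, 0, 0], [1, 1, 0]]
```

Would something like this be the right approach here, or is there a better way to handle variable-length sequences with this model?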