Why does each activation function in each node have a different id?
jinxin0924 opened this issue · 0 comments
jinxin0924 commented
@dukebw Hi, thanks for your code. I have a question about a detail:
In your code, you build the embedding:
num_total_tokens = sum(self.num_tokens)
self.encoder = torch.nn.Embedding(num_total_tokens, args.controller_hid)
This means that each previous-node index and each activation function gets a separate embedding id for every node position, rather than sharing ids across nodes. I am wondering why that is. It would be great if you could help check this.
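To make sure I am reading it correctly, here is a minimal sketch of what I think the snippet does (my own reconstruction, not the repository's exact code; the contents of num_tokens and the offset scheme below are assumptions for illustration):

```python
import torch

# Hypothetical per-step vocabulary sizes: e.g. alternating choices of
# activation function and previous-node index across the controller's steps.
num_tokens = [4, 1, 4, 2, 4]          # assumed layout, for illustration only
num_total_tokens = sum(num_tokens)    # 15 distinct embedding ids in total

controller_hid = 8                    # stand-in for args.controller_hid
encoder = torch.nn.Embedding(num_total_tokens, controller_hid)

# Offset of each step's id range inside the shared embedding table.
offsets = [0]
for n in num_tokens[:-1]:
    offsets.append(offsets[-1] + n)   # cumulative sum -> [0, 4, 5, 9, 11]

# The same local choice (e.g. activation index 0) maps to a *different*
# embedding row at each step, because the step's offset is added first.
step, local_choice = 2, 0
token_id = offsets[step] + local_choice
embedding = encoder(torch.tensor([token_id]))
print(token_id, embedding.shape)      # 5, torch.Size([1, 8])
```

If this reading is right, the question is why the activation embeddings are not shared across node positions instead.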