Feature addition with attention code question
Closed this issue · 1 comment
Ikhwansong commented
Hi, nice to see you again! Thank you for your kind reply last time.
I have a question about the line of code for feature addition with attention.
Is this line different from the version below, where the two branch outputs are simply added?

```python
x_index[i, j] = self.rdb_module['{}_{}'.format(i, j-1)](...) + self.downsample_module['{}_{}'.format(i-1, j)](...)
```

It seems to be the same operation.
thank you
proteus1991 commented
The code you wrote is the case where attention is not used.
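To illustrate the distinction, here is a minimal sketch (module name `FeatureFusion`, shapes, and the two-stream setup are hypothetical, not from the repo): with attention, each incoming feature stream is scaled by a learnable channel-wise coefficient before addition; without attention, the streams are added directly, which is equivalent only when all coefficients are 1.

```python
import torch
import torch.nn as nn

class FeatureFusion(nn.Module):
    """Hypothetical two-stream fusion, sketching attention-weighted vs plain addition."""
    def __init__(self, channels, use_attention=True):
        super().__init__()
        self.use_attention = use_attention
        if use_attention:
            # one learnable channel-wise weight per incoming stream,
            # initialized to ones (so it starts out as plain addition)
            self.coeff = nn.Parameter(torch.ones(2, channels, 1, 1))

    def forward(self, a, b):
        if self.use_attention:
            # attention-weighted feature addition
            return self.coeff[0] * a + self.coeff[1] * b
        # plain feature addition (the no-attention case)
        return a + b

x = torch.randn(1, 16, 8, 8)
y = torch.randn(1, 16, 8, 8)
plain = FeatureFusion(16, use_attention=False)(x, y)
attn = FeatureFusion(16, use_attention=True)(x, y)
# identical at initialization, but the coefficients diverge during training
print(torch.allclose(plain, attn))
```

The point is that the two forms coincide only before training; once the coefficients are learned, the attention-weighted sum is no longer a plain addition.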
Thanks,
Xiaohong