Question about multihead_proj_global
liuyueChang commented
Thanks for your outstanding work!
- global_embed = self.multihead_proj_global(global_embed).view(-1, self.num_modes, self.hidden_size) # [N, F, D]
What is the purpose of this projection? Why is it designed this way?
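For context, here is a minimal sketch of what such a projection appears to do, judging only from the `.view(-1, self.num_modes, self.hidden_size)` call in the line above: a single linear layer maps each node's `[D]` embedding to `num_modes * D` features, which are then reshaped into one embedding per prediction mode. The layer definition and the sizes below are assumptions for illustration, not the repository's actual code.

```python
import torch
import torch.nn as nn

# Assumed sizes for illustration only
N = 4   # number of nodes
D = 64  # hidden_size
F = 6   # num_modes

# Hypothetical stand-in for multihead_proj_global:
# a linear layer expanding D features to F * D features
multihead_proj_global = nn.Linear(D, F * D)

global_embed = torch.randn(N, D)          # [N, D]
out = multihead_proj_global(global_embed) # [N, F * D]
out = out.view(-1, F, D)                  # [N, F, D]: one embedding per mode
print(out.shape)  # torch.Size([4, 6, 64])
```

Under this reading, the projection lets the decoder produce `num_modes` distinct trajectory hypotheses from a single shared embedding.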
- Here
Line 149 in 1ab4d4c
the arguments passed to this function are the temporal tensor x_encoded_dense and the spatial tensor hidden_state_global, but in
Line 122 in 1ab4d4c
the temporal tensor is named global_embed and the spatial tensor is named local_embed. Is this correct?
Thank you!