mengmengliu1998/GATraj

Question about multihead_proj_global


Thanks for your outstanding work!

  1. Here

global_embed = self.multihead_proj_global(global_embed).view(-1, self.num_modes, self.hidden_size) # [N, F, D]

What is the purpose of this function? Why is it designed this way?
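For context, here is a minimal standalone sketch of what such a projection typically does, assuming multihead_proj_global is an nn.Linear(hidden_size, num_modes * hidden_size) layer (that layer type and the toy shapes are my assumptions, not taken from the GATraj source): it expands one embedding per agent into one embedding per prediction mode.

```python
import torch
import torch.nn as nn

# Toy shapes (assumed, not the actual GATraj configuration):
N, F, D = 4, 20, 64  # agents, prediction modes F = num_modes, hidden size

# Assumption: the projection is a single Linear layer mapping D -> F * D.
multihead_proj_global = nn.Linear(D, F * D)

global_embed = torch.randn(N, D)                     # one embedding per agent
global_embed = multihead_proj_global(global_embed)   # [N, F * D]
global_embed = global_embed.view(-1, F, D)           # [N, F, D]

print(global_embed.shape)  # torch.Size([4, 20, 64])
```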

  2. Here

mdn_out = self.Laplacian_Decoder.forward(self.x_encoded_dense, self.hidden_state_global, cn_global, epoch)

the arguments of this call are the temporal tensor x_encoded_dense and the spatial tensor hidden_state_global,

but in

def forward(self, x_encode: torch.Tensor, hidden_state, cn) -> Tuple[torch.Tensor, torch.Tensor]:

the temporal tensor's name changes to global_embed and the spatial tensor's name changes to local_embed. Is this correct?
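To frame the naming question, here is a minimal sketch (toy tensors and a plain function instead of the real decoder, so everything in it is hypothetical): in Python, arguments bind to parameters by position, so the caller's variable names need not match the parameter names, and the body can rebind them again under new names.

```python
import torch

def forward(x_encode: torch.Tensor, hidden_state: torch.Tensor, cn: torch.Tensor):
    # The first positional argument lands in x_encode no matter what the
    # caller named it. If the decoder body then rebinds the inputs, e.g.
    # the temporal tensor as global_embed and the spatial tensor as
    # local_embed, the behavior is unchanged; only the names differ.
    global_embed = x_encode     # temporal features
    local_embed = hidden_state  # spatial features
    return global_embed, local_embed

# Assumed toy shapes; the actual GATraj call also passes epoch, omitted here.
x_encoded_dense = torch.randn(8, 64)
hidden_state_global = torch.randn(8, 64)
cn_global = torch.randn(8, 64)

out = forward(x_encoded_dense, hidden_state_global, cn_global)
```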

Thank you!