yeliu918/KG-BART

KG-augmented decoding is not applied

Opened this issue · 6 comments

Hello Ye, thanks for making the code public.

Not sure if I understand correctly, but from here it seems that the KG-augmented decoding layers are not applied during decoding. Could you let me know if there is anything I missed?

Also, could you check whether there is an issue in the DecoderGATLayer? For example, from here, ex_entity may not be converted to embeddings before being fed into the exent_proj function.
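
For reference, here is a minimal sketch of the fix I would expect. Only `ex_entity` and `exent_proj` come from the repository code; the module structure, dimensions, and the `entity_embedding` name are my assumptions for illustration:

```python
import torch
import torch.nn as nn

class ExpandedEntityProjection(nn.Module):
    """Sketch: look up embeddings for expanded-entity ids before projecting.

    `exent_proj` mirrors the projection in the repo; `entity_embedding`
    is an assumed name for the (possibly missing) embedding lookup.
    """
    def __init__(self, num_entities: int, emb_dim: int, hidden_dim: int):
        super().__init__()
        self.entity_embedding = nn.Embedding(num_entities, emb_dim)
        self.exent_proj = nn.Linear(emb_dim, hidden_dim)

    def forward(self, ex_entity: torch.LongTensor) -> torch.Tensor:
        # ex_entity: (batch, num_concepts, num_expanded) integer ids.
        # Convert ids to dense vectors first; feeding raw ids into a
        # Linear layer would either crash or silently treat ids as floats.
        ex_entity_emb = self.entity_embedding(ex_entity)
        return self.exent_proj(ex_entity_emb)
```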

Thanks again for the help.

wyu97 commented

I have the same question. I found that the DecoderGATLayer class is not used in the KG-BART decoder.

ana3A commented

Have you come to any conclusion about this?
Thank you in advance.

In the paper, the author does not use that layer during pretraining.
I am not sure whether you are referring to the pretraining stage.

Same question here, and it looks like the decoder in the code does not match the decoder described in the paper. In the paper, each concept is concatenated with each of its expanded entities, but in the code it is concatenated only once.
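
To illustrate the difference (the shapes, names, and the pooling step below are my assumptions for a minimal example, not the repo's actual tensors):

```python
import torch

batch, n_concepts, n_expanded, d = 2, 4, 5, 8
concept = torch.randn(batch, n_concepts, d)                # concept embeddings
ex_entity = torch.randn(batch, n_concepts, n_expanded, d)  # expanded-entity embeddings

# Paper: concatenate the concept with EACH of its expanded entities,
# giving one (2*d)-dim vector per (concept, entity) pair.
concept_tiled = concept.unsqueeze(2).expand(-1, -1, n_expanded, -1)
paper_concat = torch.cat([concept_tiled, ex_entity], dim=-1)
print(paper_concat.shape)  # torch.Size([2, 4, 5, 16])

# Code (as reported above): the concept is concatenated only once,
# e.g. with a single pooled entity vector per concept.
pooled_entity = ex_entity.mean(dim=2)
code_concat = torch.cat([concept, pooled_entity], dim=-1)
print(code_concat.shape)   # torch.Size([2, 4, 16])
```

The two variants produce tensors of different rank, so any downstream attention over (concept, entity) pairs would behave quite differently between the two.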

@yeliu918 Could you take a look and explain the code a bit? I am interested in your work but am having trouble reproducing it from the released code. Thanks a lot!

@yeliu918 could you explain this? In the decoder module described in your paper, you say MHGAT is adopted. MHGAT corresponds to the DecoderGATLayer class in your code, but it is actually never used. Why? Is this a mistake in the code?
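
For anyone comparing against the paper, a generic multi-head graph attention layer (the GAT-style computation that MHGAT denotes) looks roughly like the sketch below. This is a standard reference implementation under my own assumptions, not the repo's actual DecoderGATLayer:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadGraphAttention(nn.Module):
    """Generic multi-head graph attention; a sketch of what MHGAT computes,
    not the repository's DecoderGATLayer."""
    def __init__(self, in_dim: int, out_dim: int, num_heads: int):
        super().__init__()
        assert out_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = out_dim // num_heads
        self.proj = nn.Linear(in_dim, out_dim)
        # One attention vector per head over concatenated [src ; dst] features.
        self.attn = nn.Parameter(torch.empty(num_heads, 2 * self.head_dim))
        nn.init.xavier_uniform_(self.attn)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, in_dim); adj: (num_nodes, num_nodes) 0/1 edge mask.
        # Assumes adj includes self-loops so every row has at least one edge.
        n = h.size(0)
        x = self.proj(h).view(n, self.num_heads, self.head_dim)  # (n, H, d)
        src = (x * self.attn[:, :self.head_dim]).sum(-1)         # (n, H)
        dst = (x * self.attn[:, self.head_dim:]).sum(-1)         # (n, H)
        # e_ij = LeakyReLU(a_src . x_i + a_dst . x_j), masked to the graph.
        scores = F.leaky_relu(src.unsqueeze(1) + dst.unsqueeze(0))  # (n, n, H)
        scores = scores.masked_fill(adj.unsqueeze(-1) == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=1)            # attend over neighbours
        out = torch.einsum("ijh,jhd->ihd", alpha, x)    # weighted neighbour sum
        return out.reshape(n, -1)                       # (n, out_dim)
```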