malllabiisc/WordGCN

Versus BERT

modongmoqiu opened this issue · 1 comment

I think your work is really interesting, but is it still meaningful now that BERT exists?
It seems your work produces static word embeddings like word2vec or GloVe. Can it encode text dynamically (contextually), the way BERT does?
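For context, the distinction the question draws can be sketched with a toy example (not from the paper; the vectors below are made up for illustration):

```python
# Static embeddings (word2vec/GloVe-style): one fixed vector per word,
# looked up from a table regardless of the surrounding sentence.
static_emb = {
    "bank": [0.2, 0.7],   # toy 2-d vector; real embeddings have hundreds of dims
    "river": [0.1, 0.9],
    "money": [0.8, 0.3],
}

def embed_static(sentence):
    """Return the (context-independent) vector for each known word."""
    return [static_emb[w] for w in sentence.split() if w in static_emb]

v1 = embed_static("river bank")[-1]   # vector for "bank" near "river"
v2 = embed_static("money bank")[-1]   # vector for "bank" near "money"
print(v1 == v2)  # True: a static model gives both senses the same vector

# A contextual model like BERT instead computes the vector for "bank"
# from the whole sentence, so the two occurrences would generally differ.
```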

Hi @modongmoqiu,
Yes, that would be nice future work, but we haven't explored it in this paper.