Initialize embeddings randomly OR by BERT-encoding entity description text
fridayL opened this issue · 1 comment
fridayL commented
Hi, thanks a lot for your work on KGE, but I'm still confused about embedding initialization. I tried to initialize the embeddings from BERT using the entity text information, but during training the loss on negative triplets keeps increasing. I've changed the LR and other hyperparameters, but it doesn't help.
Do different embedding initializations lead to different model results?
Edward-Sun commented
Yes. Currently, we have only tried randomly initialized embeddings, and the phases of the complex embeddings are initialized uniformly in [-\pi, +\pi]. For your case, I guess initializing only the modulus from the BERT embedding and randomly initializing the phase would work better.
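A minimal sketch of that suggestion, assuming you already have precomputed BERT vectors for each entity (the function name and the `[real | imag]` concatenation layout are my own choices for illustration, not the repo's actual API): take the magnitude of the BERT features as the modulus and draw the phase uniformly from [-\pi, +\pi].

```python
import math
import torch

def init_complex_embedding(bert_vecs: torch.Tensor) -> torch.Tensor:
    """Build a complex entity embedding where the modulus comes from
    precomputed BERT vectors and the phase is random.

    bert_vecs: (num_entities, dim) real-valued text embeddings.
    Returns a (num_entities, 2*dim) tensor laid out as [real | imag].
    """
    # Use the magnitude of the BERT features as the modulus.
    modulus = bert_vecs.abs()
    # Phase drawn uniformly from [-pi, +pi], as in random init.
    phase = (torch.rand_like(bert_vecs) * 2.0 - 1.0) * math.pi
    real = modulus * torch.cos(phase)
    imag = modulus * torch.sin(phase)
    return torch.cat([real, imag], dim=-1)
```

The resulting tensor can then be copied into the model's entity embedding parameter before training; since only the modulus carries the text signal, the random phases keep the rotation-based score well-conditioned at the start.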