Hi! Can I use word embeddings as pretrained user & item weights?
PeaXCD opened this issue · 1 comment
PeaXCD commented
I noticed that embedding_user and embedding_item are initialized with torch.nn.init.normal_, and that there is an option to use pretrained weights instead.
In my dataset, the recommendation results are strongly correlated with item names, so I want to use BERT to produce word embeddings such that items with similar names get similar embedding vectors.
So can I use these word embeddings as the pretrained user & item weights to get better performance? A sketch of what I have in mind follows.
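For concreteness, here is a minimal sketch using Hugging Face transformers. The item_names list, the mean pooling, the latent_dim value, and the random linear projection are just illustrative assumptions on my side, not part of this repo's API:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Hypothetical: item names ordered by item ID (assumes the dataset maps
# each item ID to a name string).
item_names = ["red running shoes", "blue running shoes", "coffee mug"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
bert.eval()

with torch.no_grad():
    enc = tokenizer(item_names, padding=True, truncation=True,
                    return_tensors="pt")
    out = bert(**enc)
    # Mean-pool the token embeddings (ignoring padding) to get one
    # 768-d vector per item name.
    mask = enc["attention_mask"].unsqueeze(-1).float()
    item_vecs = (out.last_hidden_state * mask).sum(1) / mask.sum(1)

# BERT vectors are 768-d; if the model's latent dim is smaller, one way
# is to project them down (latent_dim = 64 is just an example; a random
# projection roughly preserves pairwise similarity).
latent_dim = 64
proj = torch.nn.Linear(768, latent_dim)
with torch.no_grad():
    item_weight = proj(item_vecs)

# Use the pooled vectors as the pretrained item weights and fine-tune.
embedding_item = torch.nn.Embedding.from_pretrained(item_weight,
                                                    freeze=False)

# Users have no name text, so embedding_user would keep the normal_ init.
```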
GabyUSTC commented
We didn't try this since we don't have that kind of dataset, but I think your idea is reasonable.