ymcui/Chinese-BERT-wwm

How to get Chinese word embeddings.

qhd1996 opened this issue · 4 comments

For some reason I need to get Chinese word embeddings based on a PLM. Could you offer some ways to get them from the existing trained models? Thanks :)

ymcui commented

Please see https://github.com/google-research/bert#using-bert-to-extract-fixed-feature-vectors-like-elmo
With this script, you can extract word representations from any layer of BERT.
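For reference, a minimal sketch of the same idea using the Hugging Face Transformers library rather than the original extract_features.py script. The checkpoint name hfl/chinese-bert-wwm-ext, the example sentence, and the last-four-layer averaging are illustrative assumptions; substitute whichever Chinese-BERT-wwm checkpoint you actually downloaded.

```python
import torch
from transformers import BertTokenizer, BertModel

# Assumed checkpoint name; replace with your local Chinese-BERT-wwm model path if needed.
MODEL_NAME = "hfl/chinese-bert-wwm-ext"

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertModel.from_pretrained(MODEL_NAME, output_hidden_states=True)
model.eval()

text = "今天天气不错"  # example sentence
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.hidden_states is a tuple: the embedding layer output plus one tensor per
# Transformer layer, each of shape (batch, seq_len, hidden_size).
hidden_states = outputs.hidden_states

# One common choice: average the last four layers to get a vector per token.
token_vectors = torch.stack(hidden_states[-4:]).mean(dim=0).squeeze(0)
print(token_vectors.shape)  # (seq_len, 768) for the base model
```

Note that these are token-level (character-level for Chinese) vectors; to get a word-level embedding you would pool the vectors of the tokens that make up the word.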
Thank you very much!

stale commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale commented

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.