create / load embeddings
gogobd opened this issue · 2 comments
gogobd commented
I was trying out unsupervised-nmt and I got it running! I'm running everything from scratch. What I can't see at the moment is whether I would have to save the word embeddings (src_emb and tgt_emb) myself after each cycle, or whether they are already included in the training cycle's save files. I created new embeddings and did not use any of the pre-trained ones.
Thanks for the code by the way! It's really amazing!
guillaumekln commented
They are included in the checkpoint files. It's relatively easy to load and extract variables from a TensorFlow checkpoint. See for example:
https://github.com/OpenNMT/OpenNMT-tf/blob/v1.11.0/opennmt/utils/checkpoint.py#L97-L109
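As a minimal sketch of reading variables back out of a checkpoint (using the TF 2 `tf.train.Checkpoint` API here for a self-contained example; the variable names, `src_emb`, and the save path are stand-ins, not the actual names used by unsupervised-nmt — inspect `get_variable_to_shape_map()` to find the real ones in your checkpoint):

```python
import numpy as np
import tensorflow as tf

# Save a toy checkpoint containing an "embedding" variable
# (a stand-in for src_emb / tgt_emb; real names depend on the model).
emb = tf.Variable(np.arange(6, dtype=np.float32).reshape(3, 2))
ckpt = tf.train.Checkpoint(src_emb=emb)
path = ckpt.save("/tmp/toy_ckpt/ckpt")

# Open the checkpoint and list the variables it contains.
reader = tf.train.load_checkpoint(path)
shapes = reader.get_variable_to_shape_map()  # {variable name: shape}

# Extract one variable's value as a NumPy array. With tf.train.Checkpoint,
# variable values are stored under a ".ATTRIBUTES/VARIABLE_VALUE" suffix.
value = reader.get_tensor("src_emb/.ATTRIBUTES/VARIABLE_VALUE")
print(value.shape)
```

With a TF 1 graph-based checkpoint (as in the linked code), the same idea applies via `tf.train.NewCheckpointReader`, and the keys are the plain variable names.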
gogobd commented
Thank you very much, @guillaumekln