Load pretrained word embeddings (word2vec, glove format) into torch.FloatTensor for PyTorch
PyTorch required.
pip install torchwordemb
import torchwordemb
torchwordemb.load_word2vec_bin(path)
reads a word2vec binary-format model from path.
Returns (vocab, vec).
vocab is a dict mapping each word to its index. vec is a torch.FloatTensor of size V x D, where V is the vocabulary size and D is the dimension of the word vectors.
vocab, vec = torchwordemb.load_word2vec_bin("/path/to/word2vec/model.bin")
print(vec.size())
print(vec[ vocab["apple"] ])
torchwordemb.load_word2vec_text(path)
reads a word2vec text-format model from path.
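To make the text format concrete, here is a minimal pure-Python sketch of how a word2vec text file is laid out (an illustration only, not torchwordemb's actual implementation, which returns a torch.FloatTensor): the first line holds the counts "V D", and each following line is a word followed by D space-separated floats.

```python
# Sketch of parsing the word2vec text format (illustrative, hypothetical
# helper -- not part of torchwordemb's API).
def parse_word2vec_text(lines):
    # Header line: vocabulary size V and vector dimension D.
    v, d = (int(x) for x in lines[0].split())
    vocab, vectors = {}, []
    for line in lines[1:1 + v]:
        parts = line.rstrip().split(" ")
        word, values = parts[0], [float(x) for x in parts[1:]]
        assert len(values) == d  # each row must have exactly D components
        vocab[word] = len(vectors)  # word -> row index
        vectors.append(values)
    return vocab, vectors

sample = [
    "2 3",
    "apple 0.1 0.2 0.3",
    "banana 0.4 0.5 0.6",
]
vocab, vectors = parse_word2vec_text(sample)
print(vocab["apple"])            # -> 0
print(vectors[vocab["banana"]])  # -> [0.4, 0.5, 0.6]
```

In the real library the rows would back a V x D torch.FloatTensor instead of a list of lists, so vec[vocab[word]] fetches that word's embedding.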
torchwordemb.load_glove_text(path)
reads a GloVe text-format model from path.
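The GloVe text format differs from word2vec's text format only in that it has no "V D" header line; each line is just a word followed by its vector components. A minimal pure-Python sketch (illustrative only, not torchwordemb's implementation):

```python
# Sketch of parsing the GloVe text format (illustrative, hypothetical
# helper -- not part of torchwordemb's API).
def parse_glove_text(lines):
    vocab, vectors = {}, []
    for line in lines:  # no header line: every line is a word + vector
        parts = line.rstrip().split(" ")
        vocab[parts[0]] = len(vectors)  # word -> row index
        vectors.append([float(x) for x in parts[1:]])
    return vocab, vectors

vocab, vectors = parse_glove_text(["the 0.1 0.2", "cat 0.3 0.4"])
print(vocab)                  # -> {'the': 0, 'cat': 1}
print(vectors[vocab["cat"]])  # -> [0.3, 0.4]
```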