graykode/nlp-tutorial

about skip-gram code

tarogege opened this issue · 0 comments

I don't quite understand why `batch_inputs` and `batch_labels` have to be recomputed on every iteration of the training loop in Word2Vec-Skipgram-Tensor(Softmax).py.

Also, what does `trained_embeddings = W.eval()` mean?

Could you explain this for me? I am a bit confused.

```python
# excerpt from Word2Vec-Skipgram-Tensor(Softmax).py; sess, optimizer, cost,
# inputs, labels, and W are all defined earlier in the script
for epoch in range(5000):
    # draw a fresh random mini-batch of (center, context) pairs each step
    batch_inputs, batch_labels = random_batch(skip_grams, batch_size)
    # run one optimizer step on this batch and fetch the current loss
    _, loss = sess.run([optimizer, cost], feed_dict={inputs: batch_inputs, labels: batch_labels})

    if (epoch + 1) % 1000 == 0:
        print('Epoch:', '%04d' % (epoch + 1), 'cost =', '{:.6f}'.format(loss))

    # read the current value of the embedding variable W as a NumPy array
    trained_embeddings = W.eval()
```
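
For context on the first question, `random_batch` presumably draws a different random sample of skip-gram pairs on every call, which would be why the loop recomputes `batch_inputs` and `batch_labels` each iteration. A minimal runnable sketch, not the repo's exact code (`voc_size` and the toy `pairs` list are made up here):

```python
import numpy as np

voc_size = 8  # vocabulary size; made-up value, derived from the corpus in the real script

def random_batch(skip_grams, batch_size):
    # sample batch_size random (center, context) word-index pairs
    idx = np.random.choice(len(skip_grams), batch_size, replace=False)
    batch_inputs = [np.eye(voc_size)[skip_grams[i][0]] for i in idx]  # one-hot center words
    batch_labels = [np.eye(voc_size)[skip_grams[i][1]] for i in idx]  # one-hot context words
    return batch_inputs, batch_labels

# toy data: each call returns a different random mini-batch
pairs = [[0, 1], [1, 0], [2, 3], [3, 2], [4, 5], [5, 4]]
print(random_batch(pairs, 2))
```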
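
On the second question, my understanding is that in TensorFlow 1.x `W.eval()` is shorthand for `sess.run(W)` under the default session set up by the `with tf.Session() as sess:` block, so it returns the current value of the variable `W` as a NumPy array. A standalone sketch (the variable `v` is made up for illustration):

```python
import tensorflow as tf  # TensorFlow 1.x

v = tf.Variable([[1.0, 2.0], [3.0, 4.0]])  # hypothetical 2x2 variable

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    a = sess.run(v)  # fetch the variable's current value
    b = v.eval()     # same result: eval() uses the default session from the with-block
    print((a == b).all())  # True; both are plain NumPy arrays
```

If that is right, `trained_embeddings = W.eval()` inside the loop just snapshots the embedding matrix after each step, and moving it after the loop would capture only the final embeddings.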