The code seems not to be the same as in the paper
keyboardpianist opened this issue · 2 comments
keyboardpianist commented
In preprocess.py, context pairs are generated for every snapshot from 1 to T. Does the model then learn from snapshots 1 to T in order to predict links at time T?
def get_context_pairs(graphs, num_time_steps):
    """ Load/generate context pairs for each snapshot through random walk sampling."""
    load_path = "data/{}/train_pairs_n2v_{}.pkl".format(FLAGS.dataset, str(num_time_steps - 2))
    try:
        context_pairs_train = dill.load(open(load_path, 'rb'))
        print("Loaded context pairs from pkl file directly")
    except (IOError, EOFError):
        print("Computing training pairs ...")
        context_pairs_train = []
        for i in range(0, num_time_steps):
            context_pairs_train.append(run_random_walks_n2v(graphs[i], graphs[i].nodes()))
        dill.dump(context_pairs_train, open(load_path, 'wb'))
        print("Saved pairs")
    return context_pairs_train
Should it maybe be for i in range(0, num_time_steps - 1)?
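For illustration, a minimal sketch of the suggested change (this assumes the final snapshot graphs[num_time_steps - 1] is reserved as the link-prediction target and should not contribute training pairs; it is the questioner's proposal, not the repository's code):

    # Hypothetical variant of the loop above: sample random walks only on the
    # first num_time_steps - 1 snapshots, leaving the last snapshot for evaluation.
    context_pairs_train = []
    for i in range(0, num_time_steps - 1):
        context_pairs_train.append(run_random_walks_n2v(graphs[i], graphs[i].nodes()))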
zhaohaixiangbobo commented
In the main function, it replaces the edges of graphs[-1] with the edges of graphs[-2], so the final snapshot's true edges are not used during training.
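For context, a rough sketch of what that replacement looks like (paraphrased; names such as graphs, adjs, and num_time_steps are assumed from the training script, and the exact code there may differ):

    import networkx as nx

    # Keep the nodes of the last snapshot but only the edges of the
    # second-to-last snapshot, so the evaluation edges at time T are hidden.
    last = num_time_steps - 1
    new_G = nx.MultiGraph()
    new_G.add_nodes_from(graphs[last].nodes(data=True))
    new_G.add_edges_from(graphs[last - 1].edges())
    graphs[last] = new_G
    adjs[last] = nx.adjacency_matrix(new_G)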
trytodoit227 commented
Hello, do you know how negative samples are generated in models.py?
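Not an official answer, but for reference: unsupervised graph embedding losses of this kind (GraphSAGE-style, TF 1.x) typically draw negatives with a degree-weighted unigram candidate sampler. A minimal sketch with toy stand-ins for the values models.py would already have (degrees, positive context ids, sample size are all assumptions here):

    import numpy as np
    import tensorflow as tf

    # Toy stand-ins; in the real model these come from the graph and the batch.
    degrees = np.array([3, 1, 4, 2, 5], dtype=np.float64)  # per-node degree counts
    node_2 = tf.constant([[0], [2], [4]], dtype=tf.int64)   # positive context node ids
    neg_sample_size = 10

    # Degree-weighted unigram negative sampling (distortion 0.75, as in word2vec).
    neg_samples, _, _ = tf.nn.fixed_unigram_candidate_sampler(
        true_classes=node_2,
        num_true=1,
        num_sampled=neg_sample_size,
        unique=False,
        range_max=len(degrees),
        distortion=0.75,
        unigrams=degrees.tolist())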