benedekrozemberczki/SimGNN

line 213 of simgnn.py: losses.backward(retain_graph=True)

ChengzhiPiao opened this issue · 0 comments

In the process_batch() function of SimGNNTrainer, why is the retain_graph argument set to True when calling backward() on the loss?
I'm not sure this setting is necessary.
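For context, here is a minimal sketch of the pattern in question. This is a simplified stand-in, not the repository's actual process_batch() code: it just illustrates that when the loss is rebuilt from a fresh forward pass on every batch, backward() can normally be called without retain_graph=True, since that flag is only needed if the same computation graph is backpropagated through more than once.

```python
# Minimal sketch (not SimGNN's actual code): accumulate a per-batch loss
# from fresh forward passes and backpropagate once per batch.
import torch

model = torch.nn.Linear(8, 1)                     # stand-in for the SimGNN model
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

for _ in range(3):                                # a few dummy batches
    optimizer.zero_grad()
    losses = torch.zeros(1)
    for _ in range(16):                           # accumulate losses over graph pairs in the batch
        x = torch.randn(8)
        target = torch.randn(1)
        prediction = model(x)
        losses = losses + torch.nn.functional.mse_loss(prediction, target)
    # The graph above is rebuilt on every batch, so a plain backward() suffices;
    # retain_graph=True would only be required if this same graph had to be
    # backpropagated through a second time.
    losses.backward()
    optimizer.step()
```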