KeyError in backward_step Method Due to Missing depth0 in batch Dictionary
HITESH2002-JAIN opened this issue · 1 comment
I have found an issue with the code; here is the line:
mickey/lib/models/MicKey/model.py
Line 128 in 9cf842a
There is a condition that checks if
batch['depth0'].requires_grad
before performing a backward pass. However, the batch dictionary does not contain a depth0 key, so this lookup raises a KeyError. Here's the problematic code segment:
elif batch['depth0'].requires_grad:
torch.autograd.backward((torch.log(batch['final_scores'] + 1e-16),
batch['depth_kp0'], batch['depth_kp1']),
(probs_grad[0], outputs['depth0'].grad, outputs['depth1'].grad))
Is this step necessary? The default training implementation does not seem to execute this segment of code. Please let me know if I am mistaken.
Hello,
Thank you for spotting that. It is indeed a bug: the default training code does not execute that segment, since it is only used when we want to avoid training the 2D keypoint offsets. If you do not train the 2D keypoint offsets, then you might want to use that segment (and train the depth maps, descriptors, and keypoint scores).
To do so, the condition
elif batch['depth0'].requires_grad:
should be:
elif batch['depth_kp0'].requires_grad:
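With that change applied, the segment would read roughly as below. This is only a sketch: final_scores, depth_kp0/depth_kp1, probs_grad and the outputs gradients are the same variables quoted above, and the surrounding backward_step structure is omitted.
elif batch['depth_kp0'].requires_grad:
    # Backpropagate through the log-scores and the per-keypoint depths only,
    # leaving the 2D keypoint offsets untrained.
    torch.autograd.backward((torch.log(batch['final_scores'] + 1e-16),
                             batch['depth_kp0'], batch['depth_kp1']),
                            (probs_grad[0], outputs['depth0'].grad, outputs['depth1'].grad))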
I will push the change shortly, thanks again!