PRBonn/lidar-bonnetal

Question about stub

iris0329 opened this issue · 2 comments

Hi,
Thank you for generously open-sourcing this code!

I found the following piece of code:

        #  print number of parameters and the ones requiring gradients
        stub = torch.zeros((1,
                            self.backbone.get_input_depth(),
                            64,
                            1024))
        if torch.cuda.is_available():
            stub = stub.cuda()
            self.backbone.cuda()
        _, stub_skips = self.backbone(stub)

I am curious about the purpose of this piece of code.
Although there is a comment here, I still don't quite understand what the code does.
Could you please give a more detailed explanation?

I am looking forward to your reply.

Best wishes!

Hi,

This is a quick-and-dirty way to check the sizes of the activations in the encoder when I build the decoder. If you look at the backbone execution, it returns two values: the last feature volume, and all the feature volumes at all output strides (stub_skips in this case). The latter is then passed to the constructor of the decoder so that it can verify that its layers using skip connections have the proper sizes.
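To illustrate the pattern, here is a minimal, self-contained sketch. The ToyBackbone and ToyDecoder classes are hypothetical stand-ins, not the actual lidar-bonnetal modules: the backbone is run once on a zero tensor purely to record the shapes of its skip activations, and those recorded shapes are handed to the decoder constructor so it can size its skip-connection layers.

    import torch
    import torch.nn as nn

    # Hypothetical stand-in for the real backbone: forward() returns the last
    # feature volume plus a dict of intermediate activations keyed by output stride.
    class ToyBackbone(nn.Module):
        def __init__(self, input_depth=5):
            super().__init__()
            self.input_depth = input_depth
            self.down1 = nn.Conv2d(input_depth, 32, 3, stride=2, padding=1)
            self.down2 = nn.Conv2d(32, 64, 3, stride=2, padding=1)

        def get_input_depth(self):
            return self.input_depth

        def forward(self, x):
            s1 = self.down1(x)           # output stride 2
            s2 = self.down2(s1)          # output stride 4
            return s2, {2: s1, 4: s2}    # last volume + skip volumes

    # Hypothetical decoder: reads the channel counts off the recorded skips
    # to size its upsampling layer.
    class ToyDecoder(nn.Module):
        def __init__(self, stub_skips):
            super().__init__()
            channels = {os: feat.shape[1] for os, feat in stub_skips.items()}
            self.up = nn.ConvTranspose2d(channels[4], channels[2],
                                         kernel_size=2, stride=2)

        def forward(self, x, skips):
            return self.up(x) + skips[2]  # upsample, then add the matching skip

    backbone = ToyBackbone()

    # One dummy forward pass whose only purpose is to discover the skip shapes.
    stub = torch.zeros((1, backbone.get_input_depth(), 64, 1024))
    _, stub_skips = backbone(stub)

    decoder = ToyDecoder(stub_skips)     # decoder now knows the skip sizes

    last, skips = backbone(stub)         # sanity check: the shapes line up
    print(decoder(last, skips).shape)    # torch.Size([1, 32, 32, 512])

The stub pass itself does no learning; it is pure shape bookkeeping at construction time.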

@tano297
Thank you for your reply.
Does that mean that if I comment out these lines, it will not affect the program's behavior?