Where is the regional backpropagation implemented?
erik-koynov opened this issue · 2 comments
Dear Mr. Nishimura,
I have been reading your paper with interest and decided to look into the implementation (thank you very much for providing legible code) of the region backprop as defined in Eq. 5 of your paper "Weakly supervised cell instance segmentation under various conditions". However, I could not find it. The closest I got to a backpropagation-based region proposal method was in guided_model.py, around lines 160-180, namely the call img_grad.sum(1).clone().clamp(min=0).cpu().numpy(). What I read from this line is that the gradient used is the gradient w.r.t. the input, but with ReLU applied to it, which as far as I can see is not the same as in the paper.
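For concreteness, here is a tiny self-contained sketch of what I understand that line to do (the tensor values are made up for illustration): it sums the input gradient over the channel dimension and then clamps negatives to zero, i.e. a post-hoc ReLU applied to the finished saliency map rather than a change to the backward pass itself.

```python
import torch

# Hypothetical example input: img_grad stands in for the gradient of a
# score w.r.t. an input batch of shape (N, C, H, W) = (1, 2, 1, 2).
img_grad = torch.tensor([[[[-1.0, 2.0]],
                          [[3.0, -4.0]]]])

# The line in question: sum over channels, clamp negatives, move to numpy.
saliency = img_grad.sum(1).clone().clamp(min=0).cpu().numpy()
# Channel sum gives [[2.0, -2.0]]; clamping yields [[2.0, 0.0]].
```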
Where am I wrong?
Your paper is really interesting and I would really appreciate your clarification. :)
Yours,
Erik
Hi!
I appreciate your interest in my paper.
As you said, img_grad.sum(1).clone().clamp(min=0).cpu().numpy() does not directly implement Eq. 5.
I think the code you are looking for is the guide_relu function.
To implement Eq. 5, I replaced the backward behavior of the ReLU operation with guide_relu using the following code:
def _patch(self):
    for module in self.modules():
        if isinstance(module, nn.ReLU):
            module._original_forward = module.forward
            module.forward = MethodType(guide_relu, module)
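For readers finding this thread later: a minimal self-contained sketch of how such a guide_relu might look (my reading of the guided-backprop idea, not the repository's exact code). The forward pass is an ordinary ReLU, but a custom autograd.Function zeroes out negative incoming gradients in the backward pass as well, so that only positive gradients through positive activations survive.

```python
import torch
import torch.nn as nn
from types import MethodType


def guide_relu(self, x):
    # Hypothetical sketch of a guided ReLU. Forward: standard ReLU.
    # Backward: pass gradient only where the input was positive AND
    # the incoming gradient is positive.
    class GuidedReLUFn(torch.autograd.Function):
        @staticmethod
        def forward(ctx, inp):
            ctx.save_for_backward(inp)
            return inp.clamp(min=0)

        @staticmethod
        def backward(ctx, grad_output):
            (inp,) = ctx.saved_tensors
            return grad_output.clamp(min=0) * (inp > 0).float()

    return GuidedReLUFn.apply(x)


# Patch every ReLU in a toy model, as in the snippet above.
model = nn.Sequential(nn.Linear(2, 2), nn.ReLU())
for module in model.modules():
    if isinstance(module, nn.ReLU):
        module.forward = MethodType(guide_relu, module)
```

With this patch in place, calling backward() on the model's output propagates only the positive parts of the gradient through each ReLU.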
Does my answer make sense?
Kazuya
Hi Kazuya,
thank you for the quick reply. Yes, this was exactly what I was looking for, thanks a lot!
Erik :)