IgorSusmelj/pytorch-styleguide

Don't use plain forward in Python

justusschock opened this issue · 3 comments

Contrary to what you say in the section

An nn.Module can be used on input data in two ways, where the latter is commonly used for better readability. self.net(input) simply uses the __call__() method of the object to feed the input through the module.

output = self.net.forward(input)
# or
output = self.net(input)

it is not recommended to call the plain forward in Python.

If the two were equivalent, the __call__ method (which is invoked when you call an instance) would simply look like this:

def __call__(self, *args, **kwargs):
    return self.forward(*args, **kwargs)

But instead it is slightly more complex:

def __call__(self, *input, **kwargs):
    # run all registered forward pre-hooks before the actual forward
    for hook in self._forward_pre_hooks.values():
        hook(self, input)
    # dispatch to forward() (or the tracing variant when the JIT tracer is active)
    if torch._C._get_tracing_state():
        result = self._slow_forward(*input, **kwargs)
    else:
        result = self.forward(*input, **kwargs)
    # run all registered forward hooks on the result
    for hook in self._forward_hooks.values():
        hook_result = hook(self, input, result)
        if hook_result is not None:
            raise RuntimeError(
                "forward hooks should never return any values, but '{}'"
                "didn't return None".format(hook))
    # attach all registered backward hooks to the grad_fn of the output
    if len(self._backward_hooks) > 0:
        var = result
        while not isinstance(var, torch.Tensor):
            if isinstance(var, dict):
                var = next((v for v in var.values() if isinstance(v, torch.Tensor)))
            else:
                var = var[0]
        grad_fn = var.grad_fn
        if grad_fn is not None:
            for hook in self._backward_hooks.values():
                wrapper = functools.partial(hook, self)
                functools.update_wrapper(wrapper, hook)
                grad_fn.register_hook(wrapper)
    return result

because it also deals with all the registered hooks (which would simply be skipped when calling the plain forward).
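A quick way to see the difference, as a minimal sketch (the nn.Linear module and the print-only hook are chosen purely for illustration):

import torch
import torch.nn as nn

net = nn.Linear(4, 2)

def my_hook(module, inp, output):
    print("forward hook fired")

net.register_forward_hook(my_hook)

x = torch.randn(1, 4)

out1 = net(x)          # goes through __call__, prints "forward hook fired"
out2 = net.forward(x)  # bypasses __call__, the hook is silently skipped

The outputs themselves are identical here, but only the first call runs the registered hook.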

Thanks,

I didn't know that. I added the changes. Do you know whether it only affects the "hooks" or whether it has an impact on the forward/backward pass?

It does not have an impact on the forward/backward passes in general, but some more complex models use these hooks as part of their logic, since the hook mechanism is highly dynamic.
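For example, a hypothetical sketch (the FeatureLogger class and its names are made up here, not taken from the styleguide) of a model that collects intermediate activations through a forward hook; this logic silently breaks if the submodule is invoked via its plain forward:

import torch
import torch.nn as nn

class FeatureLogger(nn.Module):
    """Toy model that records the hidden activation via a forward hook."""
    def __init__(self):
        super().__init__()
        self.hidden = nn.Linear(8, 4)
        self.head = nn.Linear(4, 2)
        self.features = []
        # the hook is part of the model's logic, not just a debugging aid
        self.hidden.register_forward_hook(
            lambda module, inp, out: self.features.append(out.detach())
        )

    def forward(self, x):
        h = self.hidden(x)  # hook fires here and records the activation
        return self.head(h)

model = FeatureLogger()
model(torch.randn(1, 8))
print(len(model.features))  # 1 -- calling self.hidden.forward(x) instead would leave it empty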

thanks for the explanation!