google/nerfactor

MLP's skip connection is at the wrong layer

SirSykon opened this issue · 0 comments

Hello,

According to the paper, the MLP's skip connection should be at layer 2 (counting from 0, so the third layer), but I have seen strange behaviour using nerfactor, so I checked the code. `Network.call()` in `mlp.py` does the following:

    x_ = x + 0 # make a copy
    for i, layer in enumerate(self.layers):
        y = layer(x_)  # layer i is applied first ...
        if i in self.skip_at:
            y = tf.concat((y, x), -1)  # ... and only then is the input x concatenated
        x_ = y  # so the concatenated tensor is what layer i + 1 receives
    return y
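
To make the off-by-one concrete, here is a minimal, self-contained repro (a toy setup of my own with arbitrary Dense widths, not nerfactor's actual Network class) that mimics the loop and prints each layer's input width. With skip_at = {2}, it is layer 3, not layer 2, that sees the widened input:

    import tensorflow as tf

    layers = [tf.keras.layers.Dense(8) for _ in range(4)]
    skip_at = {2}  # skip nominally at layer 2, as in the paper
    x = tf.zeros((1, 3))  # dummy 3-channel input

    x_ = x + 0  # make a copy
    for i, layer in enumerate(layers):
        print(f"layer {i} input width: {x_.shape[-1]}")
        y = layer(x_)
        if i in skip_at:
            y = tf.concat((y, x), -1)
        x_ = y

    # Prints 3, 8, 8, 11: only layer 3 receives the
    # concatenated (8 + 3 = 11)-wide input.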

So the concatenation is applied after layer i is called, and the true skip connection therefore ends up at the next layer (the fourth one), not at layer 2 as the paper describes.
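
For what it's worth, here is a minimal sketch of what I would expect instead, assuming skip_at is meant to name the layer that receives the concatenated input (my reading of the paper, not a tested patch):

    x_ = x + 0  # make a copy
    for i, layer in enumerate(self.layers):
        if i in self.skip_at:
            # concatenate the input *before* layer i is applied,
            # so the skip truly lands at layer i
            x_ = tf.concat((x_, x), -1)
        x_ = layer(x_)
    return x_

(The layer widths would of course have to account for the concatenated input at the skip layer rather than the one after it.)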