gzuidhof/nn-transfer

I found why it doesn't work!

the "view" in pytorch and "flatten" in keras work differently.
To fix this problem add this layer before "Flatten":

x = Lambda(lambda x: K.permute_dimensions(x, (0, 3, 1, 2)))(x)
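
For anyone hitting this, here is a minimal sketch of where the permutation sits, assuming the standalone Keras API with a TensorFlow backend; the layer sizes and shapes are illustrative assumptions, not from the original model:

```python
# Minimal sketch, assuming channels-last Keras with arbitrary example shapes.
from keras.layers import Input, Conv2D, Lambda, Flatten, Dense
from keras.models import Model
from keras import backend as K

inp = Input(shape=(28, 28, 1))               # Keras default layout: NHWC
x = Conv2D(8, (3, 3), padding='same')(inp)
# Reorder NHWC -> NCHW so Flatten emits elements in the same order as
# PyTorch's tensor.view(batch_size, -1) on a channels-first tensor.
x = Lambda(lambda t: K.permute_dimensions(t, (0, 3, 1, 2)))(x)
x = Flatten()(x)
out = Dense(10)(x)
model = Model(inp, out)
```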

Thanks! I would love to see a PR for this. Let me know if I can help.

Great work!

Despite your solution, I don't seem to get accurate results. I haven't done a direct comparison yet; I've just scanned over the results. I'm going to tinker with it a bit more before I post my model and results. I'm not really sure it will be useful, though, since it's a pretty deep model.

I can say now that the model uses ReLU activation layers, MaxPool2d, Conv2d, and BatchNorm.

In my project I used exactly these layers: Conv2d, MaxPool, BatchNorm, and ReLU activation, and I got the same output.
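
A sketch of the kind of parity check this implies (the model objects and input shape are hypothetical, not from the thread): feed the same random input to both models and measure the gap.

```python
# Sketch only: pytorch_model and keras_model are assumed to already exist,
# with the Keras model built from the transferred weights.
import numpy as np
import torch

def max_abs_diff(pytorch_model, keras_model, shape_nchw=(1, 3, 32, 32)):
    """Run the same random input through both models and return the max gap."""
    x = np.random.rand(*shape_nchw).astype(np.float32)
    pytorch_model.eval()                       # freeze BatchNorm running stats
    with torch.no_grad():
        out_pt = pytorch_model(torch.from_numpy(x)).numpy()
    # Keras expects channels-last input by default, so transpose NCHW -> NHWC
    out_k = keras_model.predict(x.transpose(0, 2, 3, 1))
    return np.max(np.abs(out_pt - out_k))

# e.g. max_abs_diff(my_pytorch_net, my_keras_net) should be ~1e-6 when the
# transfer (plus the permute-before-Flatten fix) is correct.
```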

I'm not really sure if I've made a mistake. Do you think MaxPool2D works properly? I'll try to step through it later to see if it produces the right output.
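
A standalone sanity check for this (a sketch with arbitrary shapes, not from the thread): pooling has no weights, so both frameworks should produce identical outputs on the same input.

```python
# Pool the same random tensor in PyTorch and Keras and compare.
import numpy as np
import torch
import torch.nn.functional as F
from keras.layers import MaxPooling2D
from keras.models import Sequential

x = np.random.rand(1, 4, 8, 8).astype(np.float32)       # NCHW for PyTorch

out_pt = F.max_pool2d(torch.from_numpy(x), 2).numpy()   # 2x2 pool, stride 2

km = Sequential([MaxPooling2D(pool_size=2, input_shape=(8, 8, 4))])
out_k = km.predict(x.transpose(0, 2, 3, 1))              # feed NHWC to Keras
out_k = out_k.transpose(0, 3, 1, 2)                      # back to NCHW

print(np.allclose(out_pt, out_k))                        # expect True
```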

It worked for me.