piEsposito/blitz-bayesian-deep-learning

Question about adding linear bayesian layers on top of a conv network

asadabbas09 opened this issue · 3 comments

Thanks for this awesome work.

I'm working with GCN networks, and I would like to introduce uncertainty into the predictions. I have added Bayesian linear layers on top of my convolutional network (as the last layers).

I'm getting good prediction results with uncertainty.

I'm not sure whether this is a good way to introduce uncertainty, or whether I need to use weight uncertainty in every single layer of the network.
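
For concreteness, the setup looks roughly like the sketch below (a minimal, hypothetical example using BLiTZ's `BayesianLinear`; the small conv backbone is a stand-in for my GCN feature extractor, and `HybridNet` is just a name I made up):

```python
import torch.nn as nn
from blitz.modules import BayesianLinear
from blitz.utils import variational_estimator

@variational_estimator  # adds sample_elbo() and weight-freezing helpers
class HybridNet(nn.Module):
    """Deterministic conv backbone with a Bayesian linear head."""
    def __init__(self, num_classes=10):
        super().__init__()
        # Ordinary (deterministic) feature extractor -- a stand-in
        # for the GCN backbone in my actual model.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Weight uncertainty only in the last (linear) layers.
        self.head = nn.Sequential(
            BayesianLinear(32, 64), nn.ReLU(),
            BayesianLinear(64, num_classes),
        )

    def forward(self, x):
        x = self.features(x).flatten(1)  # (N, 32)
        return self.head(x)
```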

I have a similar issue. I suggest adding an MNIST example to show how the layers work in a classification setting and with conv networks.
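
Along those lines, a training step for such a hybrid model might look like the sketch below. This assumes the `HybridNet` sketched above and BLiTZ's `variational_estimator` decorator, whose `sample_elbo` method averages the loss over several weight samples and adds the KL complexity cost; `train_loader` is a placeholder for e.g. an MNIST `DataLoader`:

```python
import torch
import torch.nn as nn

model = HybridNet(num_classes=10)  # hypothetical model from the sketch above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for inputs, labels in train_loader:  # placeholder MNIST DataLoader
    optimizer.zero_grad()
    # sample_elbo (added by @variational_estimator) draws sample_nbr weight
    # samples, averages the criterion over them, and adds the KL term.
    loss = model.sample_elbo(inputs=inputs, labels=labels,
                             criterion=criterion, sample_nbr=3)
    loss.backward()
    optimizer.step()
```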

Hello @asadabbas09, and thank you so much for using BLiTZ. Introducing uncertainty only on the last layer is a clever move: it can solve your problem without a huge cost in memory usage and training time. That is sound in theory and works in practice; even the paper that introduced Bayes By Backprop (or possibly the Flipout one, I don't recall which) suggests that usage.

So you are doing everything right, and it is not mandatory to use weight uncertainty across the whole network.
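
For reference, the predictive uncertainty can then be read out at inference time by averaging several stochastic forward passes. The helper below is a minimal sketch (`predict_with_uncertainty` is a made-up name), assuming the Bayesian layers resample their weights on every forward call, as BLiTZ layers do unless frozen:

```python
import torch

@torch.no_grad()
def predict_with_uncertainty(model, x, n_samples=30):
    """Monte Carlo estimate of the predictive mean and spread.

    Each pass through the Bayesian head samples fresh weights, so the
    std across passes reflects the model's epistemic uncertainty.
    """
    probs = torch.stack([model(x).softmax(dim=-1) for _ in range(n_samples)])
    return probs.mean(dim=0), probs.std(dim=0)
```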

Best regards,
-Pi

Nice idea.
Thank you.