nicholas-leonard/dp

Transfer Learning - how to "fix" weights in certain modules

jrbtaylor opened this issue · 1 comment

I'm working on an application where a CNN feeds into an RNN and am attempting to use an ImageNet pre-trained CNN. Is there a way to fix the weights in just that one module so they're not updated during training?

@jrbtaylor Yes. Just override the accGradParameters method on that module:

module = loadMyImageNetModel()
-- no-op override: calling accGradParameters on this module no longer
-- accumulates gradients, so its pretrained weights stay fixed
module.accGradParameters = function(self) end
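For context, a minimal sketch of how this might look when the frozen CNN feeds a trainable RNN, using plain nn containers. The checkpoint path, the buildMyRNN constructor, and the extra updateParameters stub are illustrative assumptions, not part of dp or of the answer above.

require 'nn'

-- pretrained ImageNet CNN (hypothetical checkpoint path)
local cnn = torch.load('pretrained_imagenet_cnn.t7')

-- freeze it: accGradParameters becomes a no-op for this module
cnn.accGradParameters = function(self) end

-- optional extra safeguard (an assumption, not required by the answer):
-- stub updateParameters so a learning-rate update cannot touch the CNN
cnn.updateParameters = function(self, learningRate) end

-- the frozen CNN feeds its features into a trainable recurrent part
local model = nn.Sequential()
model:add(cnn)
model:add(buildMyRNN())  -- hypothetical constructor for the trainable RNN

Depending on how the training loop drives the backward pass (for example, through the container's backward() or through a flattened getParameters()/optim update), the override may also need to be applied to the CNN's submodules, or its parameters excluded from the optimizer.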