fmassa/optimize-net

Torch object expected error on getParameters

szagoruyko opened this issue · 8 comments

Wanted to give it a try on imagenet and got this, errors out on getParameters.

luajit: Module.lua:218: Torch object, table, thread, cdata or function expected
stack traceback:
    [C]: in function 'pointer'
    Module.lua:218: in function 'flatten'
    Module.lua:292: in function 'getParameters'
    train.lua:148: in main chunk

Did you set the training mode for the optimizer?

If you want to use it in evaluation mode and still call getParameters, you should disable the gradParameter zeroing by passing the option removeGradParams=false.
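For reference, a minimal sketch of what that might look like, assuming a network `net` and a sample `input` of the right shape (the option names follow my reading of the optnet README; double-check them against the current docs):

```lua
local optnet = require 'optnet'

-- a dummy input with the shape the network expects
local input = torch.zeros(1, 3, 224, 224)

-- optimize for inference, but keep the gradParameters around
-- so that net:getParameters() still works afterwards
local opts = {mode = 'inference', removeGradParams = false}
optnet.optimizeMemory(net, input, opts)

-- this call would previously error out with removeGradParams at its default
local params = net:getParameters()
```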

Yes, it works this way; I had missed this option in the README. Thanks!

Thanks for opening the issue. I'll improve the README with more detailed information about the available options in optimizeMemory.

Just a heads-up on this: 1f69144 adds a note in the README about the available optimization options, 7e698ef disables getParameters in inference mode, and ea0b101 avoids double memory optimization. Those changes should hopefully make optnet less error-prone to use.
Thanks again @szagoruyko for opening this issue!

@fmassa Dear friend, I ran into the same problem. How do I use the option removeGradParams=false in code?

@szagoruyko Dear friend, I ran into the same problem. What change did you make in your code?

I suppose all you need to do is pass removeGradParams=false in the options table.
But I haven't used or maintained this library in a long time, so things are not fresh in my mind.
I'd recommend using PyTorch now, as it provides better memory savings than this library does.