daigo0927/pwcnet

Regarding sharing weights in the feature pyramid

Simpatech-app opened this issue · 3 comments

Thanks for your great and clear implementation.

However, I cannot run the code; I am receiving this error: `Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?`
It happens when I run the AdamOptimizer, and it complains about returning an existing variable for the feature pyramid.
I would appreciate it if you could help solve this error.
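For context, here is a minimal sketch (not code from this repository) of the kind of double graph construction that raises this error in TF 1.x:

```python
import tensorflow as tf  # TF 1.x

# Hypothetical reproduction: creating the same variable twice in one
# default graph triggers the VarScope error quoted above.
def build_feature_pyramid():
    with tf.variable_scope('feature_pyramid'):
        return tf.get_variable('conv1_weights', shape=[3, 3, 3, 16])

w = build_feature_pyramid()  # first build: OK
w = build_feature_pyramid()  # second build in the same graph raises:
# ValueError: Variable feature_pyramid/conv1_weights already exists,
# disallowed. Did you mean to set reuse=True or reuse=tf.AUTO_REUSE
# in VarScope?
```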

Hmm..., I'm sorry, I can't figure out the cause right now.
Is it train.py that you tried to run? Can you share the environment (i.e. options, library versions, ...) in which the error occurred?

Exactly, I am trying to run train.py. I found out that I can run it, but I have to restart the kernel every time.
I am running on Windows 10, Spyder 3.2.8, Python 3.5.4 (64-bit), and TensorFlow 1.9.

Thanks

I think Spyder keeps variables in the workspace, including the weights of the feature extractor, even after training finishes.
If those variables remain when variables with the same names are created again, the above error occurs.
Restarting the kernel clears those variables (though it is tedious); instead, I recommend running tf.reset_default_graph(), which also clears all TensorFlow variables.
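A minimal sketch of that workaround, assuming it is placed at the top of the training script or run in the console before re-running it (build_model here is a placeholder for the repository's actual graph setup):

```python
import tensorflow as tf  # TF 1.x

# Discard all ops and variables left over from a previous run in the
# same kernel, so the graph can be rebuilt without name clashes.
tf.reset_default_graph()

# Now the model and optimizer can be constructed on the empty graph:
# loss = build_model(...)  # placeholder for the repo's graph setup
# train_op = tf.train.AdamOptimizer(1e-4).minimize(loss)
```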