KeyError: 'optimizer'
Closed this issue · 2 comments
Hello! Thank you for your wonderful work. After I finished training TSR and LaMa and tried to train FTR, the following bug occurred:
-- Process 0 terminated with the following error:
.
.
.
File "E:\paper\dl_and_ai\image_inpainting\image_completion_code\code2022ZITS_inpainting-main\src\models\FTR_model.py", line 244, in load_rezero
self.gen_optimizer.load_state_dict(data['optimizer'])
KeyError: 'optimizer'
I've carefully checked the code and confirmed that when I trained LaMa, the key 'optimizer' was saved with its state_dict by the function 'LaMaBaseInpaintingTrainingModule.save()', just like the other keys such as 'iteration' and 'generator'. Fortunately, the code runs successfully when I comment out these two lines:
In FTR_model.py
line 244 # self.gen_optimizer.load_state_dict(data['optimizer'])
line 254 # self.dis_optimizer.load_state_dict(data['optimizer'])
I suspect something is wrong with the optimizer's state_dict, because loading the other saved parameters still works.
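Instead of commenting the lines out, a more robust workaround might be to load the optimizer state only when the checkpoint actually contains it. This is a hypothetical sketch, not the repo's actual code; the helper name 'load_optimizer_if_present' is made up for illustration:

```python
import torch

# Hypothetical fallback: skip the optimizer state when it is missing from
# the checkpoint dict, instead of deleting the load_state_dict() calls.
def load_optimizer_if_present(optimizer, data, key='optimizer'):
    if key in data:
        optimizer.load_state_dict(data[key])
        return True
    print(f"Warning: '{key}' not found in checkpoint; "
          "optimizer will start with fresh statistics.")
    return False

# Usage sketch with a dummy model and optimizer.
net = torch.nn.Linear(2, 2)
opt = torch.optim.Adam(net.parameters())
ok = load_optimizer_if_present(opt, {'iteration': 5})  # no 'optimizer' key
print(ok)  # False
```

The same guard could wrap both the generator's and discriminator's optimizer loads in load_rezero.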
By the way, may I add you on WeChat for future academic exchanges? I learned about your work from Professor Fu's report at VALSE 2023. I am now a second-year master's student and I am interested in ZITS and ZITS++. Here is my email, and I look forward to your reply: xqj1800801111@163.com.
Hi, you can check the checkpoint you have saved after the training of LaMa. There should be an item for the optimizer.
Though you can indeed train FTR without loading the optimizer, there is a dramatic performance drop during the first several hundred or thousand steps. Training longer may alleviate this and reach proper performance.
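To verify the checkpoint, you can inspect its keys directly. The sketch below (using a dummy model and a temporary file rather than the repo's real checkpoint path) shows how an 'optimizer' entry should round-trip through torch.save and torch.load:

```python
import os
import tempfile
import torch

# Save a checkpoint in the same dict style the issue describes
# ('iteration', 'generator', 'optimizer'), then reload and inspect it.
net = torch.nn.Linear(4, 4)
opt = torch.optim.Adam(net.parameters())

ckpt_path = os.path.join(tempfile.mkdtemp(), 'lama_ckpt.pth')
torch.save({'iteration': 0,
            'generator': net.state_dict(),
            'optimizer': opt.state_dict()}, ckpt_path)

data = torch.load(ckpt_path, map_location='cpu')
print(sorted(data.keys()))  # ['generator', 'iteration', 'optimizer']
opt.load_state_dict(data['optimizer'])  # succeeds when the key is present
```

If the 'optimizer' key is missing from your saved file, the checkpoint was probably written by an older or modified save() and should be regenerated.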
My WeChat ID is qwe845015004
Thanks for your interest in our work.