Error when switching to test mode during training
yx-chan131 opened this issue · 5 comments
Hi @xuhangc, I encountered the same problem as you. May I know how you fixed it?
As @yuhaoliu7456 suggested, I wrote an if statement to skip input['param'] during testing:
```python
if 'param' in input:  # input['param'] does not exist during testing
    self.shadow_param = input['param'].to(self.device).type(torch.float)
```
However, I found that the variable self.shadow_param is still used in the forward method:
```python
addgt = self.shadow_param[:, [0, 2, 4]]
mulgt = self.shadow_param[:, [1, 3, 5]]
```
Therefore I still encounter an error when switching to test mode during training.
Originally posted by @yx-chan131 in #27 (comment)
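To make the setup concrete, here is a minimal sketch of the guard I tried; apart from the quoted lines, the class name, constructor, and method layout are placeholders rather than the repo's exact code:

```python
import torch

class FusionModelSketch:
    """Sketch only: guards both the input access and the forward-time use
    of shadow_param. All names besides the quoted lines are placeholders."""

    def __init__(self, device='cpu'):
        self.device = device
        self.shadow_param = None

    def set_input(self, input):
        # input['param'] exists only during training, so guard the access
        if 'param' in input:
            self.shadow_param = input['param'].to(self.device).type(torch.float)
        else:
            self.shadow_param = None

    def forward(self):
        # guard the forward-time use as well, otherwise test mode still crashes
        if self.shadow_param is not None:
            addgt = self.shadow_param[:, [0, 2, 4]]  # ground-truth additive terms
            mulgt = self.shadow_param[:, [1, 3, 5]]  # ground-truth multiplicative terms
            return addgt, mulgt
        return None
```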
Hi, addgt and mulgt are not used later, so you can comment them out.
@fl82hope Hi, thanks for your prompt reply. To be clear, did you mean that the variables addgt and mulgt, like input['param'], are not used during testing?
Thanks!
Yes, exactly.
I changed self.shadow_param to shadow_param_pred depending on the training-state flag, and the code runs. But somehow the results turn out very bad. I suppose there should be a better solution.
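Roughly, the switch looks like this; shadow_param_pred stands for the network's predicted parameters, and the helper and its names are illustrative, not the repo's exact code:

```python
import torch

def select_shadow_params(shadow_param_gt, shadow_param_pred, is_train):
    """Pick ground-truth parameters during training, predictions at test time.
    Both tensors have shape (N, 6); all names here are placeholders."""
    params = shadow_param_gt if (is_train and shadow_param_gt is not None) else shadow_param_pred
    add = params[:, [0, 2, 4]]  # additive terms
    mul = params[:, [1, 3, 5]]  # multiplicative terms
    return add, mul
```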
I also encountered the same problem. How did you modify Fusion_model.py to make it run normally?