ewrfcas/MVSFormer

KeyError: 'optimizer'

Liupei-Luna opened this issue · 6 comments

When training with is_finetune = True, a KeyError: 'optimizer' is raised.

By default, we finetune our model based on the previous optimizer weights. If you are using our provided weights, which do not include the optimizer state, you can comment out this line:

optimizer.load_state_dict(checkpoint['optimizer'])
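
For reference, a minimal sketch of a guarded load that works with and without the optimizer state (checkpoint_path, model, and optimizer are placeholders, and the 'state_dict' key is an assumption about the checkpoint layout, not MVSFormer's confirmed format):

import torch

checkpoint = torch.load(checkpoint_path, map_location='cpu')
model.load_state_dict(checkpoint['state_dict'])  # assumed key for the model weights
# Resume the optimizer only when the checkpoint actually carries its state,
# e.g. the released weights ship without it
if 'optimizer' in checkpoint:
    optimizer.load_state_dict(checkpoint['optimizer'])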

OK, but then another error appears at this line:

self.scaler.step(self.optimizer)

assert self._scale is not None, "Attempted {} but _scale is None. ".format(funcname) + fix
AssertionError: Attempted step but _scale is None. This may indicate your script did not use scaler.scale(loss or outputs) earlier in the iteration.
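
For context, this assertion comes from PyTorch's GradScaler: _scale is only initialized by the first scaler.scale(loss) call, so calling scaler.step() in an iteration where scale() never ran trips it. A minimal repro sketch (toy model and optimizer, assumes a CUDA device is available):

import torch

model = torch.nn.Linear(4, 2).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scaler = torch.cuda.amp.GradScaler()
# No scaler.scale(loss).backward() has run in this iteration, so _scale
# is still None and step() raises the assertion quoted above
scaler.step(optimizer)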

Strange problem.
If you have set fp16, scaler.scale should run in this branch:

if self.fp16:
    self.scaler.scale(loss).backward()
else:
    loss.backward()
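
Note that the optimizer step needs a matching branch as well: if self.scaler.step(self.optimizer) runs unconditionally while fp16 is False, scale() never runs and the assertion fires. A hedged sketch of how the full iteration could be branched (compute_loss is a hypothetical helper, not MVSFormer's actual trainer code):

self.optimizer.zero_grad()
if self.fp16:
    with torch.cuda.amp.autocast():
        loss = self.compute_loss(batch)  # hypothetical helper
    self.scaler.scale(loss).backward()
    self.scaler.step(self.optimizer)
    self.scaler.update()
else:
    loss = self.compute_loss(batch)  # hypothetical helper
    loss.backward()
    # plain step; routing this through self.scaler.step() without a
    # prior scale() is what raises "_scale is None"
    self.optimizer.step()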

The default setting is fp16 = true; can it be changed to false?
if self.fp16:
    self.scaler.scale(loss).backward()
else:
    loss.backward()

This is the same as my code, but the error still appears.

fp16 can be set to False; it is only used to save memory and speed up training.
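
Alternatively, torch.cuda.amp.GradScaler takes an enabled flag: with enabled=False, scale() returns the loss unchanged and step() falls through to a plain optimizer.step(), so one code path covers both settings. A minimal sketch (loss, optimizer, and fp16 are placeholders from the surrounding trainer):

import torch

scaler = torch.cuda.amp.GradScaler(enabled=fp16)
# With enabled=False, scale() is a pass-through and step() just calls
# optimizer.step(), so no fp16 branching is needed in the loop
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()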

The same error has occurred:
AssertionError: Attempted step but _scale is None. This may indicate your script did not use scaler.scale(loss or outputs) earlier in the iteration.
@Liupei-Luna May I ask whether the problem has been solved? Thanks.
@ewrfcas I use the default config "config_mvsformer-p.json"; only the batch size is changed to 2 (because I only have two RTX 3090 cards with 24GB of memory).