Seed does not work for autoencoder (results not reproducible)
Closed this issue · 1 comment
omid-ghozatlou commented
As you know, when we set the seed to any number except -1, the results should not change when rerunning. However, I set the seed to 10 and used pretraining, and the results of both pretraining and training change widely across runs. It is worth mentioning that if pretraining is set to False, the network is trained without any pre-trained weights, and the result of repeated training runs is identical for a given seed value (except, of course, -1).
Does anybody know why the seed (used for reproducibility) does not work for the autoencoder?
omid-ghozatlou commented
I solved the problem by adding the following lines for CUDA:
import torch

torch.manual_seed(cfg.settings['seed'])            # seed the CPU RNG
torch.cuda.manual_seed(cfg.settings['seed'])       # seed the current CUDA device
torch.cuda.manual_seed_all(cfg.settings['seed'])   # seed all CUDA devices
torch.backends.cudnn.deterministic = True          # force deterministic cuDNN kernels
torch.backends.cudnn.benchmark = False             # disable non-deterministic autotuning
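For anyone hitting the same issue, here is a minimal self-contained sketch of a seeding helper, assuming PyTorch plus NumPy and Python's built-in random are used somewhere in the data pipeline; the set_seed name is just illustrative and is not part of this repo:

import random
import numpy as np
import torch

def set_seed(seed: int) -> None:
    # Seed every RNG a typical PyTorch training script touches.
    random.seed(seed)                 # Python's built-in RNG
    np.random.seed(seed)              # NumPy RNG (e.g. shuffling, augmentation)
    torch.manual_seed(seed)           # CPU RNG
    torch.cuda.manual_seed_all(seed)  # all CUDA devices
    torch.backends.cudnn.deterministic = True   # deterministic cuDNN kernels
    torch.backends.cudnn.benchmark = False      # no non-deterministic autotuning

# Call once, before the autoencoder and dataloaders are built, e.g.:
set_seed(10)

On newer PyTorch versions (1.8+), you can additionally call torch.use_deterministic_algorithms(True) so that PyTorch raises an error on any remaining non-deterministic operation.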