pytorch-ignite/code-generator

Let's make `train_epoch_length` and `eval_epoch_length` optional

vfdev-5 opened this issue · 4 comments

The `config.yaml` files require two params: `train_epoch_length` and `eval_epoch_length`.
Let's make them optional.

train_epoch_length: 1000
eval_epoch_length: 1000

When they are omitted, the epoch length should be defined by the input dataloaders.
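A possible shape for the change (a sketch only; `config` as a plain dict is an assumption here, the project actually parses YAML into its own config object): fall back to the dataloader's length when the key is absent.

```python
config = {"max_epochs": 2}  # train_epoch_length deliberately omitted

dataloader_train = list(range(1000))  # stand-in for a torch DataLoader

# Missing or None -> use the dataloader's own length as the epoch length
train_epoch_length = config.get("train_epoch_length") or len(dataloader_train)
print(train_epoch_length)  # 1000
```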

I can take up the issue

Hi, could you please elaborate on how the input dataloaders define the epoch length? `torch.utils.data.DataLoader` has no parameter defining `epoch_length`.

@DeepC004 check this code for an example of how `train_epoch_length` is used:

trainer.run(
    dataloader_train,
    max_epochs=config.max_epochs,
    epoch_length=config.train_epoch_length,
)
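For context, Ignite's `Engine.run` already treats `epoch_length` as optional: when it is `None`, the engine infers the epoch length from `len(data)`. A toy simulation of that fallback (not Ignite's actual implementation):

```python
def run(data, max_epochs, epoch_length=None):
    """Stand-in for ignite's Engine.run: infer epoch_length when omitted."""
    if epoch_length is None:
        epoch_length = len(data)  # the dataloader's length defines the epoch
    return max_epochs * epoch_length  # total iterations the engine would run

dataloader_train = list(range(1000))  # stand-in for a DataLoader
print(run(dataloader_train, max_epochs=2))  # 2000: epoch length inferred as 1000
```

So passing `epoch_length=None` (rather than requiring a value in the config) is enough to delegate the decision to the dataloader.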

@vfdev-5 I have a doubt regarding the optional values. Aren't the values specified for `train_epoch_length` and `eval_epoch_length` already the default ones? In case the user does not input a custom value for a parameter, the default ones will be used as defined by the code.