Let's make `train_epoch_length` and `eval_epoch_length` optional
vfdev-5 opened this issue · 4 comments
vfdev-5 commented
Generated `config.yaml` files require two params: `train_epoch_length` and `eval_epoch_length`.
Let's make them optional, such that when they are omitted the epoch length is defined by the input dataloaders.
DeepC004 commented
I can take up the issue
DeepC004 commented
Hi, could you please elaborate on how the input dataloaders define the epoch length? `torch.utils.data.DataLoader` has no parameter defining `epoch_length`.
vfdev-5 commented
@DeepC004 check this code for an example of how `train_epoch_length` is used:
code-generator/src/templates/template-vision-classification/main.py
Lines 107 to 111 in b25945f
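To illustrate the idea, here is a minimal sketch of how the templates could fall back to the dataloader's length when the config key is missing or null. The helper name `resolve_epoch_length` is hypothetical, not part of the codebase; note that ignite's `Engine.run` already behaves this way when `epoch_length=None` is passed, so the fix may amount to forwarding `None` through.

```python
def resolve_epoch_length(config: dict, key: str, dataloader) -> int:
    """Return the configured epoch length, or fall back to len(dataloader)
    when the key is absent from the config (or set to null in config.yaml)."""
    value = config.get(key)
    return value if value is not None else len(dataloader)


# Stand-in for a DataLoader that yields 100 batches per epoch.
train_loader = list(range(100))

# Key omitted: epoch length comes from the dataloader.
print(resolve_epoch_length({}, "train_epoch_length", train_loader))  # 100

# Key set explicitly: the configured value wins.
print(resolve_epoch_length({"train_epoch_length": 25}, "train_epoch_length", train_loader))  # 25
```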