pytorchbearer/torchbearer

Support custom data loaders

MattPainter01 opened this issue · 0 comments

The current way to use a custom data loader is to swap the default sampler for the custom one using a callback that runs on on_start_training and on_start_validation. This breaks our infinite data loading, which wraps the data loader via the inject_sampler decorator applied to _fit_pass.
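The workaround above can be sketched as follows. This is a minimal, self-contained toy in plain Python, not torchbearer itself: the names `Trial`, `on_start_training`, `on_start_validation`, and the `GENERATOR` state key mirror the torchbearer API but are stand-ins here, so the mechanics of the swap are visible without the library.

```python
GENERATOR = "generator"  # stand-in for the torchbearer.GENERATOR state key


class SwapLoaderCallback:
    """Callback that replaces the loader in state at phase start."""

    def __init__(self, custom_loader):
        self.custom_loader = custom_loader

    def on_start_training(self, state):
        # Swap the default loader for the custom one before training runs
        state[GENERATOR] = self.custom_loader

    def on_start_validation(self, state):
        # Do the same before validation runs
        state[GENERATOR] = self.custom_loader


class Trial:
    """Toy trial: fires callbacks, then iterates whatever loader is in state."""

    def __init__(self, loader, callbacks=()):
        self.state = {GENERATOR: loader}
        self.callbacks = list(callbacks)

    def run_training_pass(self):
        for cb in self.callbacks:
            cb.on_start_training(self.state)
        # Consume the (possibly swapped) loader; an infinite-loading wrapper
        # applied earlier would be silently bypassed by the swap above
        return list(self.state[GENERATOR])


default_loader = [1, 2, 3]
custom_loader = [10, 20, 30]
trial = Trial(default_loader, callbacks=[SwapLoaderCallback(custom_loader)])
batches = trial.run_training_pass()  # → [10, 20, 30]
```

Because the callback overwrites the loader in state wholesale, anything a decorator wrapped around the original loader (such as infinite re-iteration) is lost, which is the breakage described above.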

We should have an easier way to do this that neither breaks infinite loading nor requires an extra callback. Perhaps a with_data_loader fluent method on Trial.
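One possible shape for such a fluent method is sketched below. This is a self-contained toy, not torchbearer's actual API: `with_data_loader`, `_loader_factory`, and the simplified `_fit_pass` are hypothetical. The idea is that the trial stores a loader factory and applies it inside the fit pass, so wrappers applied at that point (like the inject_sampler infinite-loading wrapper mentioned above) still see and can wrap the custom loader.

```python
class Trial:
    """Toy trial with a hypothetical with_data_loader fluent setter."""

    def __init__(self, loader):
        self.loader = loader
        self._loader_factory = None  # hypothetical hook set by with_data_loader

    def with_data_loader(self, factory):
        """Register a factory that turns the default loader into a custom one."""
        self._loader_factory = factory
        return self  # fluent: return self to allow method chaining

    def _fit_pass(self):
        loader = self.loader
        if self._loader_factory is not None:
            # The custom loader is built here, inside the fit pass, so any
            # wrapping applied at this point (e.g. infinite loading) would
            # wrap the custom loader rather than being bypassed
            loader = self._loader_factory(loader)
        return list(loader)


trial = Trial([1, 2, 3]).with_data_loader(lambda dl: (x * 10 for x in dl))
batches = trial._fit_pass()  # → [10, 20, 30]
```

Taking a factory rather than a finished loader keeps construction inside _fit_pass, which is what lets decorators like inject_sampler keep working.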