Utilizing larger training datasets via a "chunked" generator
pkerins opened this issue
pkerins commented
As documented in #16, we are currently unable to train models using training datasets beyond a certain size. Brookie has suggested and outlined a workaround, which he implemented in the repository.
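
For orientation, here is a minimal sketch of what such a chunked generator might look like. This is not Brookie's actual implementation; it assumes the training data is stored as paired `.npy` arrays on disk, and the function name, file paths, and parameters are hypothetical stand-ins for whatever the repository actually uses.

```python
import numpy as np

def chunked_generator(x_path, y_path, batch_size):
    """Yield (x, y) batches without loading the full arrays into memory.

    np.load with mmap_mode="r" memory-maps the .npy files, so only the
    slices actually read are pulled into RAM.
    """
    x = np.load(x_path, mmap_mode="r")
    y = np.load(y_path, mmap_mode="r")
    n = x.shape[0]
    while True:  # loop forever; a Keras-style fit() stops via steps_per_epoch
        for start in range(0, n, batch_size):
            end = min(start + batch_size, n)
            # np.asarray copies just this slice out of the memory map
            yield np.asarray(x[start:end]), np.asarray(y[start:end])
```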
Brookie's code needs to be proofread and tested to ensure it loads batches properly. It then needs to be inserted into a normal model training workflow and run end to end (a sketch of such a smoke test follows below). A model successfully trained using this alternative generator would prompt closure of this issue; how well that model performs is a separate item.
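
A smoke test could look something like the following, assuming a Keras-style workflow where `model` is an already compiled model from the usual training setup; the batch size and sample count shown are assumed values:

```python
import math

batch_size = 32       # assumed value
n_samples = 100_000   # assumed number of training samples
gen = chunked_generator("x_train.npy", "y_train.npy", batch_size)

# steps_per_epoch is required because the generator loops indefinitely
model.fit(gen, steps_per_epoch=math.ceil(n_samples / batch_size), epochs=1)
```

If one epoch completes with sensible loss values, the generator is loading batches correctly and the issue can move toward closure.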
For extra context and explanation, you can see the posts in Slack starting here.