What happens when we clip batch_size?
RomainGoussault opened this issue · 2 comments
RomainGoussault commented
The batch size is clipped here --> https://github.com/SubstraFoundation/distributed-learning-contributivity/blob/b72fa98c0b4db45d368f577d0f6d1a861b1610c2/scenario.py#L584
So if it is clipped, it means that sometimes we don't use all the data in the dataset, but we don't give any feedback to the user. We should detect when this happens and inform the user somehow.
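For illustration, the silent clipping looks roughly like this (a minimal sketch with hypothetical names and a placeholder cap, not the actual scenario.py code):

```python
MAX_BATCH_SIZE = 2048  # placeholder value, not the real constant

def compute_batch_size(dataset_size: int, batches_count: int) -> int:
    batch_size = dataset_size // batches_count
    # When the clip below is applied, batches_count * MAX_BATCH_SIZE can be
    # smaller than dataset_size, so part of the dataset is never seen in an
    # epoch -- and currently nothing tells the user this happened.
    return min(batch_size, MAX_BATCH_SIZE)
```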
bowni commented
- Update `MAX_BATCH_SIZE` (value to be determined)
- Add a log when the clipping is triggered (log level `INFO`), as sketched below
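A minimal sketch of the second point, assuming standard Python logging and a placeholder `MAX_BATCH_SIZE` (the project's actual logging setup and the final value may differ):

```python
import logging

logger = logging.getLogger(__name__)

MAX_BATCH_SIZE = 2048  # placeholder; the final value is still to be determined

def compute_batch_size(dataset_size: int, batches_count: int) -> int:
    batch_size = dataset_size // batches_count
    if batch_size > MAX_BATCH_SIZE:
        # Log at INFO level so the user knows the batch size was clipped and
        # that not all of the dataset will be used in each epoch.
        logger.info(
            "Batch size clipped from %d to MAX_BATCH_SIZE=%d; "
            "some data will not be used in each epoch.",
            batch_size,
            MAX_BATCH_SIZE,
        )
        batch_size = MAX_BATCH_SIZE
    return batch_size
```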
arthurPignet commented
I think that `MAX_BATCH_SIZE` should be related to the available GPU memory, so it will be dataset-dependent.
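For example, a rough heuristic along these lines (the helper, its parameters, and the memory fraction are hypothetical, not something the project implements) could derive the cap from the per-sample size and the available GPU memory:

```python
import numpy as np

def gpu_aware_max_batch_size(sample: np.ndarray,
                             gpu_memory_bytes: int,
                             memory_fraction: float = 0.5) -> int:
    # Memory needed to hold one input sample; activations and gradients are
    # not accounted for, hence the conservative memory_fraction.
    bytes_per_sample = sample.nbytes
    budget = int(gpu_memory_bytes * memory_fraction)
    return max(1, budget // bytes_per_sample)
```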