How to increase the batch size
hzlhzlhzlhzl commented
```python
train_data = train_dataset.loader(
    batch_size=1, num_workers=args['num_workers'], shuffle=True)
```

I want to modify the batch size, but I encountered an error when I changed it. Additionally, I've noticed that memory usage is quite low when using the new version of torch. Do you have any idea why this is happening? Thank you for your assistance.
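For context, here is a minimal sketch of how increasing `batch_size` normally behaves, assuming `train_dataset.loader` wraps `torch.utils.data.DataLoader` (the toy `TensorDataset` below is hypothetical, not from this repo). Raising `batch_size` only requires passing a larger value, provided every sample has the same shape so the default collate function can stack them; datasets with variable-sized samples typically raise a collation error and need a custom `collate_fn`:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 8 samples, each of fixed shape (3,).
xs = torch.arange(24, dtype=torch.float32).reshape(8, 3)
ys = torch.arange(8)
train_dataset = TensorDataset(xs, ys)

# Increasing batch_size from 1 to 4: the default collate function stacks
# the 4 samples into a single (4, 3) tensor because all shapes match.
train_data = DataLoader(train_dataset, batch_size=4, num_workers=0, shuffle=True)

for xb, yb in train_data:
    print(xb.shape)  # each batch is (4, 3)
```

If samples vary in size (e.g. variable-length sequences), batch sizes above 1 usually require padding inside a custom `collate_fn` passed to the loader.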