How do you deal with the memory error when the number of patches is set to 760000 for the CHASE dataset? It's too big
BOBKINGS1101 opened this issue · 4 comments
Hi, without any modification of the code, you can increase the system swap size; alternatively, you can change the dataloader so that each batch is loaded on the fly instead of pre-storing everything in memory.
Thank you! How can I change the dataloader to load batches on the fly?
I trained and tested on the STARE dataset, but found the results are not very good (Se, sensitivity, is very low). Did you train on the STARE dataset? If you have experimented with it, could you tell me how many patches you extracted?
The same as in the code: a very large number of patches. You can first generate those patches and store them on disk, then write a Dataset class that reads them in batch by batch, as shown here: https://pytorch.org/tutorials/beginner/data_loading_tutorial.html
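For reference, here is a minimal sketch of that approach, assuming each patch and its mask were pre-generated and saved as individual `.npy` files. The directory names and file layout (`patches/train`, one file per patch, masks with matching file names) are hypothetical; adapt them to however you store the patches:

```python
import glob
import os

import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader


class PatchDataset(Dataset):
    """Loads one (patch, mask) pair from disk per __getitem__ call,
    so only the current batch lives in memory."""

    def __init__(self, patch_dir, mask_dir):
        # Assumes patches and masks are stored as .npy files with matching names.
        self.patch_paths = sorted(glob.glob(os.path.join(patch_dir, "*.npy")))
        self.mask_dir = mask_dir

    def __len__(self):
        return len(self.patch_paths)

    def __getitem__(self, idx):
        patch_path = self.patch_paths[idx]
        mask_path = os.path.join(self.mask_dir, os.path.basename(patch_path))
        patch = np.load(patch_path)  # e.g. shape (C, H, W)
        mask = np.load(mask_path)
        return torch.from_numpy(patch).float(), torch.from_numpy(mask).float()


# The DataLoader then pulls batches lazily, so all 760000 patches
# never need to sit in RAM at once.
if __name__ == "__main__":
    dataset = PatchDataset("patches/train", "patches/train_masks")
    loader = DataLoader(dataset, batch_size=32, shuffle=True, num_workers=4)
    for patches, masks in loader:
        pass  # training step goes here
```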