mateuszbuda/brain-segmentation-pytorch

Very large images in my dataset

RoxySJ opened this issue · 0 comments

Hello,
I'm trying to train this on my own dataset, which contains very large images (7844 x 7786, for example). What I'm doing is slicing each image into 256x256 tiles and treating each large original image as one of your "patients", so my volume arrays look like [860, 256, 256, 3], where 860 is the number of tiles from one image. However, I'm running into memory problems when I try to create the dataset: when it reaches the crop_sample or pad_sample functions, I run out of memory.

I'm trying to fit generator expressions into your dataset structure (see the sketch below), but I haven't been very successful so far. I've never used generators before and might be doing something wrong, so I'm still working on it. I would like to know if you have any suggestions, or whether you think your network simply won't work with images this big.