Mukosame/Zooming-Slow-Mo-CVPR-2020

Is there any way to solve cuda out of memory problem when input image is large?

universewill opened this issue · 1 comment

My input is about 1240×650. How can I get around this problem?

Hi @universewill , your input is really large! Although there are some tricks (see this issue) that we can try to make the network accept larger images, I'm sorry that your input would still be too large.
Sorry again that I haven't provided a script supporting large-image inference. If you are willing to rewrite the dataloading & test functions, here is a general idea of how to achieve it:

  • First, set a threshold based on your GPU memory cap, and iteratively crop your input into patches with a padding (overlap) region so that every patch fits within the threshold.
  • Send the patch sequences through inference, then stitch the outputs back together iteratively in the same manner.
    This way, you should be able to test on large input images. (Just pay attention to the overlap areas and make them blend naturally.)
    Hope this solves your problem!
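To make the idea above concrete, here is a minimal, framework-agnostic sketch of tiled inference with overlap blending. It is not from this repo: `tiled_inference`, `infer_fn`, `tile`, and `overlap` are hypothetical names, the tile size stands in for your memory threshold, and you would replace `infer_fn` with a call to the actual model (and handle the temporal dimension of the sequences yourself). The feathered weight map makes overlapping regions blend smoothly instead of showing seams.

```python
import numpy as np

def tiled_inference(image, infer_fn, tile=256, overlap=32):
    """Run infer_fn on overlapping tiles of `image` (H, W, C) and
    blend the per-tile outputs with linear feathering.

    tile    -- tile side length; pick it so one tile fits in GPU memory
    overlap -- padding region shared by neighboring tiles
    """
    h, w = image.shape[:2]
    out = np.zeros(image.shape, dtype=np.float64)
    weight = np.zeros((h, w, 1), dtype=np.float64)
    step = tile - overlap  # stride between tile origins

    for y in range(0, h, step):
        for x in range(0, w, step):
            y1, x1 = min(y + tile, h), min(x + tile, w)
            patch = image[y:y1, x:x1]
            result = infer_fn(patch)  # model call goes here

            # Feathered weights: ramp up from every tile edge so that
            # overlapping tiles cross-fade instead of leaving seams.
            wy = np.minimum(np.arange(y1 - y) + 1, np.arange(y1 - y)[::-1] + 1)
            wx = np.minimum(np.arange(x1 - x) + 1, np.arange(x1 - x)[::-1] + 1)
            wmap = np.minimum.outer(wy, wx)[..., None].astype(np.float64)

            out[y:y1, x:x1] += result * wmap
            weight[y:y1, x:x1] += wmap

    return out / weight  # every pixel is covered, so weight > 0
```

With an identity `infer_fn` this reconstructs the input exactly, which is a handy sanity check before plugging in the real network; for super-resolution you would additionally scale the output coordinates by the upscaling factor.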