LAION-AI/dalle2-laion

Can the inference script be modified to consume less than 12 GB of RAM?

XieBaoshi opened this issue · 2 comments

Otherwise it crashes on the Colab free tier, and won't even finish downloading the prior.

The two models combined are larger than 12 GB, so there is not much we can do about the newest models. I will be adding a config option to load models just in time instead of preloading them, but that won't help as we scale further. Unfortunately, this is just where ML is headed.
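For anyone wanting to experiment before that config option lands, the just-in-time idea can be sketched generically: wrap each large model in a lazy loader so the prior and decoder are never resident at the same time. This is a minimal sketch, not the repo's actual API; `load_fn` is a hypothetical callable standing in for whatever builds or downloads the real model.

```python
import gc


class LazyModel:
    """Load a model only on first use, and allow freeing it afterwards.

    `load_fn` is a hypothetical loader callable; in this repo it would
    wrap the prior or decoder construction/checkpoint loading.
    """

    def __init__(self, load_fn):
        self._load_fn = load_fn
        self._model = None

    def get(self):
        if self._model is None:
            # Just-in-time load: nothing is resident until first call.
            self._model = self._load_fn()
        return self._model

    def unload(self):
        # Drop the reference so Python (and, for torch modules moved to
        # GPU, CUDA caching allocator after torch.cuda.empty_cache())
        # can reclaim the memory.
        self._model = None
        gc.collect()


# Usage: run the prior, free it, then load the decoder, so only one
# large model is in memory at a time. Strings stand in for real models.
prior = LazyModel(lambda: "prior-weights")
decoder = LazyModel(lambda: "decoder-weights")

embedding = prior.get()   # prior loads here
prior.unload()            # free RAM before the decoder loads
image = decoder.get()
```

This keeps peak memory near the size of the single largest model rather than the sum of both, which is why it helps on a 12 GB Colab instance but stops helping once any one model alone exceeds available RAM.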

nousr commented

I believe this has been answered. If you have further questions, feel free to re-open the issue with additional details regarding your question :D !