madebyollin/maple-diffusion

Diffusing results in gray image

elrid opened this issue · 6 comments

elrid commented

No matter what I input to the model, and no matter what noise I start with, I get the same result: just a gray image.

[screenshot IMG_3291: the gray output image]

Environment:

Xcode 14.1 beta 2
iOS 16.1 beta 3
iPhone 13 Pro

Python 3.9, numpy 1.23.4, torch 1.12.1, pytorch-lightning 1.7.7
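In case it helps with debugging, here is how I would confirm the output is genuinely flat rather than just dark. This is only a sketch; `pixels` is a hypothetical name for the decoder's raw float RGB buffer, not a variable from the app:

```swift
import Foundation

// Sketch: detect a degenerate (solid gray) output from the decoder's
// raw float pixels before they are converted to a UIImage.
// `pixels` is a hypothetical name for the decoded RGB float buffer.
func looksFlat(_ pixels: [Float]) -> Bool {
    guard !pixels.isEmpty else { return true }
    let mean = pixels.reduce(0, +) / Float(pixels.count)
    let variance = pixels.reduce(0) { $0 + ($1 - mean) * ($1 - mean) }
        / Float(pixels.count)
    return variance < 1e-4  // essentially a single solid color
}
```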

You're lucky you can run it on your iPhone 13 Pro! It runs into memory issues on my iPhone 14 Pro Max. This screenshot is from my M1 iPad running iPadOS 15.6, with no issues. It might be because you're on the beta.

[screenshot IMG_0338: generation working on the M1 iPad]

@elrid Hmm, that's really strange! It could indeed be an iOS 16.1 beta issue (if MPSGraph is broken on the beta somehow). Does image generation work on macOS?
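If you get a chance, a tiny MPSGraph smoke test that is independent of the model might narrow it down. This is just a sketch (not code from this repo), but incorrect or all-zero output here would implicate the framework itself:

```swift
import Foundation
import Metal
import MetalPerformanceShadersGraph

// Sketch: run a trivial 2x2 addition through MPSGraph and read it back.
// Wrong output here would point at the framework, not the model.
func mpsGraphSmokeTest() {
    let device = MTLCreateSystemDefaultDevice()!
    let graph = MPSGraph()
    let a = graph.placeholder(shape: [2, 2], dataType: .float32, name: "a")
    let sum = graph.addition(a, a, name: "sum")

    let input: [Float] = [1, 2, 3, 4]
    let inputData = MPSGraphTensorData(
        device: MPSGraphDevice(mtlDevice: device),
        data: Data(bytes: input, count: input.count * MemoryLayout<Float>.stride),
        shape: [2, 2],
        dataType: .float32)

    let results = graph.run(feeds: [a: inputData],
                            targetTensors: [sum],
                            targetOperations: nil)

    var output = [Float](repeating: 0, count: 4)
    results[sum]!.mpsndarray().readBytes(&output, strideBytes: nil)
    print(output)  // expected: [2.0, 4.0, 6.0, 8.0]
}
```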

Sadly, I still have an Intel Mac, and it errors on build.

[screenshot: build error on the Intel Mac, 2022-10-15]

Ah, got it, thanks for trying! We may be stuck at this point, unfortunately... I will try to get a phone running the iOS 16.1 beta, to see if I can reproduce the gray-image issue... but even if I reproduce it, I'm not confident I'll be able to find a workaround 😅

elrid commented

I've got a friend on 16.0, and he has no issues with diffusion. However, each of his steps takes twice as long as mine (4 s vs. 2 s), while loading on my device is 4 times faster (55 s for him vs. 13 s for me).
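(For anyone else comparing numbers, a rough stopwatch like the one below gives comparable load/step timings. It's only a sketch: `loadModel()` and `denoiseStep()` are stand-ins for the app's real calls, not actual API.)

```swift
import Foundation

// Sketch: a generic stopwatch for comparing model-load time against
// per-step time across devices.
func timed<T>(_ label: String, _ body: () throws -> T) rethrows -> T {
    let start = DispatchTime.now().uptimeNanoseconds
    let result = try body()
    let seconds = Double(DispatchTime.now().uptimeNanoseconds - start) / 1e9
    print("\(label): \(String(format: "%.2f", seconds)) s")
    return result
}

// Hypothetical call sites (loadModel/denoiseStep are not real API here):
// let model = timed("load") { loadModel() }
// timed("step") { model.denoiseStep() }
```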

It seems like the weights are not being loaded on the 16.1 beta, but there are no changes to the MPSGraph framework that would explain this, so it looks like an iOS bug :)
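One way to test that hypothesis would be to dump statistics for a weight file after loading. Below is a sketch that assumes a flat array of Float16 values (which may not match maple-diffusion's actual on-disk layout) and an arm64 target for the Float16 type:

```swift
import Foundation

// Sketch: sanity-check one weight file by computing simple statistics.
// Assumes the file is a flat array of Float16 values; the real layout
// used by maple-diffusion may differ.
func inspectWeights(at url: URL) throws {
    let data = try Data(contentsOf: url)
    guard !data.isEmpty else { print("empty file"); return }
    var sum: Float = 0
    var maxAbs: Float = 0
    data.withUnsafeBytes { raw in
        for half in raw.bindMemory(to: Float16.self) {
            let v = Float(half)
            sum += v
            maxAbs = max(maxAbs, abs(v))
        }
    }
    let count = data.count / MemoryLayout<Float16>.stride
    print("values: \(count), mean: \(sum / Float(count)), maxAbs: \(maxAbs)")
    // A mean of 0 with maxAbs == 0 (or NaN) would mean nothing was loaded.
}
```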

Happening on my M1 MacBook Air running the Ventura beta, but not on my MacBook Pro running Monterey 12.6, nor on my M1 iPad Pro running the 16.1 beta. The app doesn't finish loading on my iPhone 13 mini (but that's unrelated to this particular issue).