xiefan-guo/initno

endless loop


This code runs in an endless loop at times

I second this. I tried the prompt "a dog and a tiger" with seed 50, and as the optimization proceeds it does not reach optimal noise within the first 5 * 10 attempts (i.e., Tmax_round * Tmax_step). It then falls back to running fn_initno for 50 steps and loops endlessly there.
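For context, the control flow I'm describing looks roughly like this. This is only a sketch: fn_initno, Tmax_round (5) and Tmax_step (10) are names/values from the repo and paper, while the stub body, latent shape, and loss threshold are illustrative placeholders so the snippet runs standalone:

```python
import torch

LOSS_THRESHOLD = 0.2  # illustrative value, not the repo's


def fn_initno(noise: torch.Tensor, max_step: int) -> tuple[torch.Tensor, float]:
    """Stand-in for the repo's noise optimization; returns (noise, loss)."""
    loss = 1.0
    for _ in range(max_step):
        loss *= 0.9  # pretend the loss decreases each step
    return noise, loss


optimal_noise = None
noise = torch.randn(1, 4, 64, 64)                 # SD latent shape
for _ in range(5):                                # Tmax_round rounds ...
    noise, loss = fn_initno(noise, max_step=10)   # ... of Tmax_step steps each
    if loss < LOSS_THRESHOLD:                     # noise accepted as "optimal"
        optimal_noise = noise
        break
    noise = torch.randn(1, 4, 64, 64)             # resample for the next round

if optimal_noise is None:
    # Fallback: a longer 50-step run of fn_initno. In my SD 2.1 setup the
    # loss never crosses the threshold here, and this is the stage where
    # the run keeps looping.
    optimal_noise, loss = fn_initno(noise, max_step=50)
```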

I'm sorry for the delayed response. We tried the prompt "a dog and a tiger" with seed 50 but did not encounter the issue you mentioned. Here are the output and log:
[attached: output image "a dog and a tiger_50" and run log]
I hope this has been helpful.

Hello, firstly thank you very much for your response. I forgot to mention earlier that I am using stable-diffusion-2-1. I patched your code based on this Attend-and-Excite commit, yuval-alaluf/Attend-and-Excite@15c30b1, which adds support for 2.1 (see issue yuval-alaluf/Attend-and-Excite#14), essentially changing the slicing of the cross-attention maps in fn_compute_loss() like this: https://github.com/yuval-alaluf/Attend-and-Excite/blob/163efdfd341bf3590df3c0c2b582935fbc8e8343/pipeline_attend_and_excite.py#L198C8-L208C1
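For readers who don't want to follow the link, the slicing change looks roughly like this. I'm paraphrasing the commit from memory; the helper name and the token ids below are illustrative, and the exact code is at the permalink above:

```python
import torch


def slice_text_attention(attention_maps: torch.Tensor,
                         token_ids: list[int],
                         normalize_eot: bool) -> torch.Tensor:
    """Paraphrase of the linked Attend-and-Excite change: SD 2.1's OpenCLIP
    tokenizer pads the prompt, so a fixed [:, :, 1:-1] slice would keep
    padding tokens; instead, slice up to the real end-of-text index."""
    last_idx = -1
    if normalize_eot:                  # enabled for SD 2.1
        last_idx = len(token_ids) - 1  # index of the EOT token
    # Drop the start-of-text token (index 0) and EOT/padding tokens.
    return attention_maps[:, :, 1:last_idx]


# Illustrative shapes: 16x16 attention over 77 token positions.
maps = torch.rand(16, 16, 77)
ids = [49406, 320, 1929, 537, 320, 22118, 49407]  # "<sot> a dog and a tiger <eot>" (illustrative ids)
print(slice_text_attention(maps, ids, normalize_eot=True).shape)  # torch.Size([16, 16, 5])
```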

That would explain why you get a clean run when you try the same prompt and seed. I set max_iter_to_alter to 20 (half of my inference steps) because I run 40 inference steps. I use your code as provided up to the point where I obtain the optimized latents, which I then use in my own code. I acknowledge that your paper states you use SD v1; I am just wondering what could cause the endless loop in the initial noise optimization. If I have missed something or there is a fault in my reasoning, please do not hesitate to point it out.
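In case it helps anyone else hitting this, the workaround I'm experimenting with on my side is simply to bound the fallback and keep the best noise seen so far. This is purely a sketch, not the repo's actual code: fn_initno_step is a hypothetical single-step helper, and the threshold is illustrative:

```python
import torch

LOSS_THRESHOLD = 0.2  # illustrative


def fn_initno_step(noise: torch.Tensor) -> tuple[torch.Tensor, float]:
    """Hypothetical single optimization step, standing in for one
    iteration of the repo's fn_initno so the sketch runs standalone."""
    return noise, float(torch.rand(()))


noise = torch.randn(1, 4, 64, 64)
best_noise, best_loss = noise, float("inf")
for _ in range(50):                          # hard cap on the fallback
    noise, loss = fn_initno_step(noise)
    if loss < best_loss:                     # track the best candidate so far
        best_noise, best_loss = noise.clone(), loss
    if loss < LOSS_THRESHOLD:                # converged: stop early
        break
else:
    noise = best_noise  # never converged: proceed with the best noise seen
```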