ai-dock/comfyui

Progress updates from RunPod serverless while generating

simonjaq opened this issue · 3 comments

Hi. I have successfully deployed a video-to-video app with a React frontend and a ComfyUI backend using AI-Dock. The only issue I have is that I would like to give the user clear feedback on how long generation will take. Since it's video, a job can run for several minutes, and my current workaround is to calculate an estimate up front and start a timer as soon as the job is in progress.
But of course it would be much more elegant to get actual progress updates from the ComfyUI backend. Is this somehow possible, perhaps even with the existing container, or by modifying it?

I'm currently re-working serverless to make it compatible with a variety of infrastructure providers. Alongside this, I'm looking at ways to provide better feedback for long-running processes.

RunPod has several options for enabling this, but I am working on a solution that is less reliant on any one provider's built-in features; this will allow better scaling and fail-over.

Although I don't yet have a concrete plan for this, I expect it will involve running a self-hosted intermediate server for request management and load balancing, with a feedback loop. This should allow the handler to issue periodic updates which your interface can poll (ideally with streaming to follow, but not immediately).
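As a rough illustration of that kind of feedback loop, here is a minimal sketch; the intermediate server and its /jobs/{id}/progress endpoint are hypothetical placeholders, not an existing ai-dock or RunPod API:

```python
import time
import requests

# Hypothetical self-hosted intermediate server; not part of ai-dock today.
PROGRESS_SERVER = "https://progress.example.com"

def report_progress(job_id: str, step: int, total: int) -> None:
    """Push a periodic update that the frontend can poll from the same server."""
    try:
        requests.post(
            f"{PROGRESS_SERVER}/jobs/{job_id}/progress",
            json={"step": step, "total": total, "ts": time.time()},
            timeout=5,
        )
    except requests.RequestException:
        # Progress reporting is best-effort; never fail the job because of it.
        pass

# The worker would call report_progress(job_id, current_step, total_steps)
# whenever ComfyUI advances a step; the React frontend then polls
# GET {PROGRESS_SERVER}/jobs/{job_id}/progress to drive its progress bar.
```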

Hopefully this will solve your issue while also bringing more stability and flexibility to deployment.

Thanks. Yes, this would be absolutely perfect.
I was thinking along similar lines: if the serverless backend could stream the job ID and the ComfyUI log to a server, it would be easy to build a precise progress bar from that.
I know that RunPod serverless has a progress_update feature, but I couldn't figure out how to actually use it.
If you have any docs or examples that would help me understand this better, that would be great. I'll share anything I figure out in case it can be folded into your bigger re-work.
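For anyone landing here looking for the progress_update piece: below is a minimal sketch of how it is typically wired into a RunPod handler, forwarding ComfyUI's own websocket progress events. The ComfyUI address, the job["input"]["workflow"] input shape, and the prompt-submission details are assumptions for illustration, not anything shipped in this image:

```python
import json
import urllib.request
import uuid

import runpod
import websocket  # websocket-client package

COMFY_HTTP = "http://127.0.0.1:8188"            # assumed local ComfyUI address
COMFY_WS = "ws://127.0.0.1:8188/ws?clientId={}"  # ComfyUI's websocket endpoint

def handler(job):
    """Run a ComfyUI prompt and forward its progress messages to RunPod.

    runpod.serverless.progress_update() attaches the message to the job, so a
    client polling RunPod's /status/{job_id} endpoint sees it while the job runs.
    """
    client_id = str(uuid.uuid4())
    ws = websocket.WebSocket()
    ws.connect(COMFY_WS.format(client_id))

    # Queue the workflow; job["input"]["workflow"] is an assumed input schema.
    payload = json.dumps(
        {"prompt": job["input"]["workflow"], "client_id": client_id}
    ).encode()
    urllib.request.urlopen(
        urllib.request.Request(
            f"{COMFY_HTTP}/prompt",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
    )

    while True:
        message = ws.recv()
        if isinstance(message, bytes):
            continue  # binary preview frames; not useful for progress
        event = json.loads(message)
        if event["type"] == "progress":
            data = event["data"]
            runpod.serverless.progress_update(
                job, f"sampling {data['value']}/{data['max']}"
            )
        elif event["type"] == "executing" and event["data"]["node"] is None:
            break  # ComfyUI signals completion with an empty node

    ws.close()
    # ... collect and return the generated video here ...
    return {"status": "complete"}

runpod.serverless.start({"handler": handler})
```

With something like this, polling RunPod's /status/{job_id} endpoint should return the latest progress string, which a frontend can parse into a real progress bar instead of a timer-based estimate.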

Previously supported handlers are deprecated. New serverless methods will be documented in due course.