YangLing0818/RPG-DiffusionMaster

Comfy Implementation

ShadoW-Shinigami opened this issue · 26 comments

Are there any plans for a ComfyUI node implementation of this?

Yeah, the Comfy workflow is flexible. I will try to do that.

Second this! Would LOVE to see this in ComfyUI.

Superb idea.

Thanks for your valuable suggestions. We will try to do that, and we welcome collaborators.

Yes, this would be awesome!

Just chiming in here to say, I too, would love to see this on ComfyUI. :)

Thanks for your hard work with RPG!

Super excited to see this @threefoldo

Oh, great news! Any ETA?

+1 for supporting ComfyUI by @comfyanonymous

Haven't made any progress. The code is mixed with the WebUI code, and I'm not sure how to separate them.

As a workaround, if the code is too intertwined with the GUI, could we not create a wrapper node which calls the RPG python script, passing in the arguments, and returns the output image from it? Just as a temporary solution.
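A rough sketch of what I mean, assuming the RPG script can be driven from the command line (the script path, flag names, and output file here are guesses for illustration, not the repo's actual interface):

```python
# Minimal sketch of a ComfyUI wrapper node that shells out to the RPG script.
# Assumption: the script accepts a prompt and an output path on its CLI --
# check RPG-DiffusionMaster's actual arguments before relying on these flags.
import subprocess
import numpy as np
import torch
from PIL import Image

class RPGWrapperNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "prompt": ("STRING", {"multiline": True}),
            "script_path": ("STRING", {"default": "RPG.py"}),
        }}

    RETURN_TYPES = ("IMAGE",)
    FUNCTION = "generate"
    CATEGORY = "RPG"

    def generate(self, prompt, script_path):
        out_path = "rpg_output.png"
        # Hypothetical CLI: the real script's arguments may be named differently.
        subprocess.run(
            ["python", script_path, "--user_prompt", prompt, "--output", out_path],
            check=True,
        )
        # Convert to the [B, H, W, C] float tensor ComfyUI expects for IMAGE.
        img = np.array(Image.open(out_path).convert("RGB")).astype(np.float32) / 255.0
        return (torch.from_numpy(img)[None, ...],)

NODE_CLASS_MAPPINGS = {"RPGWrapperNode": RPGWrapperNode}
```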

Probably, yes, but ComfyUI wouldn't be aware of the resources that RPG uses, so people could hit OOM errors when loading ComfyUI models and RPG models together.

Sounded easier when I thought of it, but I didn't think about the 23.5 GB of VRAM you need for the LLM alone when region prompting the areas... it will be tough.

Yes, the runtime memory usage is about 15 GB of VRAM; I'm not sure about the peak value. It's still a valid solution to treat this as a single node in ComfyUI.

There's code for regional prompting in Comfy in @ltdrdata's Inspire Pack that may be useful: https://github.com/ltdrdata/ComfyUI-Inspire-Pack

This repo looks like a game changer. A Comfy Implementation would be awesome.

Hi @threefoldo, I created a WebUI extension, sd-webui-rpg-diffusionmaster. Maybe you can find some inspiration in it.

Why exactly was this issue closed? It is not resolved. There is a WebUI extension available, but no ComfyUI implementation as far as the https://ltdrdata.github.io/ listing shows.

Hello @threefoldo, do you have any news for us? I hope you didn't give up on this.

Xyem commented

I'm trying to replicate how this works with my oobabooga nodes. I'm currently experimenting with ImpactPack's regional prompt nodes, so I can figure out how to tie them together.

EDIT: My current thinking is that there will be one node to provide a suitable input to the Oobabooga nodes, another to extract the values from the LLM response, and another to convert those into regional prompts that can be fed into the regional prompt sampler. Feel free to speak up if any of that sounds like it won't work (or if you have an alternative idea!)
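For the extraction step, something like this is roughly what I have in mind, assuming the LLM can be coaxed into replying with "Region N: ..." lines plus a "Split ratio: ..." line (the real RPG planner output is more elaborate, so treat this format as a placeholder):

```python
# Sketch of the "extract values from the LLM response" node's core logic.
# Assumed reply format (hypothetical, not RPG's actual output):
#   Split ratio: 1;1,1
#   Region 0: a knight in silver armor
#   Region 1: a dragon perched on a cliff
import re

def parse_llm_response(text: str):
    regions = []
    for line in text.splitlines():
        m = re.match(r"\s*Region\s+(\d+)\s*:\s*(.+)", line)
        if m:
            regions.append((int(m.group(1)), m.group(2).strip()))
    m = re.search(r"Split ratio\s*:\s*([\d,;\s]+)", text)
    ratios = m.group(1).strip() if m else None
    return regions, ratios
```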

If you want to apply a Regional Prompt, you might need an adapter node that generates a mask on a grid basis using a text prompt.
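For instance, a minimal version of such an adapter could rasterize one cell of an evenly divided grid into a mask tensor (RPG also supports uneven split ratios, which this sketch ignores):

```python
# Sketch: build a binary mask for grid cell (row, col) in a rows x cols grid.
# The text prompt for the region would be paired with this mask downstream.
import torch

def grid_mask(height, width, rows, cols, row, col):
    mask = torch.zeros((height, width), dtype=torch.float32)
    y0, y1 = height * row // rows, height * (row + 1) // rows
    x0, x1 = width * col // cols, width * (col + 1) // cols
    mask[y0:y1, x0:x1] = 1.0
    return mask.unsqueeze(0)  # ComfyUI MASK tensors are commonly [B, H, W]
```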

Xyem commented

Yes, this would be one of the things that the node converting the extracted values to regional prompts would do. It would also generate the mask for the area the LLM defines for each region.

I have already got my Oobabooga nodes set up to get a response that I think is similar to what RPG-DiffusionMaster would be outputting (I can't check because I don't have enough VRAM to run it). I'm currently getting to know the RegionalPrompt nodes from your packs before I start coding these extra nodes to get them working together.