FizzleDorf/AIT

Large positive prompts trigger 'Error in function: AITemplateModelContainerRun'

receptakill opened this issue · 1 comment

This is great — it gives me roughly a 40% speed boost when it works — but oddly, going above some upper limit of characters in the positive prompt throws 'Error in function: AITemplateModelContainerRun' during execution. Neither syntax nor token count seems to affect it, and the same doesn't happen for the negative prompt. The limit also varies with the nodes loaded, the model used, etc., though I don't notice a clear pattern. With the default workflow with AITemplateLoader and AITemplateVAEDecode subbed/chained in and the basev1.5pruned model, the character limit appears to be ~590, for instance.

Traceback attached. Any clue what's going on?
stack.txt

10f7510
This commit increased the maximum number of CLIP chunks, enabling up to 2,310 tokens.
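To illustrate why a fixed chunk budget produces a hard prompt-length limit, here is a minimal sketch (an assumption about the mechanism, not the repo's actual code): CLIP-style encoders process prompts in fixed chunks of 77 tokens (75 content tokens plus BOS/EOS), and an AITemplate model is compiled for a fixed maximum number of chunks, so a prompt that needs more chunks than the compiled budget fails at run time instead of being truncated. 2,310 tokens corresponds to 30 chunks of 77. The function and constant names below are hypothetical.

```python
CHUNK_CONTENT = 75  # content tokens per 77-token CLIP chunk (excludes BOS/EOS)


def chunks_needed(num_content_tokens: int) -> int:
    """Number of 77-token CLIP chunks a prompt of this many content tokens requires."""
    # Ceiling division; even an empty prompt occupies one chunk.
    return max(1, -(-num_content_tokens // CHUNK_CONTENT))


def fits_compiled_budget(num_content_tokens: int, max_chunks: int) -> bool:
    """True if the prompt fits the chunk budget the model was compiled for."""
    return chunks_needed(num_content_tokens) <= max_chunks


# 30 chunks * 75 content tokens = 2,250 content tokens,
# i.e. 30 * 77 = 2,310 tokens including special tokens.
print(chunks_needed(2250))              # chunks for a maximal prompt
print(fits_compiled_budget(2251, 30))   # one token over the budget
```

This also explains why the observed limit is in characters rather than tokens and varies between models and workflows: the character-to-token ratio depends on the tokenizer and the prompt text, so the same compiled chunk budget maps to different character counts.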