aredden/flux-fp8-api

Load a LoRA using the API

acaladolopes opened this issue · 3 comments

First of all @aredden , great work! Thank you so much for everything

I don't have a capable GPU and I use vast.ai a lot to rent 4090s, so this API is fantastic for my requirements. The only issue I have now is with LoRA loading.

Is there currently a way to load a LoRA using the API, or to start the API with a parameter for the LoRA path? Either solution would work for me. I'm a C# guy and I don't want to mess with your FluxPipeline in Python directly.

Thanks again!

Ah, have you tried the removable-lora branch here? https://github.com/aredden/flux-fp8-api/tree/removable-lora - it has exactly what you're asking for. (I believe, unless I haven't pushed the API yet, unsure)

Ah, I guess I hadn't added it yet, but I just did.

I merged the changes into the main branch, so hopefully that will help (also updated the README).
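Since the thread never shows what a LoRA-loading request actually looks like, here is a minimal sketch of how a client in any language (including C#) might build the request body. Note the endpoint path `/lora` and the field names `lora_path` and `scale` are assumptions for illustration, not the repo's confirmed API - check the updated README for the real contract.

```python
import json

def build_lora_request(lora_path: str, scale: float = 1.0) -> str:
    """Build a JSON body for a hypothetical /lora endpoint.

    The field names here are assumptions; the actual API may differ.
    """
    return json.dumps({"lora_path": lora_path, "scale": scale})

body = build_lora_request("/workspace/loras/my_lora.safetensors", scale=0.8)
# A client would then POST this body, e.g.:
#   POST http://localhost:8088/lora
#   Content-Type: application/json
print(body)
```

The same payload is trivial to produce from C# with `HttpClient` and `System.Text.Json`, which is the appeal of driving this over HTTP rather than touching the Python pipeline directly.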