city96/ComfyUI-GGUF

tools how to convert safetensors to fp8

xueqing0622 opened this issue · 2 comments

This cmd converts to F16.gguf; I want to convert directly to F8.gguf:
python convert.py --src E:\models\unet\flux1-dev.safetensors

As far as I know, current numpy does not support FP8 as a datatype, and gguf does not define it as a separate datatype either (as it does with bf16, for example). That is why the output of the intermediary file is in FP16 or BF16.
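A quick way to confirm the numpy part of this: core numpy ships no FP8 type, so neither an `np.float8` alias nor a `"float8"` dtype string exists (this is just a sanity check, not part of the converter):

```python
import numpy as np

# Sketch: probe whether this numpy build exposes an FP8 dtype.
# Current numpy only defines float16/float32/float64 (plus longdouble),
# so both checks below come up empty.
has_fp8_attr = hasattr(np, "float8")  # no np.float8 alias exists

try:
    np.dtype("float8")  # unknown dtype string raises TypeError
    has_fp8_dtype = True
except TypeError:
    has_fp8_dtype = False

print(has_fp8_attr, has_fp8_dtype)  # both False on current numpy
```

FP8 formats do exist in third-party packages (e.g. `ml_dtypes`), but that doesn't help here since the gguf file format itself has no FP8 tensor type defined.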


Thanks for the answer; the gguf plugin and tools you made are really useful!