stochasticai/x-stable-diffusion
Real-time inference for Stable Diffusion - 0.88s latency. Covers AITemplate, nvFuser, TensorRT, FlashAttention. Join our Discord community: https://discord.com/invite/TgHXuSJEk6
Jupyter Notebook · Apache-2.0
Issues
Frameworks supporting Mac
#54 opened by taylorgoolsby - 0
A bug in /Tensorrt/demo.py
#52 opened by kx-kexi - 1
Batch and uncapped token support for TensorRT
#37 opened by nub2927 - 0
Is onnxruntime slower or faster?
#39 opened by lucasjinreal - 0
TensorRT converter
#38 opened by Dungtt94 - 6
ValueError: only one element tensors can be converted to Python scalars (error at line 88 of demo.py)
#24 opened by AIManifest - 11
TensorRT conversion fails
#19 opened by harishprabhala - 1
Can the stable diffusion model we use integrate a CKPT model and deploy it to WebUI? If so, how?
#36 opened by ClementCJ - 1
UNet ONNX file issue?
#35 opened by TalhaUsuf - 1
PyTorch Baseline perhaps too weak?
#31 opened by xinli-git - 3
Currently we only support A100 and T4 GPUs; we may support other GPUs in the future
#29 opened by appleatiger - 2
Is this supporting Stable Diffusion V2?
#23 opened by Geeksongs - 3
Add deepspeed, xformers, kernl, transformerengine, ColossalAI, tritonserver, VoltaML, etc
#21 opened by 0xdevalias - 1
Can't run colab notebook
#15 opened by Andreilys - 1
Cannot reproduce the TensorRT result
#14 opened by lileilai - 3
TensorRT dynamic height & width
#13 opened by col-in-coding - 1