ceruleandeep/ComfyUI-LLaVA-Captioner
A ComfyUI extension for chatting with your images using LLaVA. Runs locally: no external services, no filter.
Python · GPL-3.0
Issues
- failing to install llama (#4, opened by alenknight)
- [Bug] LLaVA Captioner appears to leak VRAM (#11, opened by curiousjp)
- Feature request: Output list of strings (#13, opened by Battleshack)
- Llava Next (#12, opened by jjohare)
- No module named 'llama_cpp' (#1, opened by theonetwoone)
- Feature Request (#10, opened by filliptm)
- error, failed to create llama_context (#7, opened by altruios)
- Error when using load img list (#6, opened by suede299)
- model reading question (#2, opened by soldivelot)