crewAIInc/crewAI-examples

Local LLMs without Ollama - using vLLM or Hugging Face

rajeshkochi444 opened this issue · 1 comment

Hi,

Can we use local LLMs through vLLM or Hugging Face without using Ollama?

Thanks
Rajesh
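
For the vLLM side, one possible approach is a minimal sketch along these lines: vLLM exposes an OpenAI-compatible API server, so a LangChain `ChatOpenAI` client can be pointed at it and passed to a CrewAI agent's `llm` parameter. The model name, port, and agent fields below are placeholders, not anything prescribed by CrewAI:

```python
# Assumption: a vLLM OpenAI-compatible server is already running, e.g.:
#   python -m vllm.entrypoints.openai.api_server --model mistralai/Mistral-7B-Instruct-v0.2

from crewai import Agent
from langchain_openai import ChatOpenAI

# Point the OpenAI client at the local vLLM server instead of api.openai.com.
vllm_llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's default serving address (assumed)
    api_key="not-needed",                 # vLLM does not validate the key by default
    model="mistralai/Mistral-7B-Instruct-v0.2",  # placeholder model id
)

researcher = Agent(
    role="Researcher",
    goal="Answer questions using a locally served model",
    backstory="Runs entirely against a local vLLM endpoint, no Ollama involved.",
    llm=vllm_llm,  # CrewAI accepts a LangChain chat model here
)
```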

I'm also interested in vLLM connections.
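
For the Hugging Face half of the original question, a similar sketch is possible without any server at all, by loading the model in-process through LangChain's `HuggingFacePipeline` wrapper around the transformers `pipeline`. Again, the model id and agent fields are placeholder assumptions:

```python
from crewai import Agent
from langchain_community.llms import HuggingFacePipeline

# Load a local Hugging Face model via transformers' pipeline; no external server needed.
hf_llm = HuggingFacePipeline.from_model_id(
    model_id="mistralai/Mistral-7B-Instruct-v0.2",  # placeholder model id
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 256},  # keep generations bounded
)

writer = Agent(
    role="Writer",
    goal="Draft summaries with a locally loaded model",
    backstory="Uses a Hugging Face pipeline instead of a hosted API.",
    llm=hf_llm,
)
```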