Enable tool calling in the OpenAI-compatible server for all other models
hpx502766238 opened this issue · 0 comments
hpx502766238 commented
In vLLM 0.6.0+, automatic function calling is supported: https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html#automatic-function-calling. It appears to support function calling for different models through a set of unified tool parsers.
These articles may be helpful: https://huggingface.co/blog/unified-tool-use
https://qwen.readthedocs.io/en/latest/framework/function_call.html
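For reference, this is roughly what the unified interface looks like from the client side once the server handles tool parsing (a minimal sketch using the `openai` Python client; the port, model name, and weather tool are placeholders, and the server has to be launched with its tool-calling options enabled, e.g. vLLM's `--enable-auto-tool-choice` plus a matching `--tool-call-parser` per the docs linked above):

```python
# Minimal sketch of an OpenAI-style tool call against a local
# OpenAI-compatible server (endpoint and model name are placeholders).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",  # placeholder model name
    messages=[{"role": "user", "content": "What's the weather in Beijing?"}],
    tools=tools,
    tool_choice="auto",  # let the server's tool parser decide whether to call a tool
)

# If the model decided to call a tool, the parsed call shows up here
# instead of raw model text that the client would have to parse itself.
print(response.choices[0].message.tool_calls)
```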
I hope llama-cpp-python will support automatic function calling for Qwen2.5 in the future.
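In other words, the hope is that the same client code would work unchanged against llama-cpp-python's OpenAI-compatible server once it can parse Qwen2.5 tool calls. A hypothetical end-to-end flow is sketched below; the port, model alias, and weather tool are all made up for illustration, and the round trip of the tool result follows the standard OpenAI tool-calling pattern, not any existing llama-cpp-python feature:

```python
# Hypothetical flow against a llama-cpp-python OpenAI-compatible server,
# assuming it gained automatic tool-call parsing for Qwen2.5.
import json
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="EMPTY")  # assumed port
MODEL = "qwen2.5-7b-instruct"  # placeholder alias for whatever GGUF the server loaded

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Beijing?"}]
first = client.chat.completions.create(
    model=MODEL, messages=messages, tools=tools, tool_choice="auto"
)

# The server would turn Qwen2.5's tool-call output into structured tool_calls.
call = first.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)
result = {"city": args["city"], "temperature_c": 25}  # stand-in for a real tool

# Feed the tool result back so the model can produce a final answer.
messages.append({
    "role": "assistant",
    "tool_calls": [{
        "id": call.id,
        "type": "function",
        "function": {"name": call.function.name, "arguments": call.function.arguments},
    }],
})
messages.append({"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)})

final = client.chat.completions.create(model=MODEL, messages=messages, tools=tools)
print(final.choices[0].message.content)
```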
Originally posted by @hpx502766238 in #1690 (comment)