openimsdk/openkf

Feature: Start LLM service with fastchat.


Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've discussed this feature request in the OpenIMSDK Slack and got positive feedback

Is this feature request related to a problem?

❎ No

Problem Description

Enable loading multiple LLM models through the FastChat framework.

Solution Description

Replace the current single-model service with FastChat's serving stack: `fastchat.serve.controller`, `fastchat.serve.model_worker`, and `fastchat.serve.openai_api_server`.

Ref: https://github.com/lm-sys/FastChat/blob/main/docs/langchain_integration.md
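As a rough sketch of the proposed setup (the model path `lmsys/vicuna-7b-v1.5` and the port are illustrative choices, not decisions made in this issue), the three FastChat processes would be launched along these lines:

```shell
# Start the controller, which tracks registered model workers.
python3 -m fastchat.serve.controller

# Start a model worker for one model; additional workers can be
# started with different --model-path values to serve multiple models.
# lmsys/vicuna-7b-v1.5 is only an example model here.
python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5

# Expose everything behind an OpenAI-compatible REST API.
python3 -m fastchat.serve.openai_api_server --host localhost --port 8000
```

Clients could then point any OpenAI-compatible SDK at `http://localhost:8000/v1` and select a model by name, which is what makes the multi-model loading transparent to the rest of OpenKF.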

Benefits

Lets the service load and serve multiple LLM models behind a single OpenAI-compatible API, making the system more flexible.

Potential Drawbacks

No response

Additional Information

No response