Support for Sarvam Chat Model Integration
Closed this issue · 3 comments
Checked other resources
- This is a feature request, not a bug report or usage question.
- I added a clear and descriptive title that summarizes the feature request.
- I used the GitHub search to find a similar feature request and didn't find it.
- I checked the LangChain documentation and API reference to see if this feature already exists.
- This is not related to the langchain-community package.
Feature Description
It would be great if LangChain could add support for Sarvam AI chat models. Sarvam provides LLMs optimized for understanding Indian languages and for efficiency, which could be valuable for a wide range of LangChain applications.
Use Case
Sarvam offers strong multilingual capabilities (especially for Indic and low-resource languages).
Having it as a first-class ChatModel in LangChain would enable developers to use it seamlessly within chains, agents, and RAG pipelines.
This would expand the ecosystem’s coverage of open/accessible models.
Proposed Solution
Add a new integration in langchain/chat_models similar to existing providers.
Implement a SarvamChat class that follows the BaseChatModel interface.
Support configuration via an API key and model parameters (temperature, max_tokens, etc.).
If this feature would be welcome, I'd be happy to work on the implementation and open a PR.
Alternatives Considered
Right now, developers need to write custom wrappers outside LangChain. An official integration would standardize usage and improve the developer experience.
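For reference, the status quo described above, a hand-rolled client outside LangChain, looks roughly like this. It uses only the standard library; the endpoint URL and model name are assumptions:

```python
# Minimal hand-rolled client sketch; endpoint and model name are assumptions.
import json
import urllib.request
from typing import Any, Dict, List


def build_payload(
    messages: List[Dict[str, str]],
    model: str = "sarvam-m",        # assumed model identifier
    temperature: float = 0.7,
) -> Dict[str, Any]:
    """Assemble an OpenAI-style chat request body."""
    return {"model": model, "temperature": temperature, "messages": messages}


def sarvam_chat(api_key: str, messages: List[Dict[str, str]], **kwargs: Any) -> str:
    """Send one chat request and return the assistant's reply text."""
    req = urllib.request.Request(
        "https://api.sarvam.ai/v1/chat/completions",  # assumed endpoint
        data=json.dumps(build_payload(messages, **kwargs)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Every project repeating this boilerplate (and its error handling, retries, and streaming) is exactly what a first-class integration would eliminate.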
Additional Context
No response
+1
I will be working on it!
Langchain-sarvam
PyPI: https://pypi.org/project/langchain-sarvam/
GitHub: https://github.com/parth1609/langchain_sarvam
Check it out.