langchain-ai/langchain

Support configuring `init_chat_model` via `context_schema`


Checked other resources

  • This is a feature request, not a bug report or usage question.
  • I added a clear and descriptive title that summarizes the feature request.
  • I used the GitHub search to find a similar feature request and didn't find it.
  • I checked the LangChain documentation and API reference to see if this feature already exists.
  • This is not related to the langchain-community package.

Feature Description

Description:
Since version 0.6, the LangGraph documentation has recommended avoiding `RunnableConfig` with `configurable` and instead using `context_schema` for static runtime context.

However, when a chat model is initialized with `init_chat_model`, its setup still relies on `configurable` instead of `context_schema`. Conceptually, the model (and its parameters) also belongs to the static runtime context.

This creates an inconsistency in practice:

  • Some settings (e.g. the model and its parameters) must be configured via `configurable`.
  • Other settings (e.g. prompt definition and versioning) must be passed via `context_schema`.

This separation feels unintuitive and makes project configuration less straightforward, as the sketch below illustrates.
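To make the split concrete, here is a minimal sketch of the current setup, assuming LangGraph >= 0.6 (`PromptContext` and the node name are my own illustrative names, not anything prescribed by the libraries):

```python
from dataclasses import dataclass

from langchain.chat_models import init_chat_model
from langgraph.graph import StateGraph, MessagesState, START
from langgraph.runtime import Runtime


@dataclass
class PromptContext:
    # Prompt definition and versioning live in the static runtime context...
    system_prompt: str = "You are a helpful assistant."
    prompt_version: str = "v1"


# ...but the model and its parameters must be declared as configurable fields.
model = init_chat_model(configurable_fields=("model", "temperature"))


def call_model(state: MessagesState, runtime: Runtime[PromptContext]):
    messages = [
        {"role": "system", "content": runtime.context.system_prompt},
        *state["messages"],
    ]
    return {"messages": [model.invoke(messages)]}


builder = StateGraph(MessagesState, context_schema=PromptContext)
builder.add_node("call_model", call_model)
builder.add_edge(START, "call_model")
graph = builder.compile()

# One invocation, two configuration channels:
graph.invoke(
    {"messages": [{"role": "user", "content": "hi"}]},
    config={"configurable": {"model": "gpt-4o-mini", "temperature": 0}},
    context=PromptContext(system_prompt="You are a pirate.", prompt_version="v2"),
)
```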

Feature Request:
Add support for configuring `init_chat_model` through `context_schema` so that all static runtime context can be managed consistently in one place.

Use Case

When building a project with multiple runtime configurations, I currently need to split configuration between `configurable` (for the model and its parameters) and `context_schema` (for prompt definitions and versioning). This split is inconsistent and makes configuration management harder to maintain.
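The closest consolidation I can get today, as far as I can tell, is to put the model spec in the context schema and call `init_chat_model` inside the node. A hedged sketch (`AppContext` and its fields are illustrative assumptions):

```python
from dataclasses import dataclass

from langchain.chat_models import init_chat_model
from langgraph.graph import StateGraph, MessagesState, START
from langgraph.runtime import Runtime


@dataclass
class AppContext:
    # Model spec and prompt settings now share one schema.
    model: str = "openai:gpt-4o-mini"
    system_prompt: str = "You are a helpful assistant."


def call_model(state: MessagesState, runtime: Runtime[AppContext]):
    # Re-initialize the model from the context on every call; this keeps the
    # configuration in one place but pays an init cost per invocation.
    model = init_chat_model(runtime.context.model)
    messages = [
        {"role": "system", "content": runtime.context.system_prompt},
        *state["messages"],
    ]
    return {"messages": [model.invoke(messages)]}


builder = StateGraph(MessagesState, context_schema=AppContext)
builder.add_node("call_model", call_model)
builder.add_edge(START, "call_model")
graph = builder.compile()

# Everything is configured through a single channel:
graph.invoke(
    {"messages": [{"role": "user", "content": "hi"}]},
    context=AppContext(model="anthropic:claude-3-5-sonnet-latest"),
)
```

This works, but it trades the inconsistency for per-call initialization overhead, which is why first-class `context_schema` support in `init_chat_model` would be preferable.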

Proposed Solution

No response

Alternatives Considered

No response

Additional Context

No response

What's the use case here? What do you want to pass in `configurable`, and where do you want to access it?