langchain-ai/langchainjs

Qianfan chat models not handling abort signal

stanoswald opened this issue · 2 comments

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

const result = await runnable.invoke(
  // ...
  { signal: this.abortController.signal, callbacks },
);
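
For reference, a fuller sketch of the setup (the model name, prompt, and timeout below are placeholders rather than my actual code; Qianfan credentials are assumed to come from the environment):

import { ChatBaiduQianfan } from "@langchain/baidu-qianfan";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatBaiduQianfan({ model: "ERNIE-Lite-8K" });
const abortController = new AbortController();

// Abort shortly after the request starts; I would expect the stream to stop here,
// but chunks keep arriving until the model finishes.
setTimeout(() => abortController.abort(), 1000);

const stream = await model.stream([new HumanMessage("Write a long story.")], {
  signal: abortController.signal,
});

for await (const chunk of stream) {
  console.log(chunk.content);
}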

Error Message and Stack Trace (if applicable)

No response

Description

When calling Qianfan chat models with streaming output, the AbortController signal bound beforehand doesn't take effect: the message keeps streaming until completion instead of stopping.

After reviewing the code, I think this might be because the _streamResponseChunks function doesn't handle the abort signal from the options parameter, so the externally bound controller's signal has no effect; a rough sketch of what I mean follows.
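
The client call and chunk shape below are assumptions about the integration's internals, not the actual code; the only point is that options.signal is consulted inside the streaming loop.

import { AIMessageChunk, type BaseMessage } from "@langchain/core/messages";
import { ChatGenerationChunk } from "@langchain/core/outputs";
import { type CallbackManagerForLLMRun } from "@langchain/core/callbacks/manager";

// Sketch of a method inside the chat model class:
async *_streamResponseChunks(
  messages: BaseMessage[],
  options: this["ParsedCallOptions"],
  runManager?: CallbackManagerForLLMRun
): AsyncGenerator<ChatGenerationChunk> {
  // Hypothetical client call; the real integration builds the request differently.
  const stream = await this.client.chat({ messages, stream: true });
  for await (const chunk of stream) {
    // Stop consuming the upstream stream as soon as the caller's signal fires.
    if (options.signal?.aborted) {
      throw new Error("AbortError");
    }
    const text = chunk.result ?? "";
    yield new ChatGenerationChunk({
      text,
      message: new AIMessageChunk({ content: text }),
    });
    await runManager?.handleLLMNewToken(text);
  }
}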

System Info

@langchain/baidu-qianfan@0.1.0 | MIT | deps: 4 | versions: 5
Node v22.11.0
pnpm 9.9.0

Hey there, @stanoswald! 👋 I'm here to help you with bugs, questions, and becoming a contributor. Let's squash those bugs together!

I found a similar open issue related to abort signals in the LangChain JS codebase. The issue discusses how the AgentExecutor class does not currently support the use of an AbortController signal in its invoke method. However, it mentions that the AgentExecutor class has a mechanism to handle abort signals through the earlyStoppingMethod property and the _stop method within the AgentExecutorIterator class [1].

Regarding the Qianfan chat models, the _streamResponseChunks function does handle abort signals by passing the abortSignal from the options to the client.send method. This should allow the request to be aborted if needed [2]. If the abort signal is not working as expected, it might be worth checking if the signal is correctly passed and if there are any specific conditions in your environment that could affect its behavior.
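
For example, a quick check like the following (a sketch; model and messages stand in for your actual setup) can confirm whether the abort fires on the caller side while chunks keep arriving:

const controller = new AbortController();
controller.signal.addEventListener("abort", () => {
  console.log("abort fired at", Date.now());
});
setTimeout(() => controller.abort(), 500);

// `model` and `messages` are placeholders for your chat model and input.
const stream = await model.stream(messages, { signal: controller.signal });
for await (const chunk of stream) {
  // If chunks keep arriving well after "abort fired", the signal is reaching
  // the runnable but is not being honored inside the streaming path.
  console.log(Date.now(), chunk.content);
}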

To continue talking to Dosu, mention @dosu.

I could submit a PR for this, but I would appreciate it if the Qianfan integration devs could take a look first and provide a fix.
@jacoblee93 @dl102306