langchain-ai/chat-langchain

Getting the output twice; can someone please help? Please check the screenshot of the answer. Thanks to all.

MuhammadNaeem42 opened this issue · 4 comments

Same issue here. Downgrading the langchain version solves the problem for the time being.

The core issue is in ChatWindow.tsx:

```ts
import { RemoteRunnable } from "langchain/runnables/remote";
import { applyPatch } from "@langchain/core/utils/json_patch";
```

Working versions:
- langchain@0.1.26
- @langchain/core@0.1.44
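For anyone else hitting this, pinning those two packages to the exact working versions in the frontend's package.json would look roughly like this (a sketch; the rest of the dependencies block is omitted):

```json
{
  "dependencies": {
    "langchain": "0.1.26",
    "@langchain/core": "0.1.44"
  }
}
```

Using exact versions (no `^` or `~` prefix) keeps the package manager from resolving to a newer release that reintroduces the duplicated output.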

@iiitmahesh That won't compile for me: `import { RemoteRunnable } from "langchain/runnables/remote";` is JavaScript, not something I can use in my conda Python environment. Besides, do you know which Python version to downgrade to? 0.1.26 is the JS one.

Thanks @iiitmahesh! langchain 0.1.16 and langchain-core 0.1.44 (the Python packages), plus applying this patch: https://github.com/langchain-ai/chat-langchain/pull/301/files/87f09408e685b76e5a146184a81a84f96cfcc603, finally worked for me!
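For reference, pinning those Python versions in a requirements file would look like this (a sketch based only on the versions reported above; verify they match your project's other dependencies):

```
langchain==0.1.16
langchain-core==0.1.44
```

Install with `pip install -r requirements.txt` in the conda environment, then apply the linked patch on top.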