Azure_langchain_mistral_ai JSONDecodeError
amraghu1 opened this issue · 2 comments
Operating System
Windows
Version Information
Python 3.8.2 / 3.11.9 (tested with both)
langchain-mistralai : 0.1.1
langchain : 0.1.16
Steps to reproduce
@santiagxf thank you for sharing the sample code.
https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/langchain.ipynb
Trying to run the above example, but I am running into a JSONDecodeError.
Code:
from langchain.chains import LLMChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
)
from langchain.schema import SystemMessage
from langchain_mistralai.chat_models import ChatMistralAI

prompt = ChatPromptTemplate(
    messages=[
        SystemMessage(
            content="You are a chatbot having a conversation with a human. You love making references to french culture on your answers."
        ),
        MessagesPlaceholder(variable_name="chat_history"),
        HumanMessagePromptTemplate.from_template("{human_input}"),
    ],
    input_variables=["human_input", "chat_history"],
)

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

chat_model = ChatMistralAI(
    endpoint="<azureaiendpoint_formistrallarge>",
    mistral_api_key="",
)

chat_llm_chain = LLMChain(
    llm=chat_model,
    prompt=prompt,
    memory=memory,
    verbose=True,
)

print(chat_llm_chain)

chat_llm_chain.predict(human_input="Hi there my friend")
chat_llm_chain.predict(human_input="I'm thinking on a present for my mother. Any advise?")
Expected behavior
Should give me output as described in https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/langchain.ipynb
Actual behavior
JSONDecodeError:
Trying to run this example, I get the error below:
Entering new LLMChain chain...
Prompt after formatting:
System: You are a chatbot having a conversation with a human. You love making references to french culture on your answers.
Human: Hi there my friend
Traceback (most recent call last):
File "azureai_mistral_test.py", line 57, in <module>
chat_llm_chain.predict(human_input="Hi there my friend")
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain\chains\llm.py", line 293, in predict
return self(kwargs, callbacks=callbacks)[self.output_key]
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_core\_api\deprecation.py", line 145, in warning_emitting_wrapper
return wrapped(*args, **kwargs)
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain\chains\base.py", line 378, in __call__
return self.invoke(
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain\chains\base.py", line 163, in invoke
raise e
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain\chains\base.py", line 153, in invoke
self._call(inputs, run_manager=run_manager)
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain\chains\llm.py", line 103, in _call
response = self.generate([inputs], run_manager=run_manager)
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain\chains\llm.py", line 115, in generate
return self.llm.generate_prompt(
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_core\language_models\chat_models.py", line 556, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_core\language_models\chat_models.py", line 417, in generate
raise e
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_core\language_models\chat_models.py", line 407, in generate
self._generate_with_cache(
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_core\language_models\chat_models.py", line 626, in _generate_with_cache
result = self._generate(
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_mistralai\chat_models.py", line 331, in _generate
response = self.completion_with_retry(
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_mistralai\chat_models.py", line 256, in completion_with_retry
rtn = _completion_with_retry(**kwargs)
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\langchain_mistralai\chat_models.py", line 254, in completion_with_retry
return self.client.post(url="/chat/completions", json=kwargs).json()
File "C:\Users\username\AppData\Roaming\Python\Python38\site-packages\httpx\_models.py", line 764, in json
return jsonlib.loads(self.content, **kwargs)
File "C:\Program Files\Python38\lib\json\__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "C:\Program Files\Python38\lib\json\decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\Program Files\Python38\lib\json\decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
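For context (not part of the original report): `json` raises exactly this "Expecting value: line 1 column 1 (char 0)" message when the response body is empty or does not start with JSON at all, for example an HTML error page returned by the gateway, so the traceback suggests the endpoint replied with a non-JSON payload. A minimal sketch of that failure mode, with illustrative bodies:

```python
import json

# An empty body and an HTML error page both reproduce the exact error
# message the traceback above ends with.
for body in ("", "<html>Bad Gateway</html>"):
    try:
        json.loads(body)
    except json.JSONDecodeError as e:
        print(repr(body[:10]), "->", e)
```
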
Additional information
No response
Hi @amraghu1. Thanks for reaching out. Unfortunately, we can't reproduce the issue but we noticed the libraries may be outdated. Please ensure you have:
langchain==0.1.9
langchain-mistralai==0.0.5
Please let us know if upgrading your environment solves the issue.
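A quick way to confirm which versions are actually installed before and after upgrading (a generic sketch using the standard-library `importlib.metadata`, available since Python 3.8; the package names are the ones mentioned in the reply):

```python
from importlib import metadata

# Print the installed version of each package, or a note if it is missing.
for pkg in ("langchain", "langchain-mistralai"):
    try:
        print(f"{pkg}=={metadata.version(pkg)}")
    except metadata.PackageNotFoundError:
        print(f"{pkg} is not installed")
```

Upgrading both packages can then be done with `pip install -U langchain langchain-mistralai`.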
Thank you @santiagxf for your quick response! Upgrading to the latest versions resolved the issue!!