tensorchord/modelz-llm

bug: 500 with langchain sdk

Closed this issue · 2 comments

```
2023-05-23 03:48:17,823 - ERROR - app.py:1047 - [FALCON] Unhandled exception in ASGI app
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/falcon/asgi/app.py", line 451, in __call__
    resp._media_rendered = serialize_sync(resp._media)
  File "falcon/media/json.py", line 179, in falcon.media.json.JSONHandler._serialize_s
  File "/opt/conda/lib/python3.10/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
  File "/opt/conda/lib/python3.10/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/opt/conda/lib/python3.10/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/opt/conda/lib/python3.10/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type ValidationError is not JSON serializable
```
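The root cause is that Falcon's JSON media handler ultimately calls the stdlib `json.dumps`, which cannot serialize arbitrary exception objects assigned to `resp.media`. A minimal stdlib-only sketch of the failure mode (the `ValidationError` class here is a stand-in for illustration, not the real pydantic class):

```python
import json

class ValidationError(Exception):
    """Stand-in for the exception object that ended up in resp.media."""

try:
    json.dumps({"error": ValidationError("bad request")})
except TypeError as exc:
    print(exc)  # Object of type ValidationError is not JSON serializable

# Converting the exception to plain data first serializes fine:
payload = {"error": str(ValidationError("bad request"))}
print(json.dumps(payload))
```

So the fix is to make sure only plain dicts/lists/strings reach `resp.media`, never the exception itself.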

Reproduction (shell setup, then Python):

```shell
export OPENAI_API_KEY="any"
export OPENAI_API_BASE="http://xx"
```

```python
from langchain.llms import OpenAI

llm = OpenAI()

text = "What would be a good company name for a company that makes colorful socks?"
print(llm(text))
```

It seems we also need to define the `ErrorResponse`.

Yeah, I think so.
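One way to sketch that: an `ErrorResponse` that converts to a plain, JSON-serializable dict before being assigned to the response. The field names below loosely follow the OpenAI error format and are assumptions, not modelz-llm's actual schema:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ErrorResponse:
    # Hypothetical shape; field names are assumptions, not the
    # project's actual API.
    message: str
    type: str = "internal_server_error"
    code: int = 500

    def to_dict(self) -> dict:
        # Only plain dicts/strings/ints here, so json.dumps succeeds.
        return {"error": asdict(self)}

# On an unhandled exception, serialize plain data instead of the
# exception object itself:
err = ErrorResponse(message="validation failed for request body")
print(json.dumps(err.to_dict()))
```

In Falcon, this dict would be assigned to `resp.media` inside an error handler registered with `app.add_error_handler(...)`, so the JSON handler only ever sees serializable data.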