bug: AttributeError: 'ChatGLMForConditionalGeneration' object has no attribute 'encoder'
Closed this issue · 4 comments
arugal commented
Logs
Namespace(model='THUDM/chatglm-6b', emb_model='sentence-transformers/all-MiniLM-L6-v2', dry_run=False, device='auto', port=8000, worker=1)
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 8/8 [00:04<00:00, 1.70it/s]
INFO: Started server process [10159]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
2023-06-05 03:40:49,960 - 10159 - ERROR - app.py:1047 - [FALCON] Unhandled exception in ASGI app
Traceback (most recent call last):
File "/opt/conda/envs/envd/lib/python3.9/site-packages/falcon/asgi/app.py", line 406, in __call__
await responder(req, resp, **params)
File "/home/envd/modelz-llm/src/modelz_llm/falcon_service.py", line 284, in on_post
for comp in self.model.step_generate(chat_req):
File "/home/envd/modelz-llm/src/modelz_llm/falcon_service.py", line 125, in step_generate
encoder_output = self.model.encoder(
File "/opt/conda/envs/envd/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1614, in __getattr__
raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'ChatGLMForConditionalGeneration' object has no attribute 'encoder'
INFO: 192.168.2.105:46086 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
Hi community, I got AttributeError: 'ChatGLMForConditionalGeneration' object has no attribute 'encoder' after starting the server with chatglm-6b.
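For context on the traceback above: the root cause appears to be that step_generate in falcon_service.py calls self.model.encoder(...), but ChatGLM is a decoder-only model and so exposes no .encoder submodule (unlike T5-style encoder-decoder models). A minimal, hypothetical sketch of a guard that avoids the AttributeError (DummyChatGLM here is a stand-in, not the real transformers class):

```python
class DummyChatGLM:
    """Stand-in for a decoder-only model: deliberately has no `.encoder`."""

    def generate(self, prompt):
        return f"response to: {prompt}"


def step_generate(model, prompt):
    """Dispatch on model architecture instead of assuming an encoder exists."""
    if hasattr(model, "encoder"):
        # Encoder-decoder path (e.g. T5): run the encoder first.
        encoder_output = model.encoder(prompt)
        return model.generate(encoder_output)
    # Decoder-only path (e.g. ChatGLM): call generate directly on the prompt.
    return model.generate(prompt)


print(step_generate(DummyChatGLM(), "hi"))  # takes the decoder-only path
```

This is only an illustration of the failure mode; the actual fix would live in modelz-llm's model dispatch logic.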
kemingy commented
Which version of llmspec and modelz-llm are you using?
arugal commented
Which version of llmspec and modelz-llm are you using?
llmspec==0.3.8
modelz-llm==23.6.5