xusenlinzy/api-for-open-llm

"POST /v1/files HTTP/1.1" 404 Not Found

KEAI404 opened this issue · 1 comment

The following items must be checked before submission

  • Make sure you are using the latest code from the repository (git pull); some issues have already been addressed and fixed.
  • I have read the FAQ section of the project documentation and searched the existing issues/discussions, and found no similar problem or solution.

Type of problem

Model inference and deployment

Operating system

Windows

Detailed description of the problem

An error is raised when using doc_chat:

upf = client.files.create(file=open(filepath, "rb"), purpose="assistants")

Traceback (most recent call last):
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
  File "E:\streamlit-demo\streamlit_app.py", line 67, in <module>
    main()
  File "E:\streamlit-demo\streamlit_app.py", line 62, in main
    page.show()
  File "E:\streamlit-demo\streamlit_gallery\utils\page.py", line 49, in show
    self._selected()
  File "E:\streamlit-demo\streamlit_gallery\components\doc_chat\streamlit_app.py", line 110, in main
    create_file_index(
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 165, in wrapper
    return cached_func(*args, **kwargs)
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 194, in __call__
    return self._get_or_create_cached_value(args, kwargs)
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 221, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 277, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
  File "E:\streamlit-demo\streamlit_gallery\components\doc_chat\streamlit_app.py", line 48, in create_file_index
    file_id = server.upload(
  File "E:\streamlit-demo\streamlit_gallery\components\doc_chat\utils.py", line 73, in upload
    upf = self.client.files.create(file=open(filepath, "rb"), purpose="assistants")
  File "C:\Users\me.conda\envs\lib\site-packages\openai\resources\files.py", line 113, in create
    return self._post(
  File "C:\Users\me.conda\envs\lib\site-packages\openai\_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\me.conda\envs\lib\site-packages\openai\_base_client.py", line 921, in request
    return self._request(
  File "C:\Users\me.conda\envs\lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'detail': 'Not Found'}
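
For reference, the failing call boils down to the minimal sketch below, assuming the OpenAI client is pointed at a locally deployed api-for-open-llm server (the base_url, api_key, and file path here are placeholders, not the actual values):

from openai import OpenAI

# placeholder values; adjust base_url and api_key to the actual deployment
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# this issues POST /v1/files and fails with 404 when the server has not
# registered the file upload routes
upf = client.files.create(file=open("example.pdf", "rb"), purpose="assistants")
print(upf.id)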

Dependencies

# Please paste the dependencies here

Runtime logs or screenshots

No response

Check whether the environment variable was set to TASKS=llm,rag when you started the model server.
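
A quick way to check (a sketch only, assuming the server is the FastAPI app from this repository listening on http://localhost:8000 with its default OpenAPI docs enabled) is to list the registered routes; if TASKS does not include rag, /v1/files will presumably not be among them:

import requests

# hypothetical diagnostic, not part of the project code
spec = requests.get("http://localhost:8000/openapi.json").json()
paths = spec.get("paths", {})
print("registered paths:", sorted(paths))
print("/v1/files available:", "/v1/files" in paths)

If /v1/files is not listed, restart the server with TASKS=llm,rag set (for example in the .env file) and retry the upload.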