marklysze/LlamaIndex-RAG-WSL-CUDA

When I import llama-index, I get the error 'No module named openai.openai_object'

Closed this issue · 4 comments

Hi @cementwise, looking at LlamaIndex's repo, it seems like this may be a Python 3.12+ issue. If you're not on 3.12 or above, let me know.

LlamaIndex issue:
run-llama/llama_index#9292

This repo uses a conda environment with Python 3.11.

The issue has been rectified in LlamaIndex, so I'll update this repo to Python 3.12. I'll do that shortly and update this issue.

Thanks for logging the issue.

@cementwise, unfortunately PyTorch is not yet compatible with Python 3.12. Track the issue here

They've put it into a PyTorch v2 milestone with a Jan 31 target date. When it's available, I'll try again to update the repository to Python 3.12.

For now, you may need to create a new conda environment with Python 3.11:
conda create -n LlamaIndexGeneric python=3.11

Then follow through with installation of the other pip packages, CUDA, etc.
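In case it helps, a minimal sketch of that setup might look like the following. Note that the package names beyond llama-index are assumptions based on what this repo does, so check the README for the exact list:

```shell
# Create and activate a fresh Python 3.11 environment
conda create -n LlamaIndexGeneric python=3.11 -y
conda activate LlamaIndexGeneric

# Install the pip packages (torch and transformers are assumed here --
# consult the repo README for the actual requirements and CUDA setup)
pip install llama-index torch transformers
```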

I'll close this issue shortly.

Closing now. If PyTorch v2 is released and supports Python 3.12, I'll update the notebooks then.

I'm seeing ModuleNotFoundError: No module named 'openai.openai_object' even with Python 3.11.5, so this issue does not appear to be restricted to Python 3.12.
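For anyone else hitting this, a quick diagnostic sketch to run before importing llama-index: print the interpreter version and the installed versions of the two packages involved (importlib.metadata is stdlib in Python 3.8+; nothing here is specific to this repo). The openai.openai_object module only existed in openai before the 1.0 rewrite, so a version mismatch between openai and llama-index is a likely cause regardless of the Python version:

```python
import sys
from importlib import metadata

# Interpreter version -- the upstream bug was first thought
# to be limited to Python 3.12+
print("Python", sys.version.split()[0])

# Installed versions of the two packages involved in the error
for pkg in ("llama-index", "openai"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```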
