Dependency llama-cpp in PR #77 won't build on M chip Mac
xdotli opened this issue · 3 comments
xdotli commented
What is the issue?
Running
make first-run
on macOS triggers the following command:
poetry install --with setup,community --verbose
which then fails with this error:
RuntimeError: Failed to load shared library '/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib': dlopen(/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (no such file), '/Users/lixiangyi/Library/Caches/pypoetry/virtualenvs/backend-PXYn0t1E-py3.11/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64'))
make[1]: *** [setup] Error 1
make: *** [first-run] Error 2
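The error above means the prebuilt libllama.dylib was compiled for x86_64 while the interpreter needs arm64. A quick way to diagnose this is to check which architecture your Python actually runs as (native arm64 vs. x86_64 under Rosetta); a minimal sketch, where the lipo command is macOS-only and the dylib path is illustrative:

```shell
# Print the architecture the current Python interpreter runs as.
# 'arm64' means native Apple Silicon; 'x86_64' means an Intel build
# (or a build running under Rosetta 2 translation).
python3 -c 'import platform; print(platform.machine())'

# macOS-only: list the architectures baked into a compiled library.
# The path below is illustrative (adapted from the error message above).
# lipo -archs "$HOME/Library/Caches/pypoetry/virtualenvs/<env>/lib/python3.11/site-packages/llama_cpp/libllama.dylib"
```

If the interpreter reports x86_64 on an M-series Mac, it is running under Rosetta, and any extension it builds or downloads will target the wrong architecture.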
The llama-cpp-python dependency was introduced in PR #77. I have two proposals to help ease the setup:
- I have created a devcontainer config so developers can run the project in GitHub Codespaces
- Remove the dependency on llama-cpp altogether.
Additional information
No response
tianjing-li commented
@xdotli I'm on an M1 as well and the poetry setup runs fine for me. Perhaps try deleting the virtualenv that poetry generates and running it again?
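One way to act on this suggestion, sketched below assuming Poetry >= 1.2 and the dependency groups named in the Makefile above: remove the cached virtualenv and reinstall with cleared caches, so llama-cpp-python is rebuilt for the native architecture instead of restored from a stale x86_64 build.

```shell
# Drop every virtualenv Poetry has cached for this project
poetry env remove --all

# Clear cached downloads so a previously fetched x86_64 artifact
# is not reused on the next install
poetry cache clear pypi --all

# Reinstall from scratch
poetry install --with setup,community --verbose
```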
tianjing-li commented
This should be resolved as of now
xdotli commented
> This should be resolved as of now
Oh, thanks Tianjing! I manually removed the llama-cpp deps on Dev Day and it worked. Sorry for the late reply! I just remembered this issue while applying for a Cohere fall internship just now.