The cmake .. Command Failed
NickDeBeenSAE opened this issue · 5 comments
NickDeBeenSAE commented
The above happened because what to build wasn't specified inside the build directory of gpt4all-backend, as per the repository's instructions.
cmake ..
CMake Warning:
  Ignoring extra path from command line:
   ".."

CMake Error: The source directory "/home/kali/gpt4all" does not appear to contain CMakeLists.txt.
Specify --help for usage, or press the help button on the CMake GUI.
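For context, the warning shows cmake being pointed at /home/kali/gpt4all, which has no top-level CMakeLists.txt; the backend's CMakeLists.txt lives in gpt4all-backend. A minimal sketch of the sequence the instructions appear to intend, assuming the build directory is created inside gpt4all-backend (paths follow the log above):

cd /home/kali/gpt4all/gpt4all-backend
mkdir -p build && cd build
cmake ..                    # ".." now resolves to gpt4all-backend, which contains CMakeLists.txt
cmake --build . --parallel  # build after a successful configure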
NickDeBeenSAE commented
Ok, never mind, it worked the second time around via copy and paste.
NickDeBeenSAE commented
I'll keep you updated.
NickDeBeenSAE commented
Ok, here's that update:
poetry install --directory /home/nickdebeen/Downloads/gpt4all
Installing dependencies from lock file
NickDeBeenSAE commented
As you can see, it's not listing anything.
Therefore, Poetry itself has a bug.
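Before concluding it is a Poetry bug, it may help to see what Poetry is actually doing. A sketch of two diagnostic runs of the same command, assuming Poetry's standard verbosity and dry-run options:

poetry install --directory /home/nickdebeen/Downloads/gpt4all -vvv      # maximum verbosity
poetry install --directory /home/nickdebeen/Downloads/gpt4all --dry-run # show what would be installed without doing it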
NickDeBeenSAE commented
cmake ..
-- Interprocedural optimization support detected
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- Configuring ggml implementation target llama-mainline-default in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-mainline
-- x86 detected
-- Configuring ggml implementation target llama-230511-default in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230511
-- x86 detected
-- Configuring ggml implementation target llama-230519-default in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230519
-- x86 detected
-- Configuring model implementation target llamamodel-mainline-default
-- Configuring model implementation target replit-mainline-default
-- Configuring model implementation target llamamodel-230519-default
-- Configuring model implementation target llamamodel-230511-default
-- Configuring model implementation target gptj-default
-- Configuring model implementation target falcon-default
-- Configuring model implementation target mpt-default
-- Configuring model implementation target bert-default
-- Configuring model implementation target starcoder-default
-- Configuring ggml implementation target llama-mainline-avxonly in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-mainline
-- x86 detected
-- Configuring ggml implementation target llama-230511-avxonly in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230511
-- x86 detected
-- Configuring ggml implementation target llama-230519-avxonly in /home/nickdebeen/gpt4all/gpt4all-backend/llama.cpp-230519
-- x86 detected
-- Configuring model implementation target llamamodel-mainline-avxonly
-- Configuring model implementation target replit-mainline-avxonly
-- Configuring model implementation target llamamodel-230519-avxonly
-- Configuring model implementation target llamamodel-230511-avxonly
-- Configuring model implementation target gptj-avxonly
-- Configuring model implementation target falcon-avxonly
-- Configuring model implementation target mpt-avxonly
-- Configuring model implementation target bert-avxonly
-- Configuring model implementation target starcoder-avxonly
-- Configuring done (0.3s)
CMake Error at llama.cpp.cmake:296 (add_library):
  Cannot find source file: llama.cpp-mainline/ggml.c
  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90 .f95 .f03 .hip .ispc
Call Stack (most recent call first): CMakeLists.txt:71 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  Cannot find source file: llama.cpp-mainline/llama.cpp
  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90 .f95 .f03 .hip .ispc
Call Stack (most recent call first): CMakeLists.txt:71 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  Cannot find source file: llama.cpp-230511/ggml.c
  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90 .f95 .f03 .hip .ispc
Call Stack (most recent call first): CMakeLists.txt:74 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  Cannot find source file: llama.cpp-230511/llama.cpp
  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90 .f95 .f03 .hip .ispc
Call Stack (most recent call first): CMakeLists.txt:74 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  Cannot find source file: llama.cpp-230519/ggml.c
  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90 .f95 .f03 .hip .ispc
Call Stack (most recent call first): CMakeLists.txt:75 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  Cannot find source file: llama.cpp-230519/llama.cpp
  Tried extensions .c .C .c++ .cc .cpp .cxx .cu .mpp .m .M .mm .ixx .cppm .ccm .cxxm .c++m .h .hh .h++ .hm .hpp .hxx .in .txx .f .F .for .f77 .f90 .f95 .f03 .hip .ispc
Call Stack (most recent call first): CMakeLists.txt:75 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-mainline-default
Call Stack (most recent call first): CMakeLists.txt:71 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-mainline-default
Call Stack (most recent call first): CMakeLists.txt:71 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230511-default
Call Stack (most recent call first): CMakeLists.txt:74 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230511-default
Call Stack (most recent call first): CMakeLists.txt:74 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230519-default
Call Stack (most recent call first): CMakeLists.txt:75 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230519-default
Call Stack (most recent call first): CMakeLists.txt:75 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-mainline-avxonly
Call Stack (most recent call first): CMakeLists.txt:71 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-mainline-avxonly
Call Stack (most recent call first): CMakeLists.txt:71 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230511-avxonly
Call Stack (most recent call first): CMakeLists.txt:74 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230511-avxonly
Call Stack (most recent call first): CMakeLists.txt:74 (include_ggml)

CMake Error at llama.cpp.cmake:296 (add_library):
  No SOURCES given to target: ggml-230519-avxonly
Call Stack (most recent call first): CMakeLists.txt:75 (include_ggml)

CMake Error at llama.cpp.cmake:325 (add_library):
  No SOURCES given to target: llama-230519-avxonly
Call Stack (most recent call first): CMakeLists.txt:75 (include_ggml)

CMake Generate step failed. Build files cannot be regenerated correctly.
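The "Cannot find source file: llama.cpp-mainline/ggml.c" errors usually mean the llama.cpp submodules under gpt4all-backend were never checked out, so the llama.cpp-* directories are empty and the ggml/llama targets end up with no sources. A sketch of the usual fix, assuming a standard git submodule checkout of the repo shown in the log:

cd /home/nickdebeen/gpt4all
git submodule update --init --recursive   # populate gpt4all-backend/llama.cpp-* checkouts
cd gpt4all-backend
rm -rf build && mkdir build && cd build   # reconfigure from a clean build directory
cmake ..
cmake --build . --parallel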