Local install failure
plageon opened this issue · 3 comments
System Info
Full log:
Installing text-embeddings-router v1.5.0 (/data_train/search/InternData/jiejuntan/python/text-embeddings-inference/router)
Updating git repository https://github.com/OlivierDehaene/candle
Updating git repository https://github.com/coreylowman/cudarc
Updating `ustc` index
Updating git repository https://github.com/huggingface/candle-cublaslt
Updating git repository https://github.com/huggingface/candle-layer-norm
Updating git repository https://github.com/huggingface/candle-rotary
Compiling aws-lc-sys v0.20.1
Compiling candle-kernels v0.5.0 (https://github.com/OlivierDehaene/candle?rev=7e02ad856104799b73a946ac1e153f0de77feaaf#7e02ad85)
Compiling candle-flash-attn v0.5.0 (https://github.com/OlivierDehaene/candle?rev=7e02ad856104799b73a946ac1e153f0de77feaaf#7e02ad85)
Compiling candle-rotary v0.0.1 (https://github.com/huggingface/candle-rotary?rev=0a718a0856569a92f3112e64f10d07e4447822e8#0a718a08)
Compiling candle-layer-norm v0.0.1 (https://github.com/huggingface/candle-layer-norm?rev=94c2add7d94c2d63aebde77f7534614e04dbaea1#94c2add7)
error: failed to run custom build command for candle-kernels v0.5.0 (https://github.com/OlivierDehaene/candle?rev=7e02ad856104799b73a946ac1e153f0de77feaaf#7e02ad85)
note: To improve backtraces for build dependencies, set the CARGO_PROFILE_RELEASE_BUILD_OVERRIDE_DEBUG=true environment variable to enable debug information generation.
Caused by:
process didn't exit successfully: /data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target/release/build/candle-kernels-96b054778005e69c/build-script-build
(exit status: 101)
--- stdout
cargo:rerun-if-changed=build.rs
cargo:rerun-if-changed=src/compatibility.cuh
cargo:rerun-if-changed=src/cuda_utils.cuh
cargo:rerun-if-changed=src/binary_op_macros.cuh
cargo:info=["/usr", "/usr/local/cuda", "/opt/cuda", "/usr/lib/cuda", "C:/Program Files/NVIDIA GPU Computing Toolkit", "C:/CUDA"]
cargo:rerun-if-env-changed=CUDA_COMPUTE_CAP
--- stderr
thread 'main' panicked at /data_train/search/InternData/jiejuntan/.cargo/registry/src/mirrors.ustc.edu.cn-61ef6e0cd06fb9b8/bindgen_cuda-0.1.5/src/lib.rs:492:9:
assertion `left == right` failed
  left: "Field \"compute_cap\" is not a valid field to query."
 right: "compute_cap"
stack backtrace:
0: 0x56275171c0cc -
1: 0x5627517437c0 -
2: 0x562751718cff -
3: 0x56275171beb4 -
4: 0x56275171d6e7 -
5: 0x56275171d44f -
6: 0x56275171db68 -
7: 0x56275171da4e -
8: 0x56275171c596 -
9: 0x56275171d7b2 -
10: 0x562751684375 -
11: 0x56275168469b -
12: 0x562751697b5a -
13: 0x56275168f0e8 -
14: 0x56275168b9f0 -
15: 0x56275168607e -
16: 0x562751685903 -
17: 0x562751685166 -
18: 0x562751685519 -
19: 0x562751713927 -
20: 0x5627516854f7 -
21: 0x562751686285 -
22: 0x7f4e48d9a083 - __libc_start_main
at /build/glibc-SzIz7B/glibc-2.31/csu/../csu/libc-start.c:308:16
23: 0x562751684ade -
24: 0x0 -
warning: build failed, waiting for other jobs to finish...
error: failed to run custom build command for candle-flash-attn v0.5.0 (https://github.com/OlivierDehaene/candle?rev=7e02ad856104799b73a946ac1e153f0de77feaaf#7e02ad85)
note: To improve backtraces for build dependencies, set the CARGO_PROFILE_RELEASE_BUILD_OVERRIDE_DEBUG=true environment variable to enable debug information generation.
Caused by:
process didn't exit successfully: /data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target/release/build/candle-flash-attn-8a95a3aec6b03d81/build-script-build
(exit status: 101)
--- stdout
cargo:rerun-if-changed=build.rs
cargo:rerun-if-changed=kernels/flash_api.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim128_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim160_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim192_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim224_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim256_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim32_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim64_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim96_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim128_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim160_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim192_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim224_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim256_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim32_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim64_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_hdim96_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim32_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim32_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim64_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim64_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim96_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim96_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim128_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim128_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim160_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim160_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim192_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim192_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim224_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim224_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim256_bf16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_split_hdim256_fp16_sm80.cu
cargo:rerun-if-changed=kernels/flash_fwd_kernel.h
cargo:rerun-if-changed=kernels/flash_fwd_launch_template.h
cargo:rerun-if-changed=kernels/flash.h
cargo:rerun-if-changed=kernels/philox.cuh
cargo:rerun-if-changed=kernels/softmax.h
cargo:rerun-if-changed=kernels/utils.h
cargo:rerun-if-changed=kernels/kernel_traits.h
cargo:rerun-if-changed=kernels/block_info.h
cargo:rerun-if-changed=kernels/static_switch.h
cargo:info=["/usr", "/usr/local/cuda", "/opt/cuda", "/usr/lib/cuda", "C:/Program Files/NVIDIA GPU Computing Toolkit", "C:/CUDA"]
cargo:rerun-if-env-changed=CUDA_COMPUTE_CAP
--- stderr
thread 'main' panicked at /data_train/search/InternData/jiejuntan/.cargo/registry/src/mirrors.ustc.edu.cn-61ef6e0cd06fb9b8/bindgen_cuda-0.1.5/src/lib.rs:492:9:
assertion `left == right` failed
  left: "Field \"compute_cap\" is not a valid field to query."
 right: "compute_cap"
stack backtrace:
0: 0x55e2addf9c5c -
1: 0x55e2ade205c0 -
2: 0x55e2addf6f3f -
3: 0x55e2addf9a44 -
4: 0x55e2addfb277 -
5: 0x55e2addfafdf -
6: 0x55e2addfb6f8 -
7: 0x55e2addfb5de -
8: 0x55e2addfa126 -
9: 0x55e2addfb342 -
10: 0x55e2add5e785 -
11: 0x55e2add5ea6b -
12: 0x55e2add775fa -
13: 0x55e2add74678 -
14: 0x55e2add73600 -
15: 0x55e2add65578 -
16: 0x55e2add6b763 -
17: 0x55e2add6e3b6 -
18: 0x55e2add62069 -
19: 0x55e2addf0e27 -
20: 0x55e2add62047 -
21: 0x55e2add65b25 -
22: 0x7f9a56191083 - __libc_start_main
at /build/glibc-SzIz7B/glibc-2.31/csu/../csu/libc-start.c:308:16
23: 0x55e2add5eeee -
24: 0x0 -
error: failed to run custom build command for candle-rotary v0.0.1 (https://github.com/huggingface/candle-rotary?rev=0a718a0856569a92f3112e64f10d07e4447822e8#0a718a08)
note: To improve backtraces for build dependencies, set the CARGO_PROFILE_RELEASE_BUILD_OVERRIDE_DEBUG=true environment variable to enable debug information generation.
Caused by:
process didn't exit successfully: /data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target/release/build/candle-rotary-497455705eb33e7d/build-script-build
(exit status: 101)
--- stdout
cargo:rerun-if-changed=build.rs
cargo:rerun-if-changed=kernels/rotary.cu
cargo:info=["/usr", "/usr/local/cuda", "/opt/cuda", "/usr/lib/cuda", "C:/Program Files/NVIDIA GPU Computing Toolkit", "C:/CUDA"]
cargo:rerun-if-env-changed=CUDA_COMPUTE_CAP
--- stderr
thread 'main' panicked at /data_train/search/InternData/jiejuntan/.cargo/registry/src/mirrors.ustc.edu.cn-61ef6e0cd06fb9b8/bindgen_cuda-0.1.5/src/lib.rs:492:9:
assertion `left == right` failed
  left: "Field \"compute_cap\" is not a valid field to query."
 right: "compute_cap"
stack backtrace:
0: 0x5560d500daec -
1: 0x5560d5034450 -
2: 0x5560d500adcf -
3: 0x5560d500d8d4 -
4: 0x5560d500f107 -
5: 0x5560d500ee6f -
6: 0x5560d500f588 -
7: 0x5560d500f46e -
8: 0x5560d500dfb6 -
9: 0x5560d500f1d2 -
10: 0x5560d4f72785 -
11: 0x5560d4f72a6b -
12: 0x5560d4f8b48a -
13: 0x5560d4f88508 -
14: 0x5560d4f87490 -
15: 0x5560d4f84259 -
16: 0x5560d4f760e3 -
17: 0x5560d4f80d86 -
18: 0x5560d4f7db89 -
19: 0x5560d5004cb7 -
20: 0x5560d4f7db67 -
21: 0x5560d4f84845 -
22: 0x7ff4393d6083 - __libc_start_main
at /build/glibc-SzIz7B/glibc-2.31/csu/../csu/libc-start.c:308:16
23: 0x5560d4f72eee -
24: 0x0 -
error: failed to run custom build command for candle-layer-norm v0.0.1 (https://github.com/huggingface/candle-layer-norm?rev=94c2add7d94c2d63aebde77f7534614e04dbaea1#94c2add7)
note: To improve backtraces for build dependencies, set the CARGO_PROFILE_RELEASE_BUILD_OVERRIDE_DEBUG=true environment variable to enable debug information generation.
Caused by:
process didn't exit successfully: /data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target/release/build/candle-layer-norm-bc6dff3c6ea9a5f3/build-script-build
(exit status: 101)
--- stdout
cargo:rerun-if-changed=build.rs
cargo:rerun-if-changed=kernels/ln_api.cu
cargo:rerun-if-changed=kernels/**.cu
cargo:rerun-if-changed=kernels/ln_fwd_kernels.cuh
cargo:rerun-if-changed=kernels/ln_kernel_traits.h
cargo:rerun-if-changed=kernels/ln_utils.cuh
cargo:rerun-if-changed=kernels/static_switch.h
cargo:rustc-env=CUDA_INCLUDE_DIR=/usr/local/cuda/include
cargo:rerun-if-env-changed=CANDLE_NVCC_CCBIN
cargo:rerun-if-env-changed=CUDA_COMPUTE_CAP
--- stderr
thread 'main' panicked at /data_train/search/InternData/jiejuntan/.cargo/git/checkouts/candle-layer-norm-b72671467f083485/94c2add/build.rs:205:9:
assertion `left == right` failed
  left: "Field \"compute_cap\" is not a valid field to query."
 right: "compute_cap"
stack backtrace:
0: 0x5569cfecc3ac -
1: 0x5569cfef2d40 -
2: 0x5569cfec971f -
3: 0x5569cfecc194 -
4: 0x5569cfecd9c7 -
5: 0x5569cfecd72f -
6: 0x5569cfecde48 -
7: 0x5569cfecdd2e -
8: 0x5569cfecc876 -
9: 0x5569cfecda92 -
10: 0x5569cfe45795 -
11: 0x5569cfe45a7b -
12: 0x5569cfe5790a -
13: 0x5569cfe5c8be -
14: 0x5569cfe59931 -
15: 0x5569cfe50513 -
16: 0x5569cfe54506 -
17: 0x5569cfe4ca29 -
18: 0x5569cfec35d7 -
19: 0x5569cfe4ca07 -
20: 0x5569cfe5d5c5 -
21: 0x7f73345b0083 - __libc_start_main
at /build/glibc-SzIz7B/glibc-2.31/csu/../csu/libc-start.c:308:16
22: 0x5569cfe45efe -
23: 0x0 -
error: failed to run custom build command for aws-lc-sys v0.20.1
note: To improve backtraces for build dependencies, set the CARGO_PROFILE_RELEASE_BUILD_OVERRIDE_DEBUG=true environment variable to enable debug information generation.
Caused by:
process didn't exit successfully: /data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target/release/build/aws-lc-sys-ff04557cafce9ac3/build-script-main
(exit status: 101)
--- stdout
cargo:rerun-if-env-changed=AWS_LC_SYS_NO_PREFIX
cargo:rerun-if-env-changed=AWS_LC_SYS_INTERNAL_BINDGEN
cargo:rerun-if-env-changed=AWS_LC_SYS_EXTERNAL_BINDGEN
cargo:rerun-if-env-changed=AWS_LC_SYS_NO_ASM
cargo:rustc-cfg=x86_64_unknown_linux_gnu
cargo:rerun-if-env-changed=AWS_LC_SYS_CMAKE_BUILDER
cargo:rerun-if-env-changed=AWS_LC_SYS_STATIC
default_for Target: 'x86_64-unknown-linux-gnu'
cargo:rerun-if-env-changed=CARGO_FEATURE_SSL
default_for Target: 'x86_64-unknown-linux-gnu'
cargo:rerun-if-env-changed=CARGO_FEATURE_SSL
cargo:root=/data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target/release/build/aws-lc-sys-9935b047aa3bc2af/out
default_for Target: 'x86_64-unknown-linux-gnu'
OPT_LEVEL = Some(3)
TARGET = Some(x86_64-unknown-linux-gnu)
OUT_DIR = Some(/data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target/release/build/aws-lc-sys-9935b047aa3bc2af/out)
HOST = Some(x86_64-unknown-linux-gnu)
cargo:rerun-if-env-changed=CC_x86_64-unknown-linux-gnu
CC_x86_64-unknown-linux-gnu = None
cargo:rerun-if-env-changed=CC_x86_64_unknown_linux_gnu
CC_x86_64_unknown_linux_gnu = None
cargo:rerun-if-env-changed=HOST_CC
HOST_CC = None
cargo:rerun-if-env-changed=CC
CC = None
cargo:rerun-if-env-changed=CC_ENABLE_DEBUG_OUTPUT
RUSTC_WRAPPER = None
cargo:rerun-if-env-changed=CRATE_CC_NO_DEFAULTS
CRATE_CC_NO_DEFAULTS = None
DEBUG = Some(false)
CARGO_CFG_TARGET_FEATURE = Some(adx,aes,avx,avx2,bmi1,bmi2,cmpxchg16b,f16c,fma,fxsr,lzcnt,movbe,pclmulqdq,popcnt,rdrand,rdseed,sha,sse,sse2,sse3,sse4.1,sse4.2,ssse3,xsave,xsavec,xsaveopt,xsaves)
cargo:rerun-if-env-changed=CFLAGS_x86_64-unknown-linux-gnu
CFLAGS_x86_64-unknown-linux-gnu = None
cargo:rerun-if-env-changed=CFLAGS_x86_64_unknown_linux_gnu
CFLAGS_x86_64_unknown_linux_gnu = None
cargo:rerun-if-env-changed=HOST_CFLAGS
HOST_CFLAGS = None
cargo:rerun-if-env-changed=CFLAGS
CFLAGS = None
--- stderr
thread 'main' panicked at /data_train/search/InternData/jiejuntan/.cargo/registry/src/mirrors.ustc.edu.cn-61ef6e0cd06fb9b8/aws-lc-sys-0.20.1/builder/cc_builder.rs:244:13:
Your compiler (cc) is not supported due to a memcmp related bug reported in https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95189.We strongly recommend against using this compiler.EXECUTED: true ERROR: OUTPUT:
stack backtrace:
0: 0x562caa9ec6bc -
1: 0x562caaa14ff0 -
2: 0x562caa9e968f -
3: 0x562caa9ec4a4 -
4: 0x562caa9ee2d7 -
5: 0x562caa9ee03f -
6: 0x562caa9ee758 -
7: 0x562caa9ee63e -
8: 0x562caa9ecb86 -
9: 0x562caa9ee3a2 -
10: 0x562caa905ea5 -
11: 0x562caa90bba9 -
12: 0x562caa90bcce -
13: 0x562caa90a492 -
14: 0x562caa90c1f5 -
15: 0x562caa9226e1 -
16: 0x562caa910e53 -
17: 0x562caa917c96 -
18: 0x562caa910639 -
19: 0x562caa9e3e47 -
20: 0x562caa910617 -
21: 0x562caa924ef5 -
22: 0x7fe8d680d083 - __libc_start_main
at /build/glibc-SzIz7B/glibc-2.31/csu/../csu/libc-start.c:308:16
23: 0x562caa90660e -
24: 0x0 -
error: failed to compile `text-embeddings-router v1.5.0 (/data_train/search/InternData/jiejuntan/python/text-embeddings-inference/router)`, intermediate artifacts can be found at `/data_train/search/InternData/jiejuntan/python/text-embeddings-inference/target`.
To reuse those artifacts with a future compilation, set the environment variable `CARGO_TARGET_DIR` to that path.
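
Note that the aws-lc-sys failure at the end of the log is unrelated to CUDA: its build script rejects the system cc because of the GCC memcmp bug it links to (https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95189). The log shows CC = None, so the cc crate is falling back to the default compiler. A possible workaround, assuming a newer, unaffected compiler is installed on the machine (the gcc-11/g++-11 names below are only examples), is to point CC/CXX at it and rebuild:

```shell
# Sketch only: use a compiler not affected by GCC bug 95189.
# gcc-11/g++-11 are placeholders; substitute whatever newer GCC or clang exists here.
export CC=gcc-11
export CXX=g++-11
# then re-run the cargo install command from the Reproduction section below
```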
Information
- Docker
- The CLI directly
Tasks
- An officially supported command
- My own modifications
Reproduction
- Using an A100; CUDA version: Build cuda_12.1.r12.1/compiler.32415258_0
- RUST_BACKTRACE=full cargo install --path router -F candle-cuda -F http --no-default-features
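
The candle-kernels, candle-flash-attn, candle-rotary, and candle-layer-norm panics in the log all come from the same place: the build scripts query nvidia-smi for compute_cap, and this nvidia-smi reports that compute_cap "is not a valid field to query" (typically because the installed driver's nvidia-smi is too old to know that field). The same build scripts also read CUDA_COMPUTE_CAP (see the cargo:rerun-if-env-changed=CUDA_COMPUTE_CAP lines above), so a possible workaround, untested here, is to declare the compute capability explicitly; 80 corresponds to an A100 (sm_80):

```shell
# Sketch: bypass nvidia-smi detection by setting the compute capability directly (A100 = sm_80).
export CUDA_COMPUTE_CAP=80
RUST_BACKTRACE=full cargo install --path router -F candle-cuda -F http --no-default-features
```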
Expected behavior
The build succeeds.
I have the same problem too. Did you manage to work it out?
No, I use Docker instead.
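
For reference, the Docker route mentioned above would look roughly like this (a sketch based on the text-embeddings-inference README; the 1.5 image tag targets Ampere GPUs such as the A100, and the model id is just an example):

```shell
model=BAAI/bge-large-en-v1.5   # example model id
volume=$PWD/data               # cache directory shared with the container
docker run --gpus all -p 8080:80 -v $volume:/data --pull always \
  ghcr.io/huggingface/text-embeddings-inference:1.5 --model-id $model
```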