iaalm/llama-api-server

instantiation failed: unknown import, Code: 0x62

njalan opened this issue · 3 comments

Here is my command:
wasmedge --dir .:. --nn-preload default:GGML:AUTO:qwen1_5-14b-chat-q5_k_m.gguf llama-api-server.wasm -p chatml

[2024-03-21 22:59:10.566] [error] instantiation failed: unknown import, Code: 0x62
[2024-03-21 22:59:10.566] [error] When linking module: "rustls_client" , function name: "new_codec"
[2024-03-21 22:59:10.566] [error] At AST node: import description
[2024-03-21 22:59:10.566] [error] At AST node: import section
[2024-03-21 22:59:10.566] [error] At AST node: module
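For context (my assumption, not confirmed in this thread): a WasmEdge "unknown import" error at instantiation time usually means the .wasm binary imports a host module (here `rustls_client`) that the installed runtime does not provide, typically because the matching plugin is missing. A minimal diagnostic sketch, assuming the official WasmEdge install script and the plugin names `wasi_nn-ggml` and `wasmedge_rustls` (both from memory, not from this thread):

```shell
# Sketch: confirm which host modules the .wasm expects, then reinstall
# WasmEdge with the needed plugins. Plugin names below are assumptions.

# 1. List the imports the binary declares (requires the wabt toolkit);
#    look for "rustls_client" among them.
wasm2wat --enable-all llama-api-server.wasm | grep '(import'

# 2. Reinstall WasmEdge with the GGML and TLS plugins enabled.
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh \
  | bash -s -- --plugins wasi_nn-ggml wasmedge_rustls

# 3. Reload the shell profile so the updated PATH takes effect.
source ~/.bashrc   # or ~/.zshrc
```

This is an environment-setup fragment, so treat it as a starting point; the authoritative install instructions live in the WasmEdge and LlamaEdge docs.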

Thanks for the feedback @njalan! I haven't tried this module with WasmEdge. Could you provide detailed reproduction steps so that I can look into it? How did you generate llama-api-server.wasm?

It seems you opened this issue in the wrong repo. It should go to https://github.com/LlamaEdge/LlamaEdge