abetlen/llama-cpp-python

Update vendored llama.cpp to support Intel AMX instructions

Closed this issue · 1 comments

Thank you for this great project!
I see that the base llama.cpp is vendored at vendor/llama.cpp. It seems the latest llama-cpp-python release, v0.3.1, is pinned to a llama.cpp revision that is about two months old.

Last month, llama.cpp added support for Intel AMX instructions, which is expected to provide significant performance improvements on some Intel CPUs:
ggerganov/llama.cpp#8998

Therefore, could you release a new llama-cpp-python version pinned to the latest llama.cpp?

It seems v0.3.2, with an updated llama.cpp, has been released.
7ecdd94
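To check whether a local install already includes the updated bindings, one can compare the installed package version against the first release that ships the newer llama.cpp. A minimal sketch, assuming (per this thread) that v0.3.2 is the first such release; the hard-coded strings below are illustrative stand-ins for the value returned by `importlib.metadata.version("llama_cpp_python")`:

```python
from importlib import metadata  # used in practice to read the installed version


def version_tuple(v: str) -> tuple:
    # "0.3.2" -> (0, 3, 2); ignores pre-release suffixes for simplicity
    return tuple(int(p) for p in v.split(".") if p.isdigit())


def has_updated_release(installed: str, minimum: str = "0.3.2") -> bool:
    # Assumption from the thread: v0.3.2 is the first release pinned to a
    # llama.cpp revision that includes the Intel AMX backend
    return version_tuple(installed) >= version_tuple(minimum)


# Illustrative checks; in a real environment, pass
# metadata.version("llama_cpp_python") instead of a literal string
print(has_updated_release("0.3.1"))  # False: the older pin discussed above
print(has_updated_release("0.3.2"))  # True
```

Note that AMX kernels are only used at runtime on CPUs that actually expose the AMX feature, so a new enough package version is necessary but not sufficient for the speedup.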

Closing this issue.