intel/intel-extension-for-transformers
⚡ Build your chatbot within minutes on your favorite device; offer SOTA compression techniques for LLMs; run LLMs efficiently on Intel Platforms ⚡
Python | Apache-2.0
Issues
- Cannot finish FP4 quantization: `RuntimeError: Qbits: only support Integer WOQ in PACKQ` (#1577, opened by PhzCode)
- Whether FP4 inference is supported (#1582, opened by PhzCode)
- plugin init failed (#1377, opened by anayjain)
- Fails to load saved model: `Trying to set a tensor of shape torch.Size([1376, 4096]) in "qweight" (which has shape torch.Size([4096, 1376])), this look incorrect.` (#1407, opened by kranipa)
- Cannot run llama3 8b instruct: `AssertionError: Fail to convert pytorch model` (#1522, opened by N3RDIUM)
- qloracpu fails, need a conda env list (#1561, opened by Lix1993)
- (detailed) conda install instructions? (#1550, opened by hpcpony)
- unable to start talkingbot frontend (#1517, opened by raj-ritu17)
- AssertionError: Fail to convert pytorch model with 'Intel/neural-chat-7b-v3-3' WOQ (#1357, opened by eduand-alvarez)
- rag plugin initialize failed (#1538, opened by redhairerINTEL)
- neuralchat `/v1/askdoc/create` returns 404 Not Found; failed to call this API on an Ubuntu system (#1533, opened by RongLei-intel)
- pip install failure on python3.10-alpine image (#1379, opened by lrrountr)
- talking bot backend for Windows PC is not working; notebook needs to be updated (#1518, opened by raj-ritu17)
- ModuleNotFoundError: No module named 'datasets' (#1461, opened by Aisuko)
- NeuralChat TTS plugin unable to initialize due to missing dependency: librosa (#1490, opened by alexsin368)
- RAG example not working (#1464, opened by guytamir)
- Python3.11: Could not build wheels for cchardet, which is required to install pyproject.toml-based projects (#1469, opened by bbelky)
- Support inference with WOQ and LoRA adapter (#1434, opened by Yuan0320)
- Requirements.txt underscores instead of dashes (#1421, opened by anthony-intel)
- pip install missing dependencies (#1365, opened by jeremyfowers)
- failed to create the serving (#1392, opened by RongLei-intel)
- SageMaker does not support Transformers 4.34.1 which is required by ITREX (#1381, opened by eduand-alvarez)
- i7-12700H CPU Tests (#1220, opened by ailxcds)
- Running Stable Diffusion on IPEX CPU has error (#1345, opened by leopck)
- [NeuralChat] Retrieval example failure (#1252, opened by Spycsh)
- neuralchat int4 quantization failing during inference (#1267, opened by kta-intel)
- QLoRA on CPU - Example ERROR: "undefined symbol" (#1287, opened by 42elenz)
- Unexpected msg "ERROR - The chosen retrieval type remains outside the supported scope." if retrieval_type = 'default' (#1309, opened by redhairerINTEL)
- rag plugin init failed if retrieval_type is bm25 (#1315, opened by redhairerINTEL)
- Conflict between ipex and pytorch (#1311, opened by redhairerINTEL)
- RuntimeError: Chatbot instance has not been set. (#1308, opened by regmibijay)
- [NeuralChat] Generate fails for LLaVA models (#1244, opened by dillonalaird)
- 422 Unprocessable Entity using Neural Chat via OpenAI interface with meta-llama/llama-2-7b-chat-hf (#1288, opened by brent-elliott)
- Device does not exist / is not supported error with neuralchat deploy_chatbot_on_xpu notebook (#1276, opened by brent-elliott)
- Cannot import name 'WeightOnlyLinear' when running `from intel_extension_for_transformers.transformers import AutoModelForCausalLM` (#1239, opened by Ankur-singh)
- Installed intel-extension-for-transformers and I get an error: No module named 'intel_extension_for_pytorch' (#1230, opened by sungkim11)
- Neural Chat Finetune Mistral Fails (#1181, opened by dillonalaird)