Pinned Repositories
openvino_2023.3
OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
openvino
OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
ipex-llm
Accelerate local LLM inference and fine-tuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, Phi, etc.) on Intel CPUs and GPUs (e.g., a local PC with an iGPU, or a discrete GPU such as Arc, Flex, or Max); integrates seamlessly with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, Axolotl, etc.
eugeooi's Repositories
eugeooi/openvino_2023.3