Pinned Repositories
ipex-llm
Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPUs such as Arc, Flex, and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, vLLM, GraphRAG, DeepSpeed, Axolotl, etc.
nncf
Neural Network Compression Framework for enhanced OpenVINO™ inference (see the quantization sketch after this list)
openvino_handbook
openvino_notebooks
📚 Jupyter notebook tutorials for OpenVINO™
PaddleGame
PaddleGame with OpenVINO support added
yolov8_cls_ov
yolov8-cls OpenVINO inference code example (see the inference sketch after this list)
yolov8_openvino
YOLOv8 classification/object detection/instance segmentation/pose model OpenVINO inference sample code
yolov8_openvino_cpp
YOLOv8 inference C++ sample code based on the OpenVINO C++ API
openvino
OpenVINO™ is an open source toolkit for optimizing and deploying AI inference
Paddle
PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (PaddlePaddle core framework: high-performance single-machine and distributed training and cross-platform deployment for deep learning and machine learning)
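The yolov8_cls_ov and yolov8_openvino repositories above provide OpenVINO inference samples for YOLOv8. Below is a minimal, illustrative Python sketch of that idea: running a YOLOv8 classification model through OpenVINO Runtime. The model path (yolov8n-cls.xml), image path, and input size are placeholder assumptions, not taken from those repositories.

```python
# Minimal sketch of YOLOv8 classification inference with the OpenVINO Python API.
# Assumptions: the YOLOv8-cls model has already been exported to OpenVINO IR
# ("yolov8n-cls.xml" is a placeholder path) and expects a 1x3x224x224 float
# input in NCHW layout, as YOLOv8-cls exports typically do.
import cv2
import numpy as np
import openvino as ov

core = ov.Core()
compiled = core.compile_model("yolov8n-cls.xml", "CPU")   # placeholder IR path

# Preprocess: BGR -> RGB, resize to the network input size, scale to [0, 1], HWC -> NCHW.
image = cv2.cvtColor(cv2.imread("test.jpg"), cv2.COLOR_BGR2RGB)  # placeholder image path
blob = cv2.resize(image, (224, 224)).astype(np.float32) / 255.0
blob = blob.transpose(2, 0, 1)[np.newaxis, ...]           # 1x3x224x224

# Run inference and take the top-1 class from the score vector.
scores = compiled(blob)[compiled.output(0)].squeeze()
top1 = int(np.argmax(scores))
print(f"top-1 class id: {top1}, score: {scores[top1]:.3f}")
```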
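The nncf entry above covers model compression for OpenVINO. The following is a minimal sketch of NNCF post-training quantization under assumed inputs: the FP32 IR path and the random calibration data are placeholders; a real workflow would feed a few hundred preprocessed samples from the target dataset.

```python
# Minimal sketch of NNCF 8-bit post-training quantization of an OpenVINO model.
# Assumptions: "model.xml" is a placeholder FP32 IR, and random arrays stand in
# for real calibration data only to keep the sketch self-contained.
import numpy as np
import nncf
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")                      # placeholder FP32 IR

# Calibration dataset: items are passed through unchanged by the transform function.
calibration_data = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(100)]
calibration_dataset = nncf.Dataset(calibration_data, lambda x: x)

# 8-bit post-training quantization; the result is a regular OpenVINO model.
quantized_model = nncf.quantize(model, calibration_dataset)
ov.save_model(quantized_model, "model_int8.xml")
```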