Pinned Repositories
caffe
Caffe: a fast open framework for deep learning.
cnn-quantization
diffusers
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch
GenAIComps
GenAI components at the microservice level; a GenAI service composer to create mega-services
GenAIExamples
Generative AI Examples is a collection of GenAI examples, such as ChatQnA and Copilot, that illustrate the pipeline capabilities of the Open Platform for Enterprise AI (OPEA) project.
intel-extension-for-transformers
⚡ Build your chatbot within minutes on your favorite device; offer SOTA compression techniques for LLMs; run LLMs efficiently on Intel platforms ⚡
neural-compressor
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime (see the quantization sketch after this list)
opea_docs
This repo contains the documentation for the OPEA project
Paddle
PArallel Distributed Deep LEarning
ShuffleNet
A fast Caffe implementation of ShuffleNet.
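
The neural-compressor entry above names several low-bit formats (INT8/FP8/INT4/FP4/NF4). As a rough illustration of what the simplest of these, symmetric per-tensor INT8 quantization, does to a weight tensor, here is a minimal NumPy sketch. It is not Intel Neural Compressor's API; all function names are illustrative.

    # Minimal sketch of symmetric per-tensor INT8 quantization (illustrative only).
    import numpy as np

    def quantize_int8(x: np.ndarray):
        """Map a float32 tensor to int8 codes with a single per-tensor scale."""
        # The largest magnitude maps to +/-127; guard against an all-zero tensor.
        scale = max(np.abs(x).max() / 127.0, 1e-12)
        q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
        """Recover an approximate float32 tensor from the int8 codes."""
        return q.astype(np.float32) * scale

    weights = np.random.randn(4, 8).astype(np.float32)
    q, scale = quantize_int8(weights)
    error = np.abs(weights - dequantize(q, scale)).max()
    print(f"scale={scale:.4f}, max abs error={error:.4f}")

Lower-bit formats such as INT4 or NF4 follow the same quantize/dequantize pattern but with fewer levels and, typically, per-group scales rather than a single per-tensor scale.
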
hshen14's Repositories
hshen14/caffe
Caffe: a fast open framework for deep learning.
hshen14/cnn-quantization
hshen14/diffusers
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch
hshen14/GenAIComps
GenAI components at the microservice level; a GenAI service composer to create mega-services
hshen14/GenAIExamples
Generative AI Examples is a collection of GenAI examples, such as ChatQnA and Copilot, that illustrate the pipeline capabilities of the Open Platform for Enterprise AI (OPEA) project.
hshen14/intel-extension-for-transformers
⚡ Build your chatbot within minutes on your favorite device; offer SOTA compression techniques for LLMs; run LLMs efficiently on Intel platforms ⚡
hshen14/neural-compressor
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
hshen14/opea_docs
This repo contains the documentation for the OPEA project
hshen14/Paddle
PArallel Distributed Deep LEarning
hshen14/ShuffleNet
A fast Caffe implementation of ShuffleNet.