Pinned Repositories
accelerate
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
candle
Minimalist ML framework for Rust
datasets
🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools
diffusers
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and Flax.
optimum
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
pytorch-image-models
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNet-V3/V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more
tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
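A minimal sketch of the library's train-and-encode flow, using a throwaway two-sentence corpus instead of real training data so it runs offline:

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import Whitespace
from tokenizers.trainers import BpeTrainer

# Train a small BPE tokenizer from scratch on an in-memory corpus.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

trainer = BpeTrainer(special_tokens=["[UNK]"], vocab_size=100)
tokenizer.train_from_iterator(["hello world", "hello tokenizers"], trainer)

enc = tokenizer.encode("hello world")  # enc.tokens / enc.ids hold the result
```

The Rust core is what makes training and encoding fast; the Python API shown here is a thin binding over it.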
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
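As a self-contained sketch of the library's config-driven model API (the tiny sizes below are arbitrary, chosen only so the example runs offline with no checkpoint download):

```python
import torch
from transformers import BertConfig, BertModel

# A deliberately tiny, randomly initialized BERT -- not a pretrained checkpoint.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
)
model = BertModel(config)
model.eval()

input_ids = torch.tensor([[1, 2, 3]])  # a fake 3-token input sequence
with torch.no_grad():
    out = model(input_ids=input_ids)
# out.last_hidden_state holds one hidden_size vector per input token
```

In typical use one would instead call `AutoModel.from_pretrained(...)` to load real weights from the Hub; the forward-pass interface is the same.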
trl
Train transformer language models with reinforcement learning.
Hugging Face's Repositories
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
huggingface/diffusers
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and Flax.
huggingface/datasets
🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools
huggingface/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
huggingface/candle
Minimalist ML framework for Rust
huggingface/tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
huggingface/trl
Train transformer language models with reinforcement learning.
huggingface/text-generation-inference
Large Language Model Text Generation Inference
huggingface/accelerate
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
huggingface/chat-ui
Open source codebase powering the HuggingChat app
huggingface/deep-rl-class
This repo contains the syllabus of the Hugging Face Deep Reinforcement Learning Course.
huggingface/autotrain-advanced
🤗 AutoTrain Advanced
huggingface/blog
Public repo for HF blog posts
huggingface/evaluate
🤗 Evaluate: A library for easily evaluating machine learning models and datasets.
huggingface/huggingface_hub
The official Python client for the Hugging Face Hub.
huggingface/huggingface.js
Utilities to use the Hugging Face Hub API
huggingface/nanotron
Minimalistic large language model 3D-parallelism training
huggingface/optimum-nvidia
huggingface/dataset-viewer
Lightweight web API for visualizing and exploring any dataset - computer vision, speech, text, and tabular - stored on the Hugging Face Hub
huggingface/quanto
A PyTorch quantization toolkit
huggingface/optimum-intel
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
huggingface/lighteval
LightEval is a lightweight LLM evaluation suite that Hugging Face uses internally, alongside the recently released LLM data processing library datatrove and the LLM training library nanotron.
huggingface/ratchet
A cross-platform browser ML framework.
huggingface/optimum-benchmark
A unified multi-backend utility for benchmarking Transformers, Timm, Diffusers and Sentence-Transformers with full support of Optimum's hardware optimizations & quantization schemes.
huggingface/optimum-neuron
Easy, fast, and very cheap training and inference on AWS Trainium and Inferentia chips.
huggingface/optimum-habana
Easy and lightning-fast training of 🤗 Transformers on Habana Gaudi processors (HPU)
huggingface/data-is-better-together
Let's build better datasets, together!
huggingface/optimum-tpu
Google TPU optimizations for 🤗 Transformers models
huggingface/lm-evaluation-harness
A framework for few-shot evaluation of language models.
huggingface/tei-gaudi
A blazing-fast inference solution for text embedding models