Riccorl's Stars
upscayl/upscayl
🆙 Upscayl - #1 Free and Open Source AI Image Upscaler for Linux, macOS and Windows.
myshell-ai/OpenVoice
Instant voice cloning by MyShell.
karpathy/llm.c
LLM training in simple, raw C/CUDA
qdrant/qdrant
Qdrant - High-performance, massive-scale Vector Database for the next generation of AI. Also available in the cloud https://cloud.qdrant.io/
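Qdrant ships an official Python client (`qdrant-client`); a minimal sketch, assuming the package is installed and using the in-memory mode so no server is required (collection name, vector size, and payload are illustrative):

```python
# Minimal qdrant-client sketch: ":memory:" runs an in-process instance,
# so no Qdrant server is needed for quick experiments.
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams

client = QdrantClient(":memory:")

client.create_collection(
    collection_name="docs",
    vectors_config=VectorParams(size=4, distance=Distance.COSINE),
)

client.upsert(
    collection_name="docs",
    points=[PointStruct(id=1, vector=[0.1, 0.2, 0.3, 0.4], payload={"text": "hello"})],
)

hits = client.search(
    collection_name="docs",
    query_vector=[0.1, 0.2, 0.3, 0.4],
    limit=1,
)
print(hits)
```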
lit/lit
Lit is a simple library for building fast, lightweight web components.
stanfordnlp/dspy
DSPy: The framework for programming—not prompting—foundation models
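A minimal DSPy sketch, assuming a recent release and an OpenAI API key in the environment; the model name and signature string are illustrative:

```python
# Hedged DSPy sketch: configure a language model once, then call a
# declarative Predict module instead of hand-writing a prompt.
import dspy

lm = dspy.LM("openai/gpt-4o-mini")  # assumed model; needs OPENAI_API_KEY set
dspy.configure(lm=lm)

qa = dspy.Predict("question -> answer")  # signature: input -> output fields
result = qa(question="What is the capital of France?")
print(result.answer)
```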
astral-sh/uv
An extremely fast Python package installer and resolver, written in Rust.
QwenLM/Qwen
The official repo of Qwen (通义千问), the chat and pretrained large language models proposed by Alibaba Cloud.
fastfetch-cli/fastfetch
Like neofetch, but much faster because it is written mostly in C.
sparkle-project/Sparkle
A software update framework for macOS
apple/corenet
CoreNet: A library for training deep neural networks
EleutherAI/lm-evaluation-harness
A framework for few-shot evaluation of language models.
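Besides the `lm_eval` CLI, the harness exposes a Python entry point; a hedged sketch, assuming lm-eval v0.4+ is installed, with an illustrative HF checkpoint and task:

```python
# Hedged sketch of the lm-eval Python API. The checkpoint and task names
# are illustrative; running this downloads the model from the Hugging Face Hub.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=EleutherAI/pythia-160m",
    tasks=["lambada_openai"],
    num_fewshot=0,
)
print(results["results"])
```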
krishnadey30/LeetCode-Questions-CompanyWise
Contains company-wise LeetCode questions, sorted by frequency and by all-time occurrence.
pytorch/torchtune
A native PyTorch library for LLM fine-tuning.
ollama/ollama-python
Ollama Python library
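A minimal sketch of the client, assuming a local Ollama server is running and the llama3 model has been pulled:

```python
# Minimal ollama-python sketch; requires a running Ollama server
# (`ollama serve`) and a pulled model, e.g. `ollama pull llama3`.
import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```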
NVIDIA/ChatRTX
A developer reference project for creating Retrieval Augmented Generation (RAG) chatbots on Windows using TensorRT-LLM
cohere-ai/cohere-toolkit
Cohere Toolkit is a collection of prebuilt components enabling users to quickly build and deploy RAG applications.
AnswerDotAI/fsdp_qlora
Training LLMs with QLoRA + FSDP
pytorch/torchtitan
A native PyTorch library for large model training.
databricks/megablocks
MegaBlocks is a light-weight library for mixture-of-experts (MoE) training.
Lightning-AI/lightning-thunder
Make PyTorch models up to 40% faster! Thunder is a source-to-source compiler for PyTorch. It enables using different hardware executors at once, across one or thousands of GPUs.
mosaicml/streaming
A Data Streaming Library for Efficient Neural Network Training
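A minimal sketch of the reading side, assuming shards were previously written with streaming's MDSWriter; the remote bucket and local cache paths are placeholders:

```python
# Hedged sketch of reading a pre-sharded dataset with mosaicml/streaming.
# The s3:// path is a placeholder; shards must already exist in MDS format.
from torch.utils.data import DataLoader
from streaming import StreamingDataset

dataset = StreamingDataset(
    remote="s3://my-bucket/my-dataset",  # assumed shard location
    local="/tmp/streaming-cache",        # local cache directory
    shuffle=True,
)
loader = DataLoader(dataset, batch_size=32)
```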
ibm-granite/granite-code-models
Granite Code Models: A Family of Open Foundation Models for Code Intelligence
earwig/mwparserfromhell
A Python parser for MediaWiki wikicode
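A minimal usage sketch; the wikitext snippet is illustrative:

```python
# Parse a wikitext snippet, then pull out templates and plain text.
import mwparserfromhell

text = "{{Infobox person|name=Ada Lovelace}} '''Ada''' was a [[mathematician]]."
wikicode = mwparserfromhell.parse(text)

for template in wikicode.filter_templates():
    print(template.name, [str(p) for p in template.params])

print(wikicode.strip_code())  # markup stripped down to plain text
```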
rapidsai/raft
RAFT contains fundamental, widely used algorithms and primitives for machine learning and information retrieval. The algorithms are CUDA-accelerated and form building blocks for writing high-performance applications more easily.
mosaicml/examples
Fast and flexible reference benchmarks
huggingface/llm_training_handbook
An open collection of methodologies to help with successful training of large language models.
Lightning-AI/litdata
Streamline data pipelines for AI. Process datasets across 1000s of machines and optimize data for blazing-fast model training.
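A hedged sketch of litdata's two-step workflow: optimize() pre-processes samples into chunked shards, and StreamingDataset streams them back during training; the processing function and paths are illustrative:

```python
# Hedged litdata sketch: optimize() writes chunked shards, StreamingDataset
# reads them back. The tokenize function and output_dir are assumptions.
from litdata import optimize, StreamingDataset

def tokenize(index):
    # hypothetical per-sample processing
    return {"id": index, "tokens": list(range(index, index + 4))}

if __name__ == "__main__":  # optimize() uses worker processes
    optimize(
        fn=tokenize,
        inputs=list(range(100)),
        output_dir="fast_data",
        chunk_bytes="64MB",
    )
    dataset = StreamingDataset(input_dir="fast_data")
    print(dataset[0])
```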
neubig/research-career-tools
foundation-model-stack/fms-fsdp
🚀 Efficiently (pre)training foundation models with native PyTorch features, including FSDP for training and the SDPA implementation of Flash Attention v2.
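Those two building blocks are plain PyTorch; a hedged toy sketch (not the repo's actual training loop) showing FSDP sharding plus SDPA attention, meant to be launched with `torchrun --nproc_per_node=<gpus> script.py`:

```python
# Hedged sketch of the native PyTorch features fms-fsdp builds on.
# Toy model and shapes; assumes CUDA GPUs and a torchrun launch.
import torch
import torch.distributed as dist
import torch.nn.functional as F
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

dist.init_process_group(backend="nccl")
torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

# SDPA dispatches to Flash Attention v2 kernels when shapes/dtypes allow.
q = k = v = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.bfloat16)
_ = F.scaled_dot_product_attention(q, k, v, is_causal=True)

# FSDP shards parameters, gradients, and optimizer state across ranks.
model = FSDP(torch.nn.Linear(1024, 1024).cuda())
loss = model(torch.randn(8, 1024, device="cuda")).sum()
loss.backward()

dist.destroy_process_group()
```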