Pinned Repositories
openfold
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
gahdritz.github.io
My personal webpage
lit-llama
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
miscellany
Assorted Python projects.
OLMo
Modeling, training, eval, and inference code for OLMo
open-data-registry
A registry of publicly available datasets on AWS
prompt-tuning
Original implementation of Prompt Tuning from Lester et al., 2021
pytorch-lightning
The lightweight PyTorch wrapper for high-performance AI research. Scale your models, not the boilerplate.
rtic
Official code for the paper "Modeling Real-Time Interactive Conversations as Timed Diarized Transcripts"
stablediffusion
High-Resolution Image Synthesis with Latent Diffusion Models
gahdritz's Repositories
gahdritz/gahdritz.github.io
My personal webpage
gahdritz/lit-llama
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
gahdritz/miscellany
Assorted Python projects.
gahdritz/OLMo
Modeling, training, eval, and inference code for OLMo
gahdritz/open-data-registry
A registry of publicly available datasets on AWS
gahdritz/prompt-tuning
Original implementation of Prompt Tuning from Lester et al., 2021
gahdritz/pytorch-lightning
The lightweight PyTorch wrapper for high-performance AI research. Scale your models, not the boilerplate.
gahdritz/rtic
Official code for the paper "Modeling Real-Time Interactive Conversations as Timed Diarized Transcripts"
gahdritz/stablediffusion
High-Resolution Image Synthesis with Latent Diffusion Models
gahdritz/t5x
gahdritz/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.