elttaes's Stars
google-deepmind/alphafold3
AlphaFold 3 inference pipeline.
PixArt-alpha/PixArt-alpha
PixArt-α: Fast Training of Diffusion Transformer for Photorealistic Text-to-Image Synthesis
MAGICS-LAB/DNABERT_2
[ICLR 2024] DNABERT-2: Efficient Foundation Model and Benchmark for Multi-Species Genome
sihyun-yu/REPA
Official PyTorch Implementation of Representation Alignment for Generation: Training Diffusion Transformers Is Easier Than You Think
chuanyangjin/fast-DiT
Fast Diffusion Models with Transformers
facebookresearch/DiT
Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"
RosettaCommons/RFdiffusion
Code for running RFdiffusion
RosettaCommons/protein_generator
Joint sequence and structure generation with RoseTTAFold sequence space diffusion
openai/glide-text2im
GLIDE: a diffusion-based text-conditional image synthesis model
bytedance/dplm
Official Implementation of DPLM (ICML'24) - Diffusion Language Models Are Versatile Protein Learners
DefaultRui/BEV-Scene-Graph
[ICCV23] Bird’s-Eye-View Scene Graph for Vision-Language Navigation
lllyasviel/ControlNet
Let us control diffusion models!
kyegomez/zeta
Build high-performance AI models with modular building blocks
DreamFold/FoldFlow
FoldFlow: SE(3)-Stochastic Flow Matching for Protein Backbone Generation
generatebio/chroma
A generative model for programmable protein design
evolutionaryscale/esm
z-x-yang/DoraemonGPT
Official repository of DoraemonGPT: Toward Understanding Dynamic Scenes with Large Language Models
Lightning-AI/lit-llama
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
Dao-AILab/flash-attention
Fast and memory-efficient exact attention
ConnorJL/GPT2
An implementation of training for GPT-2 that supports TPUs
google-deepmind/alphafold
Open source code for AlphaFold 2.
a16z-infra/ai-town
An MIT-licensed, deployable starter kit for building and customizing your own version of AI Town - a virtual town where AI characters live, chat, and socialize.
eric-mitchell/direct-preference-optimization
Reference implementation for DPO (Direct Preference Optimization)
liguodongiot/llm-action
This project aims to share the technical principles behind large language models along with hands-on experience (LLM engineering and practical LLM application deployment).
mlabonne/llm-course
Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
karpathy/makemore
An autoregressive character-level language model for making more things
karpathy/nanoGPT
The simplest, fastest repository for training/finetuning medium-sized GPTs.
NVIDIA/Megatron-LM
Ongoing research training transformer models at scale
google-research/vision_transformer
Mythologyli/ZJU-Connect-for-Windows
A ZJU network client written with Qt