Pinned Repositories
all-normalization-transformer
A simple Transformer where the softmax has been replaced with normalization
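For orientation, a minimal sketch of what "softmax replaced with normalization" could look like inside attention: the score matrix is standardized along the key dimension instead of being passed through a softmax. This illustrates the general idea only; the shapes and the particular normalization chosen here are assumptions, not the repo's exact formulation.

```python
# Hypothetical sketch: scaled dot-product attention where the usual softmax
# over scores is swapped for a simple normalization step. Illustrative only.
import torch

def norm_attention(q, k, v, eps=1e-5):
    # q, k, v: (batch, heads, seq, dim)
    scale = q.shape[-1] ** -0.5
    scores = torch.einsum('bhid,bhjd->bhij', q, k) * scale
    # Normalize scores across the key dimension instead of applying softmax:
    # each row gets zero mean and unit variance rather than summing to one.
    mean = scores.mean(dim=-1, keepdim=True)
    std = scores.std(dim=-1, keepdim=True)
    attn = (scores - mean) / (std + eps)
    return torch.einsum('bhij,bhjd->bhid', attn, v)
```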
attalos
CLAP
Contrastive Language-Audio Pretraining
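A minimal sketch of the CLIP-style symmetric contrastive objective that this entry (and the CLIP-S and CLIP_JAX entries below) refers to, assuming precomputed, paired audio and text embeddings and a hypothetical temperature value; the encoders and data pipeline are omitted and nothing here is taken from the repo itself.

```python
# Hypothetical sketch of a symmetric contrastive (InfoNCE) loss over paired
# audio/text embeddings: matched pairs are pulled together, mismatched pairs
# pushed apart. Embedding tensors and temperature are stand-ins.
import torch
import torch.nn.functional as F

def contrastive_loss(audio_emb, text_emb, temperature=0.07):
    # audio_emb, text_emb: (batch, dim), one matched pair per row
    audio_emb = F.normalize(audio_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = audio_emb @ text_emb.t() / temperature  # (batch, batch) similarities
    targets = torch.arange(logits.shape[0], device=logits.device)
    # Cross-entropy in both directions: audio -> text and text -> audio
    loss_a = F.cross_entropy(logits, targets)
    loss_t = F.cross_entropy(logits.t(), targets)
    return (loss_a + loss_t) / 2
```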
CLIP-S
Contrastive pre-training of speech and transcripts, in the style of CLIP
CLIP_JAX
Contrastive Language-Image Pretraining
humongous-rs
A Rust pipeline for building HUMONGOUS, a dataset of web text extracted from Common Crawl and prepared for multilingual language modeling.
simple-diffusion-model
Pedagogical codebase for a simplified score-based generative model design, with training loop
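A minimal sketch of the kind of training step such a codebase revolves around: corrupt clean data with noise at a random timestep, then regress the model's prediction against that noise. The crude linear noise schedule and the `model(x_t, t)` call signature are placeholder assumptions, not the repo's actual design.

```python
# Hypothetical denoising training step for a simplified score-based model.
import torch
import torch.nn.functional as F

def diffusion_training_step(model, x0, num_timesteps=1000):
    # x0: clean data batch, shape (batch, ...)
    batch = x0.shape[0]
    t = torch.randint(0, num_timesteps, (batch,), device=x0.device)
    # Crude linear schedule for the cumulative signal level alpha_bar (assumption)
    alpha_bar = 1.0 - (t.float() + 1) / num_timesteps
    alpha_bar = alpha_bar.view(batch, *([1] * (x0.dim() - 1)))
    noise = torch.randn_like(x0)
    x_t = alpha_bar.sqrt() * x0 + (1 - alpha_bar).sqrt() * noise
    # The network is trained to predict the injected noise (equivalently, the score)
    pred = model(x_t, t)
    return F.mse_loss(pred, noise)
```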
simple-parallel-transformer
As it says on the tin: a simple implementation of a transformer model with some borrowed efficiency improvements. The purpose is mainly pedagogical.
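For reference, a minimal sketch of a "parallel" transformer block, in which the attention and feedforward branches both read the same pre-normalized input and their outputs are summed with the residual rather than applied in sequence. The module sizes and the use of `nn.MultiheadAttention` are illustrative assumptions, not the repo's implementation.

```python
# Hypothetical parallel transformer block: one shared pre-norm feeds both
# branches, whose outputs are added to the residual in parallel.
import torch
import torch.nn as nn

class ParallelBlock(nn.Module):
    def __init__(self, dim, heads=8, ff_mult=4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(dim, dim * ff_mult),
            nn.GELU(),
            nn.Linear(dim * ff_mult, dim),
        )

    def forward(self, x):
        h = self.norm(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        return x + attn_out + self.ff(h)
```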
simple-vq-vae
Pedagogical codebase for a simplified VQ-VAE based on attention and linear interpolation
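The repo describes a variant based on attention and linear interpolation, which is not reproduced here; for orientation only, this is a sketch of the standard nearest-neighbour vector-quantization bottleneck with a straight-through estimator that sits at the heart of any VQ-VAE.

```python
# Hypothetical standard VQ bottleneck: encoder outputs are snapped to their
# nearest codebook vectors, with straight-through gradients for the encoder.
import torch
import torch.nn as nn

class VectorQuantizer(nn.Module):
    def __init__(self, num_codes, dim):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)

    def forward(self, z):
        # z: (batch, seq, dim) continuous encoder outputs
        flat = z.reshape(-1, z.shape[-1])                 # (batch*seq, dim)
        dists = torch.cdist(flat, self.codebook.weight)   # distances to all codes
        indices = dists.argmin(dim=-1)                    # nearest code per vector
        quantized = self.codebook(indices).view_as(z)
        # Straight-through: forward pass uses quantized values, backward pass
        # copies gradients to the encoder output.
        quantized = z + (quantized - z).detach()
        return quantized, indices.view(z.shape[:-1])
```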
uspto_patent_data_parser
A Python tool for reading, parsing, and searching for patents using the United States Patent and Trademark Office (USPTO) Bulk Data Storage System.
cfoster0's Repositories
cfoster0/CLAP
Contrastive Language-Audio Pretraining
cfoster0/simple-diffusion-model
Pedagogical codebase for a simplified score-based generative model design, with training loop
cfoster0/simple-parallel-transformer
As it says on the tin: a simple implementation of a transformer model with some borrowed efficiency improvements. The purpose is mainly pedagogical.
cfoster0/humongous-rs
A Rust pipeline for building HUMONGOUS, a dataset of web text extracted from Common Crawl and prepared for multilingual language modeling.
cfoster0/CLIP-S
Contrastive pre-training of speech and transcripts, in the style of CLIP
cfoster0/CLIP_JAX
Contrastive Language-Image Pretraining
cfoster0/simple-vq-vae
Pedagogical codebase for a simplified VQ-VAE based on attention and linear interpolation
cfoster0/uspto_patent_data_parser
A Python tool for reading, parsing, and searching for patents using the United States Patent and Trademark Office (USPTO) Bulk Data Storage System.
cfoster0/all-normalization-transformer
A simple Transformer where the softmax has been replaced with normalization
cfoster0/awesome-NeRF
A curated list of awesome neural radiance fields papers
cfoster0/cfoster0-github.io
cfoster0/chuck
ChucK Music Programming Language
cfoster0/DynamicGrids.jl
A framework for gridded simulations in Julia
cfoster0/gate-all-around-network
cfoster0/GPT-Neo-visual-grounding
Visually grounding GPT-Neo 1.3B and 2.7B
cfoster0/gym
A toolkit for developing and comparing reinforcement learning algorithms.
cfoster0/info
A hub for onboarding & other information.
cfoster0/lm_evaluation_harness
cfoster0/longform
cfoster0/new-website
New website for EleutherAI, based on the Hugo static site generator
cfoster0/pile-explorer
For exploring the data and documenting its limitations
cfoster0/pile_united_nations
A script for collecting the United Nations Digital Library dataset in a language-modelling-friendly format.
cfoster0/pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
cfoster0/self-attention-experiments-vision
A project on replicating, evaluating, and scaling up self-attention-based models in vision.
cfoster0/simple-parallel-transformer-jax
cfoster0/tolman-eichenbaum-formers
Work in progress: building out code inspired by https://openreview.net/forum?id=B8DVo9B1YE0
cfoster0/transformer-memorization
Experiments quantifying the memorization properties of transformers
cfoster0/transformer-utils
cfoster0/vector-quantize-pytorch
Vector quantization, in PyTorch
cfoster0/x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers