jfrankle
Chief Scientist (Neural Networks) at Databricks. Making deep learning efficient and accessible for everyone.
Databricks · New York, NY
Pinned Repositories
lottery-ticket-hypothesis
A reimplementation of "The Lottery Ticket Hypothesis" (Frankle and Carbin) on MNIST.
composer
Train neural networks up to 7x faster
llm-foundry
LLM training code for MosaicML foundation models
mosaicml-examples
Fast and flexible reference benchmarks
open_lth
A repository in preparation for open-sourcing lottery ticket hypothesis code.
refinements-popl-16
Artifact of "Example-Directed Synthesis: A Type-Theoretic Implementation" by Frankle, Osera, Walker, and Zdancewic.
scratch
scratch work
streaming
A Data Streaming Library for Efficient Neural Network Training