Pinned Repositories
trax
Trax — Deep Learning with Clear Code and Speed
Awesome-Efficient-LLM
A curated list of resources on efficient Large Language Models
Algo-competitions
Solutions to algorithmic competition problems
dynamic-pooling
Efficient Transformers with Dynamic Token Pooling
nano-sparse-attention
The simplest implementation of recent Sparse Attention patterns for efficient LLM inference (a sketch of one such pattern, sliding-window attention, follows this list).
nanoT5
Fast & Simple repository for pre-training and fine-tuning T5-style models
NLP-OSS
Democratizing NLP!
NoTrainNoGain
Revisiting Efficient Training Algorithms For Transformer-based Language Models
piotrnawrot.github.io
Segmented-Sieve-of-Atkin
C++ implementation of the segmented Sieve of Atkin; the segments are independent, which makes it easy to parallelize (a sketch of the core algorithm appears below)
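
Of the pinned projects above, nano-sparse-attention lends itself best to a quick illustration. Below is a minimal sketch of one common sparse attention pattern, sliding-window (local) causal attention, in plain PyTorch. It illustrates the general technique only, not code or an API from the repository; the function name and the `window` parameter are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def sliding_window_attention(q, k, v, window: int):
    """Causal attention where each query sees only the last `window` keys.

    q, k, v: tensors of shape (..., seq_len, head_dim).
    Hypothetical sketch; not the nano-sparse-attention API.
    """
    seq_len = q.size(-2)
    i = torch.arange(seq_len).unsqueeze(1)   # query positions, as a column
    j = torch.arange(seq_len).unsqueeze(0)   # key positions, as a row
    allowed = (j <= i) & (j > i - window)    # causal mask restricted to a local window
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5
    scores = scores.masked_fill(~allowed, float("-inf"))
    return F.softmax(scores, dim=-1) @ v
```

Dense attention does O(L^2) work per head; limiting each query to a fixed window cuts the useful work to O(L * window), which is the efficiency argument behind these patterns. A production kernel would also avoid materializing the full L x L score matrix, which this didactic version still does.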
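
The Segmented-Sieve-of-Atkin entry is the one classical algorithm in the list. The repository is C++, but the core algorithm reads most easily as a short Python sketch of the plain, unsegmented Sieve of Atkin (this is the textbook algorithm, not the repository's code):

```python
def sieve_of_atkin(limit: int) -> list[int]:
    """Plain (unsegmented) Sieve of Atkin: all primes <= limit."""
    sieve = [False] * (limit + 1)
    root = int(limit ** 0.5) + 1
    for x in range(1, root):
        for y in range(1, root):
            # A number is a candidate prime iff it has an odd number of
            # representations under one of three quadratic forms,
            # filtered by its residue mod 12.
            n = 4 * x * x + y * y
            if n <= limit and n % 12 in (1, 5):
                sieve[n] = not sieve[n]
            n = 3 * x * x + y * y
            if n <= limit and n % 12 == 7:
                sieve[n] = not sieve[n]
            n = 3 * x * x - y * y
            if x > y and n <= limit and n % 12 == 11:
                sieve[n] = not sieve[n]
    # Squarefree filter: strike out multiples of squares of primes.
    for r in range(5, root):
        if sieve[r]:
            for m in range(r * r, limit + 1, r * r):
                sieve[m] = False
    return [2, 3] + [n for n in range(5, limit + 1) if sieve[n]]

print(sieve_of_atkin(50))  # [2, 3, 5, 7, 11, ..., 47]
```

The "Segmented" in the repo name refers to the standard refinement of applying this flipping logic one cache-sized block of the range at a time; because blocks are independent, they can be sieved by separate threads, which is presumably what the "easily parallelizable" note refers to.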