Pinned Repositories
InfiniTransformer
Unofficial PyTorch/🤗 Transformers (Gemma/Llama 3) implementation of "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention"
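The core idea in the paper is a compressive memory read and written with linear-attention updates, gated against ordinary local attention within each segment. Below is a minimal single-head sketch of that recurrence, not this repo's actual code; it uses the paper's simpler "linear" memory update (the delta-rule variant and all batching/multi-head details are omitted), and the function and variable names are my own.

```python
import torch
import torch.nn.functional as F

def elu_plus_one(x):
    # Nonlinearity applied to queries/keys for the linear-attention memory.
    return F.elu(x) + 1.0

def infini_attention_segment(q, k, v, mem, z, beta):
    """One segment of Infini-attention (simplified, single head).

    q, k, v: (seg_len, d) projections for the current segment
    mem:     (d, d) compressive memory carried across segments
    z:       (d,) normalization term carried across segments
    beta:    learned scalar gate mixing memory readout and local attention
    """
    sq, sk = elu_plus_one(q), elu_plus_one(k)

    # Read from the compressive memory (linear-attention retrieval).
    a_mem = (sq @ mem) / (sq @ z).clamp(min=1e-6).unsqueeze(-1)

    # Standard causal dot-product attention within the segment.
    a_local = F.scaled_dot_product_attention(
        q.unsqueeze(0), k.unsqueeze(0), v.unsqueeze(0), is_causal=True
    ).squeeze(0)

    # Write this segment's keys/values into the memory and normalizer.
    mem = mem + sk.transpose(0, 1) @ v
    z = z + sk.sum(dim=0)

    # Gate between the memory readout and local attention.
    g = torch.sigmoid(beta)
    return g * a_mem + (1 - g) * a_local, mem, z
```

Because `mem` and `z` have fixed size regardless of how many segments have been processed, the context the model can condition on grows without growing the attention state.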
EasyContext
Memory optimization and training recipes to extrapolate language models' context length to 1 million tokens, with minimal hardware.
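One common ingredient in such recipes (not necessarily EasyContext's exact configuration, which also combines sequence parallelism and other memory optimizations) is raising the RoPE base frequency before long-context fine-tuning, so that distant positions stay in-distribution. A minimal sketch with 🤗 Transformers, assuming a Llama-style config; the model name and theta value are illustrative only:

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Illustrative settings, not EasyContext's published recipe.
config = AutoConfig.from_pretrained("meta-llama/Meta-Llama-3-8B")
config.rope_theta = 1_000_000.0  # larger RoPE base slows positional rotation

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B", config=config, torch_dtype="auto"
)
```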
LWM
Large World Model -- Modeling Text and Video with Millions of Tokens of Context
ring-attention-pytorch
Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in PyTorch
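The heart of Ring Attention is computing exact attention over a sequence sharded across devices: each device holds one key/value block, the blocks rotate around a ring, and partial attention outputs are merged with an online (streaming) softmax. The sketch below shows only that accumulation step on a single device, with my own names; the ring communication and causal masking that the repo actually implements are omitted.

```python
import torch

def blockwise_attention(q, k_blocks, v_blocks, scale):
    """Online-softmax accumulation behind Ring Attention (local sketch).

    In the real scheme each (k, v) block lives on a different device and
    rotates around a ring; here we simply iterate over the blocks.
    """
    out = torch.zeros_like(q)                      # running weighted sum of values
    lse = torch.full(q.shape[:-1], float("-inf"))  # running log-sum-exp per query

    for k, v in zip(k_blocks, v_blocks):
        scores = (q @ k.transpose(-2, -1)) * scale   # (..., q_len, blk_len)
        blk_lse = torch.logsumexp(scores, dim=-1)    # (..., q_len)
        blk_out = torch.softmax(scores, dim=-1) @ v

        # Rescale old and new partial outputs into a common normalizer.
        new_lse = torch.logaddexp(lse, blk_lse)
        out = out * torch.exp(lse - new_lse).unsqueeze(-1) \
            + blk_out * torch.exp(blk_lse - new_lse).unsqueeze(-1)
        lse = new_lse

    return out
```

Because each merge only rescales by a ratio of exponentials, the final result matches full softmax attention over the concatenated blocks, which is what lets the sequence length scale with the number of devices.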
textflint
Text Robustness Evaluation Platform
FastCkpt
Python package for rematerialization-aware gradient checkpointing
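For orientation, plain gradient checkpointing in PyTorch looks like the sketch below: a wrapped block discards its activations in the forward pass and recomputes them during backward, trading compute for memory. FastCkpt's contribution is choosing *what* to rematerialize (e.g., avoiding redundant recomputation around attention); that selection logic is the package's own and is not reproduced here.

```python
import torch
from torch.utils.checkpoint import checkpoint

class CheckpointedBlock(torch.nn.Module):
    """Recompute the wrapped block's activations in backward
    instead of storing them, reducing peak memory."""

    def __init__(self, block):
        super().__init__()
        self.block = block

    def forward(self, x):
        # use_reentrant=False is the recommended non-reentrant mode.
        return checkpoint(self.block, x, use_reentrant=False)
```

Typically every transformer layer is wrapped this way, so activation memory grows with the number of layers' inputs rather than their full intermediate states.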
LightSeq
Official repository for LightSeq: Sequence Level Parallelism for Distributed Training of Long Context Transformers
ring-flash-attention
Ring attention implementation with flash attention