Pinned Repositories
Round1
Environment and tasks for the first round of the General AI Challenge.
bitsandbytes
Library for 8-bit optimizers and quantization routines.
DeBERTa
The implementation of DeBERTa.
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
gptq
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
kaggle-trackML
Description of the #7 solution to the 2018 Kaggle TrackML competition.
nanoGPT
The simplest, fastest repository for training/finetuning medium-sized GPTs.
penzai
A JAX research toolkit for building, editing, and visualizing neural networks.
trianxy's Repositories
trianxy/kaggle-trackML
Description of the #7 solution to the 2018 Kaggle TrackML competition.
trianxy/bitsandbytes
Library for 8-bit optimizers and quantization routines.
trianxy/DeBERTa
The implementation of DeBERTa.
trianxy/Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
trianxy/gptq
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
trianxy/nanoGPT
The simplest, fastest repository for training/finetuning medium-sized GPTs.
trianxy/penzai
A JAX research toolkit for building, editing, and visualizing neural networks.