JetRunner
PhD @ UCSD; formerly @google-research, @huggingface, and Microsoft Research Asia.
San Diego, CA
Pinned Repositories
transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, supporting both inference and training (see the usage sketch after this list).
BERT-of-Theseus
⛵️ The official PyTorch implementation for "BERT-of-Theseus: Compressing BERT by Progressive Module Replacing" (EMNLP 2020).
beyond-preserved-accuracy
Repo for EMNLP 2021 paper "Beyond Preserved Accuracy: Evaluating Loyalty and Robustness of BERT Compression"
dogwhistle
Baseline code for NAACL 2021 paper "Blow the Dog Whistle: A Chinese Dataset for Cant Understanding with Common Sense and World Knowledge"
LaPraDoR
🦮 Code and pretrained models for Findings of ACL 2022 paper "LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrieval"
MetaDistil
Code for ACL 2022 paper "BERT Learns to Teach: Knowledge Distillation with Meta Learning".
PABEE
Code for the NeurIPS 2020 paper "BERT Loses Patience: Fast and Robust Inference with Early Exit".
SuperICL
Code for "Small Models are Valuable Plug-ins for Large Language Models"
TuPaTE
Code for EMNLP 2022 paper "Efficiently Tuned Parameters are Task Embeddings"
baize-chatbot
Let ChatGPT teach your own chatbot in hours with a single GPU!
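To show how two of the pinned projects fit together, here is a minimal sketch (not part of the original profile) that runs inference with 🤗 Transformers on the compressed BERT-of-Theseus checkpoint; the Hub id "canwenxu/BERT-of-Theseus-MNLI" is the checkpoint referenced in the BERT-of-Theseus README and should be treated as an assumption if it has since moved.

```python
# Minimal sketch: MNLI inference with 🤗 Transformers on a BERT-of-Theseus checkpoint.
# The Hub id below is assumed from the BERT-of-Theseus README; swap in any
# sequence-classification checkpoint if it is unavailable.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "canwenxu/BERT-of-Theseus-MNLI"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score a premise/hypothesis pair (MNLI-style natural language inference).
inputs = tokenizer(
    "A soccer game with multiple males playing.",
    "Some men are playing a sport.",
    return_tensors="pt",
)
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)  # probabilities over the MNLI labels
```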
JetRunner's Repositories
JetRunner/BERT-of-Theseus
⛵️ The official PyTorch implementation for "BERT-of-Theseus: Compressing BERT by Progressive Module Replacing" (EMNLP 2020).
JetRunner/SuperICL
Code for "Small Models are Valuable Plug-ins for Large Language Models"
JetRunner/MetaDistil
Code for ACL 2022 paper "BERT Learns to Teach: Knowledge Distillation with Meta Learning".
JetRunner/PABEE
Code for the NeurIPS 2020 paper "BERT Loses Patience: Fast and Robust Inference with Early Exit".
JetRunner/LaPraDoR
🦮 Code and pretrained models for Findings of ACL 2022 paper "LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrieval"
JetRunner/dogwhistle
Baseline code for NAACL 2021 paper "Blow the Dog Whistle: A Chinese Dataset for Cant Understanding with Common Sense and World Knowledge"
JetRunner/beyond-preserved-accuracy
Repo for EMNLP 2021 paper "Beyond Preserved Accuracy: Evaluating Loyalty and Robustness of BERT Compression"
JetRunner/TuPaTE
Code for EMNLP 2022 paper "Efficiently Tuned Parameters are Task Embeddings"
JetRunner/dope-score-chinese
Automatic metric for evaluating rap lyrics in Chinese (Mandarin).
JetRunner/unihan-lm
The official repository for "UnihanLM: Coarse-to-Fine Chinese-Japanese Language Model Pretraining with the Unihan Database", AACL-IJCNLP 2020
JetRunner/alpaca_eval
A validated automatic evaluator for instruction-following language models. High-quality, cheap, and fast.
JetRunner/PaSST-EE
JetRunner/pytorch-apex-docker
Up-to-date Dockerfile for PyTorch + apex
JetRunner/acl-anthology
Data and software for building the ACL Anthology.
JetRunner/alpa
Auto parallelization for large-scale neural networks
JetRunner/Awesome
💻 🎉 An awesome & curated list of the best applications and tools for Windows.
JetRunner/awesome-1
😎 Awesome lists about all kinds of interesting topics
JetRunner/beir
A Heterogeneous Benchmark for Information Retrieval. Easy to use; evaluate your models across 15+ diverse IR datasets.
JetRunner/chatgpt-google-extension
A browser extension that enhances search engines with ChatGPT
JetRunner/ChineseGLUE
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
JetRunner/CodeBERT
CodeBERT
JetRunner/FastChat
The release repo for "Vicuna: An Open Chatbot Impressing GPT-4"
JetRunner/flood-nlp
Testing flooding in NLP with pretrained models. A course project for UCSD CSE 257.
JetRunner/l0-tune
JetRunner/mosesdecoder
Moses, the machine translation system
JetRunner/nlp
🤗nlp – Datasets and evaluation metrics for Natural Language Processing in NumPy, Pandas, PyTorch and TensorFlow
JetRunner/promptsource
Toolkit for collecting and applying templates of prompting instances
JetRunner/Sequence_Span_Rewriting
Code for EMNLP 2021 paper "Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting"
JetRunner/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
JetRunner/zuco-nlp
All NLP experiments described in the ACL 2019 paper.