Pinned Repositories
extreme-bert
ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper “ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized BERT”.
LMFlow
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
active-prompt
Source code for the paper "Active Prompting with Chain-of-Thought for Large Language Models"
awesome-domain-adaptation-NLP
Domain adaptation in NLP
Black-Box-Prompt-Learning
Source code for the TMLR paper "Black-Box Prompt Learning for Pre-trained Language Models"
ChatGPTPapers
Must-read papers, related blogs and API tools on the pre-training and tuning methods for ChatGPT.
DaVinci
Source code for the paper "Prefix Language Models are Unified Modal Learners"
R-Tuning
Source code for the NAACL 2024 paper entitled "R-Tuning: Instructing Large Language Models to Say 'I Don't Know'"
T-DNA
Source code for the ACL-IJCNLP 2021 paper entitled "T-DNA: Taming Pre-trained Language Models with N-gram Representations for Low-Resource Domain Adaptation" by Shizhe Diao et al.
ZEN
A BERT-based Chinese Text Encoder Enhanced by N-gram Representations
shizhediao's Repositories
shizhediao/ChatGPTPapers
Must-read papers, related blogs and API tools on the pre-training and tuning methods for ChatGPT.
shizhediao/active-prompt
Source code for the paper "Active Prompting with Chain-of-Thought for Large Language Models"
shizhediao/R-Tuning
Source code for the NAACL 2024 paper entitled "R-Tuning: Instructing Large Language Models to Say 'I Don't Know'"
shizhediao/Black-Box-Prompt-Learning
Source code for the TMLR paper "Black-Box Prompt Learning for Pre-trained Language Models"
shizhediao/DaVinci
Source code for the paper "Prefix Language Models are Unified Modal Learners"
shizhediao/automate-cot
Source code for the paper "Automatic Prompt Augmentation and Selection with Chain-of-Thought from Labeled Data"
shizhediao/T-DNA
Source code for the ACL-IJCNLP 2021 paper entitled "T-DNA: Taming Pre-trained Language Models with N-gram Representations for Low-Resource Domain Adaptation" by Shizhe Diao et al.
shizhediao/HashTation
Source code for the paper "Hashtag-Guided Low-Resource Tweet Classification"
shizhediao/Doolittle
Source code for the EMNLP 2023 paper entitled "Doolittle: Benchmarks and Corpora for Academic Writing Formalization" by Shizhe Diao et al.
shizhediao/PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
shizhediao/Awesome-Chinese-LLM
A curated collection of open-source Chinese large language models, focusing on smaller models that can be privately deployed at low training cost, including base models, domain-specific fine-tuning and applications, datasets, and tutorials.
shizhediao/Chain-of-ThoughtsPapers
A trend starts from "Chain of Thought Prompting Elicits Reasoning in Large Language Models".
shizhediao/directional-preference-alignment
Directional Preference Alignment
shizhediao/LLMs-In-China
Large models in China
shizhediao/openai-cookbook
Examples and guides for using the OpenAI API
shizhediao/trl
Train transformer language models with reinforcement learning.
shizhediao/awesome-ChatGPT-resource-zh
A curated list of resources for OpenAI's [ChatGPT](https://chat.openai.com), tracking the latest resources and adding related Chinese-language work
shizhediao/awesome-RLHF
A curated list of reinforcement learning with human feedback resources (continually updated)
shizhediao/bolei_awesome_posters
CVPR and NeurIPS poster examples and templates. May we have in-person poster sessions again soon!
shizhediao/CCLM
Cross-View Language Modeling: Towards Unified Cross-Lingual Cross-Modal Pre-training
shizhediao/chatgpt-clone
This app is a ChatGPT clone with DALL·E, using OpenAI's text-davinci-003 and image-generation models
shizhediao/ChatGPT-Next-Web
A well-designed cross-platform ChatGPT UI (Web / PWA / Linux / Win / MacOS). Deploy your own cross-platform ChatGPT app with one click.
shizhediao/CoOp
Prompt Learning for Vision-Language Models (IJCV'22, CVPR'22)
shizhediao/Instruction-Tuning-Papers
Reading list on instruction tuning. A trend starts from Natural-Instructions (ACL 2022), FLAN (ICLR 2022), and T0 (ICLR 2022).
shizhediao/keyphrase-generation-multigrain-attention
Keyphrase Generation with Cross-Document Attention
shizhediao/MiniGPT-4
MiniGPT-4: Enhancing Vision-language Understanding with Advanced Large Language Models
shizhediao/natural-instructions
Expanding natural instructions
shizhediao/Paper-Picture-Writing-Code
Paper Picture Writing Code
shizhediao/shizhediao
shizhediao/X-VLM
X-VLM: Multi-Grained Vision Language Pre-Training