dzunglt24's Stars
microsoft/binder
NVIDIA/FasterTransformer
Transformer-related optimizations, including BERT and GPT
yizhongw/self-instruct
Aligning pretrained language models with instruction data generated by themselves.
hpcaitech/ColossalAI
Making large AI models cheaper, faster and more accessible
google-research/deduplicate-text-datasets
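This repository deduplicates training corpora by finding exact repeated substrings with a suffix array. As a rough illustration of the underlying idea only (a hash-based n-gram check, not the repo's suffix-array implementation; the function names and the n=8 window are my own choices):

```python
def ngrams(text, n=8):
    """Return the set of all character n-grams of `text` (n=8 is illustrative)."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def has_duplicate_span(doc, corpus_ngrams, n=8):
    """True if `doc` shares any length-n substring with the indexed corpus."""
    return any(g in corpus_ngrams for g in ngrams(doc, n))

# Index a tiny "corpus", then check a new document against it.
corpus = ["the quick brown fox jumps over the lazy dog"]
index = set()
for d in corpus:
    index |= ngrams(d)

print(has_duplicate_span("a quick brown fox appeared", index))  # prints True
```

The real repo works at corpus scale, where a suffix array finds all repeated substrings above a length threshold far more efficiently than set lookups.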
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
XiangLi1999/PrefixTuning
Prefix-Tuning: Optimizing Continuous Prompts for Generation
mosaicml/composer
Supercharge Your Model Training
stanford-crfm/BioMedLM
karpathy/minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
dqxiu/ICL_PaperList
Paper List for In-context Learning 🌷
Dao-AILab/flash-attention
Fast and memory-efficient exact attention
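FlashAttention computes exact attention in tiles so the full score matrix never materializes, which depends on an online (streaming) softmax that rescales running statistics as new scores arrive. A pure-Python sketch of that rescaling identity, scalar-valued for clarity (illustrative only, not the library's CUDA kernels):

```python
import math

def online_softmax_weighted_sum(scores, values):
    """Stream over (score, value) pairs, keeping a running max `m`,
    normalizer `l`, and weighted accumulator `acc`; whenever the max
    grows, the old statistics are rescaled by exp(m_old - m_new).
    This is the per-tile rescaling trick FlashAttention relies on."""
    m, l, acc = float("-inf"), 0.0, 0.0
    for s, v in zip(scores, values):
        m_new = max(m, s)
        scale = math.exp(m - m_new)  # exp(-inf) == 0.0 on the first step
        l = l * scale + math.exp(s - m_new)
        acc = acc * scale + math.exp(s - m_new) * v
        m = m_new
    return acc / l

# Agrees with the naive two-pass softmax-weighted sum.
scores, values = [0.5, 2.0, -1.0], [1.0, 2.0, 3.0]
naive = (sum(math.exp(s) * v for s, v in zip(scores, values))
         / sum(map(math.exp, scores)))
print(abs(online_softmax_weighted_sum(scores, values) - naive) < 1e-12)
```

In the actual kernels the same update runs per tile over vectors of keys and values, so memory stays linear in sequence length while the result remains exact.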
RUCAIBox/TextBox
TextBox 2.0 is a text generation library with pre-trained language models
JonasGeiping/cramming
Cramming the training of a (BERT-type) language model into limited compute.
microsoft/torchscale
Foundation Architecture for (M)LLMs
shawroad/NER-Pytorch
ljynlp/W2NER
Source code for AAAI 2022 paper: Unified Named Entity Recognition as Word-Word Relation Classification
ai-systems/nli4ct
google-research/prompt-tuning
Original implementation of Prompt Tuning from Lester et al., 2021
NielsRogge/Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
microsoft/mup
maximal update parametrization (µP)
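µP prescribes width-dependent scalings of initialization, multipliers, and learning rates so that hyperparameters tuned on a narrow model transfer to wider ones. A toy sketch of the kind of rules involved (a simplified reading with made-up function names, not the `mup` package's API; consult the repo for the actual parametrization):

```python
def mup_scalings(base_width, width, base_lr=1e-3):
    """Illustrative µP-style rules: when the model is `mult` times wider
    than the base, scale the hidden-layer Adam learning rate by 1/mult
    and multiply the output logits by 1/mult. Simplified sketch only."""
    mult = width / base_width
    return {
        "width_mult": mult,
        "hidden_adam_lr": base_lr / mult,
        "output_mult": 1.0 / mult,
    }

print(mup_scalings(256, 1024))  # 4x wider -> 4x smaller hidden LR
```

The point of the parametrization is that the tuned `base_lr` found at `base_width` remains near-optimal at any larger width.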
ovbystrova/InstructionNER
Unofficial implementation of paper "InstructionNER: A Multi-Task Instruction-Based Generative Framework for Few-shot NER" (https://arxiv.org/pdf/2203.03903v1.pdf)
tonyzhaozh/few-shot-learning
Few-shot learning with GPT-3
dvanoni/notero
A Zotero plugin for syncing items and notes into Notion
chenghuige/U.S.-Patent-Phrase-to-Phrase-Matching
1st place solution
microsoft/CodeBERT
CodeBERT
suicao/ai4code-baseline
Early solution for Google AI4Code competition
INK-USC/fewNER
Good Examples Make A Faster Learner: Simple Demonstration-based Learning for Low-resource NER (ACL 2022)
michaelgutmann/ml-pen-and-paper-exercises
Pen and paper exercises in machine learning
roboticcam/machine-learning-notes
My continuously updated Machine Learning, Probabilistic Models and Deep Learning notes and demos (2000+ slides), with video links