Taoer1996's Stars
BryceZhuo/PolyCom
The official implementation of the paper "Polynomial Composition Activations: Unleashing the Dynamics of Large Language Models".
QwenLM/Qwen2.5-Coder
Qwen2.5-Coder is the code version of Qwen2.5, the large language model series developed by Qwen team, Alibaba Cloud.
ICT-GoKnow/KnowCoder
Official repo of the paper "KnowCoder: Coding Structured Knowledge into LLMs for Universal Information Extraction", which proposes KnowCoder, a large language model for universal information extraction.
HuangOwen/Awesome-LLM-Compression
Awesome LLM compression research papers and tools.
nebuly-ai/optimate
A collection of libraries to optimize AI model performance.
howl-anderson/unlocking-the-power-of-llms
Using Prompts and Chains to turn ChatGPT into an amazing productivity tool! Unlocking the power of LLMs.
google-research/tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
karpathy/minGPT
A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
massCodeIO/massCode
A free and open-source code snippet manager for developers.
thunlp/PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
lcylmhlcy/Awesome-algorithm-interview
Interview questions and reference materials for algorithm engineers (AI, computer vision track).
tomohideshibata/BERT-related-papers
BERT-related papers
clarkkev/deep-coref
lucidrains/reformer-pytorch
Reformer, the efficient Transformer, in PyTorch.
huggingface/tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
CoreyMSchafer/code_snippets
nusr/hacker-laws-zh
💻📖 Laws, theories, principles, and patterns that developers will find useful.
TrickyGo/Dive-into-DL-TensorFlow2.0
This project reimplements the original MXNet code from the book Dive into Deep Learning in TensorFlow 2.0; the project has been endorsed by Mu Li (李沐).
huawei-noah/Pretrained-Language-Model
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
sebastianruder/NLP-progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
heucoder/dimensionality_reduction_alo_codes
Feature extraction / dimensionality reduction: Python implementations of PCA, LDA, MDS, LLE, t-SNE, and other dimensionality-reduction algorithms.
google-research/google-research
Google Research
openai/gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
microsoft/PowerToys
Windows system utilities to maximize productivity
google-research/text-to-text-transfer-transformer
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
perklet/reverse-interview-zh
Questions to ask the interviewer at the end of a technical interview.
brightmart/albert_zh
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pretrained ALBERT models.
junyanz/pytorch-CycleGAN-and-pix2pix
Image-to-Image Translation in PyTorch
msgi/nlp-journey
Documents, papers, and code related to Natural Language Processing, including Topic Models, Word Embeddings, Named Entity Recognition, Text Classification, Text Generation, Text Similarity, Machine Translation, etc.
halo-dev/halo
A powerful and easy-to-use open-source website-building tool.