zyz0000
Research Interests: Named Entity Recognition, Relation Extraction, Event Extraction, Multimodal Machine Learning
Pinned Repositories
-baseline
A baseline for domain event detection under high-robustness requirements, with the task cast as NER. A hypothetical conversion sketch follows below.
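Casting event detection as NER typically means tagging trigger tokens with BIO labels per event type. A minimal sketch of that conversion, assuming a simple token/offset representation (the tokens, offsets, and event type below are hypothetical, not taken from the repo):

```python
# Illustrative only: convert event-trigger annotations into BIO tags so that
# event detection can be trained as a sequence-labeling (NER-style) task.
def to_bio_tags(tokens, triggers):
    """triggers: list of (start_idx, end_idx_exclusive, event_type)."""
    tags = ["O"] * len(tokens)
    for start, end, etype in triggers:
        tags[start] = f"B-{etype}"          # first trigger token
        for i in range(start + 1, end):
            tags[i] = f"I-{etype}"          # remaining trigger tokens
    return tags

tokens = ["The", "company", "announced", "a", "merger", "yesterday"]
print(to_bio_tags(tokens, [(4, 5, "Merger")]))
# ['O', 'O', 'O', 'O', 'B-Merger', 'O']
```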
2019-BDCI-FinancialEntityDiscovery
2019 BDCI Internet finance new-entity discovery.
3D-Reconstruction-with-Deep-Learning-Methods
List of projects for 3D reconstruction.
ACE2005_preprocessing
Alpaca-CoT
We unified the interfaces of instruction-tuning data (e.g., CoT data, still being expanded), multiple LLMs, and parameter-efficient methods (e.g., LoRA, P-Tuning) for easy use, providing a convenient LLM-IFT research platform. The tabular_llm branch builds an LLM for tabular intelligence tasks.
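As one concrete example of the parameter-efficient methods mentioned above, here is a minimal, illustrative LoRA setup using the Hugging Face peft library; the base checkpoint name and target_modules are placeholders that depend on the model architecture, and this is not code from the repository:

```python
# Illustrative sketch: wrap a causal LM with LoRA adapters via peft.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForCausalLM.from_pretrained("your-base-model")  # placeholder checkpoint
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # low-rank dimension
    lora_alpha=16,                         # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt (architecture-dependent)
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()         # only adapter weights are trainable
```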
CS-Base
Illustrated computer networks, operating systems, computer organization, and databases: 1,000 diagrams and 500,000 words that demystify computer-science fundamentals and take the pain out of rote interview questions. 🚀 Read online: https://xiaolincoding.com
FinBERT-MRC
RAPS
TFCox
A package for survival analysis using deep learning methods.
tianshou
An elegant, flexible, and superfast PyTorch deep Reinforcement Learning platform.
zyz0000's Repositories
zyz0000/FinBERT-MRC
zyz0000/CS-Base
Illustrated computer networks, operating systems, computer organization, and databases: 1,000 diagrams and 500,000 words that demystify computer-science fundamentals and take the pain out of rote interview questions. 🚀 Read online: https://xiaolincoding.com
zyz0000/Awesome-Language-Model-on-Graphs
A curated list of papers and resources based on "Large Language Models on Graphs: A Comprehensive Survey".
zyz0000/Awesome-LLM-Inference
💻 A small collection of Awesome LLM Inference resources [Papers|Blogs|Docs] with code, covering TensorRT-LLM, streaming-llm, SmoothQuant, WINT8/4, continuous batching, FlashAttention, PagedAttention, etc.
zyz0000/Awesome-LLM-RAG-Application
Resources on building LLM applications with the RAG (retrieval-augmented generation) pattern.
zyz0000/Awesome-LLM-Safety
A curated list of security-related papers, articles, and resources focused on Large Language Models (LLMs). This repository aims to provide researchers, practitioners, and enthusiasts with insights into the security implications, challenges, and advancements surrounding these powerful models.
zyz0000/Awesome-LLMs-in-Graph-tasks
A curated collection of research papers exploring the utilization of LLMs for graph-related tasks.
zyz0000/Awesome-Quantization-Papers
List of papers related to neural network quantization in recent AI conferences and journals.
zyz0000/exllama
A more memory-efficient rewrite of the HF transformers implementation of Llama for use with quantized weights.
zyz0000/FastChat
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
zyz0000/Langchain-Chatchat
Langchain-Chatchat (formerly Langchain-ChatGLM): a local knowledge-base question-answering application built with LangChain and LLMs such as ChatGLM.
zyz0000/langchain-tutorials
A set of LangChain tutorials from my YouTube channel.
zyz0000/LLaMA-Factory
Easy-to-use LLM fine-tuning framework (LLaMA, BLOOM, Mistral, Baichuan, Qwen, ChatGLM)
zyz0000/Llama2-Chinese
Llama Chinese community: the best Chinese Llama LLMs, fully open source and commercially usable.
zyz0000/llm-course
Course with a roadmap and notebooks to get into Large Language Models (LLMs).
zyz0000/LLM-Travel
Welcome to the "LLM-Travel" repository! Explore the mysteries of large language models (LLMs) 🚀, with in-depth understanding, discussion, and implementation of LLM-related techniques, principles, and applications.
zyz0000/llm-universe
An LLM application development tutorial for beginner developers. Read online: https://datawhalechina.github.io/llm-universe/
zyz0000/LLMs
Focused on Chinese large language models: training a strong Chinese base model, instruction fine-tuning, reinforcement learning from human feedback, and data collection, cleaning, and mixture ratios.
zyz0000/LLMs_interview_notes
A collection of interview questions for large language model (LLM) algorithm engineers.
zyz0000/long-llms-learning
A repository sharing the literature on long-context large language models, including methodologies and evaluation benchmarks.
zyz0000/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
zyz0000/MedicalGPT
MedicalGPT: Training Your Own Medical GPT Model with a ChatGPT-style Training Pipeline. Trains medical LLMs, implementing incremental pretraining, supervised fine-tuning, RLHF (reward modeling and reinforcement-learning training), and DPO (direct preference optimization).
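The DPO stage mentioned above optimizes a preference objective directly from chosen/rejected response pairs. A minimal, illustrative PyTorch version of that loss on precomputed per-sequence log-probabilities (a sketch of the standard DPO formula, not code from the MedicalGPT repository):

```python
# Illustrative DPO loss: -log sigmoid(beta * ((log pi - log pi_ref) on chosen
#                                            - (log pi - log pi_ref) on rejected))
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """All inputs are summed per-sequence log-probabilities, shape (batch,)."""
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Maximize the margin between preferred and dispreferred responses.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()

loss = dpo_loss(torch.tensor([-12.0]), torch.tensor([-15.0]),
                torch.tensor([-13.0]), torch.tensor([-14.0]))
print(loss)
```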
zyz0000/MFTCoder
High-accuracy and high-efficiency multi-task fine-tuning framework for code LLMs.
zyz0000/Prompt-Engineering-Guide
🐙 Guides, papers, lectures, notebooks, and resources for prompt engineering.
zyz0000/promptsource
Toolkit for creating, sharing and using natural language prompts.
zyz0000/Skywork
Skywork series models are pre-trained on 3.2TB of high-quality multilingual (mainly Chinese and English) and code data. We have open-sourced the model weights, training data, evaluation data, evaluation methods, etc.
zyz0000/ToG
zyz0000/transformers-code
A hands-on Huggingface Transformers course; the accompanying videos are updated in sync on Bilibili and YouTube.
zyz0000/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning, training, and prompt-engineering examples. A bonus section covers ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E, including jump-starting GPT-4, speech-to-text, text-to-speech, text-to-image generation with DALL-E, Google Cloud AI, HuggingGPT, and more.
zyz0000/transformers_tasks
⭐️ NLP algorithms with the transformers library, supporting text classification, text generation, information extraction, text matching, RLHF, SFT, etc.
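For the tasks listed above, the transformers pipeline API is the usual entry point. A minimal, illustrative example (the checkpoints are library defaults or well-known public models, not models from this repository):

```python
# Illustrative only: run two of the listed task types with transformers pipelines.
from transformers import pipeline

# Text classification with the library's default sentiment model.
classifier = pipeline("text-classification")
print(classifier("This framework makes information extraction much easier."))

# Text generation with a small public checkpoint.
generator = pipeline("text-generation", model="gpt2")
print(generator("Relation extraction is the task of", max_new_tokens=20))
```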