Pinned Repositories
awesome-courses
:books: List of awesome university courses for learning Computer Science!
Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014–2021).
Awesome-Knowledge-Distillation-of-LLMs
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
dive-into-llms
A series of hands-on programming tutorials accompanying "Dive into LLMs" (《动手学大模型》).
EasyEdit
An Easy-to-use Knowledge Editing Framework for LLMs.
IterDE
[AAAI 2023] IterDE: An Iterative Knowledge Distillation Framework for Knowledge Graph Embeddings
PtCoding
A starter framework for quickly writing PyTorch code.
FastKGE
[IJCAI 2024] Fast and Continual Knowledge Graph Embedding via Incremental LoRA
IncDE
[AAAI 2024] Towards Continual Knowledge Graph Embedding via Incremental Distillation
IterDE
[AAAI 2023] IterDE: An Iterative Knowledge Distillation Framework for Knowledge Graph Embeddings
ljj-007's Repositories
ljj-007/IterDE
[AAAI 2023] IterDE: An Iterative Knowledge Distillation Framework for Knowledge Graph Embeddings
ljj-007/PtCoding
A starter framework for quickly writing PyTorch code.
ljj-007/awesome-courses
:books: List of awesome university courses for learning Computer Science!
ljj-007/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014–2021).
ljj-007/Awesome-Knowledge-Distillation-of-LLMs
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
ljj-007/dive-into-llms
A series of hands-on programming tutorials accompanying "Dive into LLMs" (《动手学大模型》).
ljj-007/EasyEdit
An Easy-to-use Knowledge Editing Framework for LLMs.
ljj-007/KGE
Some papers on Knowledge Graph Embedding(KGE)
ljj-007/knowledge-distillation-papers
knowledge distillation papers
ljj-007/OpenKE
An Open-Source Package for Knowledge Embedding (KE)
ljj-007/ljj-007
ljj-007/ljj-007.github.io
AcadHomepage: A Modern and Responsive Academic Personal Homepage
ljj-007/RepDistiller
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods