Pinned Repositories
AAAAAAAyq.github.io
Displays a comparison of an original model and its pruned counterpart across different topics and questions
alpaca-lora
Instruct-tune LLaMA on consumer hardware
An-Website
Fourth iteration of my personal website built with Gatsby
booksource
Full source code for the book 《第一行代码 第2版》 (First Line of Code, 2nd Edition)
deit
Official DeiT repository
first-pr
Exercise repository for the book 《GitHub实践入门》 (Getting Started with GitHub)
hub
A library for transfer learning by reusing parts of TensorFlow models.
LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
MAI-2021-Workshop
MAI 2021 Workshop
FastSAM
Fast Segment Anything
an-yongqi's Repositories
an-yongqi/AAAAAAAyq.github.io
Displays a comparison of an original model and its pruned counterpart across different topics and questions
an-yongqi/alpaca-lora
Instruct-tune LLaMA on consumer hardware
an-yongqi/An-Website
Fourth iteration of my personal website built with Gatsby
an-yongqi/booksource
Full source code for the book 《第一行代码 第2版》 (First Line of Code, 2nd Edition)
an-yongqi/deit
Official DeiT repository
an-yongqi/first-pr
Exercise repository for the book 《GitHub实践入门》 (Getting Started with GitHub)
an-yongqi/hub
A library for transfer learning by reusing parts of TensorFlow models.
an-yongqi/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
an-yongqi/MAI-2021-Workshop
MAI 2021 Workshop
an-yongqi/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
an-yongqi/Personal-Studying
an-yongqi/SAM
Code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks showing how to use the model
an-yongqi/sparsegpt
Code for the ICML 2023 paper "SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot".
an-yongqi/wanda
A simple and effective LLM pruning approach.