small-models
There are 9 repositories under the small-models topic.
SqueezeAILab/SqueezeLLM
[ICML 2024] SqueezeLLM: Dense-and-Sparse Quantization
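The core idea, a dense-and-sparse decomposition of the weight matrix, can be illustrated with a short sketch. This is a simplified stand-in using uniform quantization; the repository itself uses sensitivity-based non-uniform quantization, and the function names below are hypothetical.

```python
# Illustrative dense-plus-sparse weight decomposition (not the repository's
# actual implementation). Outliers stay in full precision; the remainder is
# uniformly quantized to 4 bits with per-row scales.
import numpy as np

def dense_and_sparse_quantize(w: np.ndarray, outlier_frac: float = 0.005, bits: int = 4):
    """Split w into a sparse full-precision outlier matrix and a 4-bit dense remainder."""
    threshold = np.quantile(np.abs(w), 1.0 - outlier_frac)
    outlier_mask = np.abs(w) >= threshold
    sparse = np.where(outlier_mask, w, 0.0)   # outliers kept in full precision
    dense = np.where(outlier_mask, 0.0, w)    # everything else gets quantized

    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(dense).max(axis=1, keepdims=True) / qmax
    scale = np.where(scale == 0, 1.0, scale)
    q = np.clip(np.round(dense / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale, sparse

def dequantize(q, scale, sparse):
    return q.astype(np.float32) * scale + sparse

w = np.random.randn(256, 256).astype(np.float32)
q, s, sp = dense_and_sparse_quantize(w)
print("mean reconstruction error:", np.abs(dequantize(q, s, sp) - w).mean())
```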
SqueezeAILab/KVQuant
[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization
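A rough sketch of per-axis KV-cache quantization, loosely in the spirit of this work: keys are quantized with per-channel scales and values with per-token scales. Outlier handling and the repository's actual kernels are omitted, and the helper name is hypothetical.

```python
# Simplified 4-bit KV-cache quantization: per-channel scales for keys,
# per-token scales for values. Not the repository's actual code.
import numpy as np

def quantize_along(x: np.ndarray, axis: int, bits: int = 4):
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(x).max(axis=axis, keepdims=True) / qmax
    scale = np.where(scale == 0, 1.0, scale)
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

tokens, head_dim = 1024, 128
keys = np.random.randn(tokens, head_dim).astype(np.float32)
values = np.random.randn(tokens, head_dim).astype(np.float32)

qk, sk = quantize_along(keys, axis=0)    # per-channel scales for keys
qv, sv = quantize_along(values, axis=1)  # per-token scales for values

print("key error:", np.abs(qk * sk - keys).mean())
print("value error:", np.abs(qv * sv - values).mean())
```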
aitomatic/openssa
OpenSSA: Small Specialist Agents based on the Domain-Aware Neurosymbolic Agent (DANA) architecture for industrial problem-solving
MCG-NJU/AMD
[CVPR 2024] Asymmetric Masked Distillation for Pre-Training Small Foundation Models
logic-OT/Decoder-Only-LLM
This repository features a custom-built decoder-only language model (LLM) with a total of 37 million parameters 🔥. The model is trained to ask questions from a given context.
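A minimal decoder-only Transformer sketch in PyTorch along these lines; the hyperparameters are illustrative and not taken from the repository, and the class name is hypothetical.

```python
# Tiny GPT-style decoder-only language model: token + position embeddings,
# a stack of Transformer blocks with a causal mask, and a next-token head.
import torch
import torch.nn as nn

class TinyDecoderLM(nn.Module):
    def __init__(self, vocab_size=32000, d_model=384, n_heads=6, n_layers=6, max_len=512):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, ids):
        seq_len = ids.size(1)
        x = self.tok(ids) + self.pos(torch.arange(seq_len, device=ids.device))
        causal = nn.Transformer.generate_square_subsequent_mask(seq_len).to(ids.device)
        x = self.blocks(x, mask=causal)   # causal mask gives decoder-only behaviour
        return self.lm_head(x)            # next-token logits

model = TinyDecoderLM()
print(sum(p.numel() for p in model.parameters()) / 1e6, "M parameters")
```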
zhangyifei01/Awesome-Self-supervised-Learning-of-Tiny-Models
Overview of self-supervised learning of tiny models, including distillation-based methods (a.k.a. self-supervised distillation) and non-distillation methods.
sfarhat/dapt
Code for "On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models"
ENSTA-U2IS-AI/optuMNIST
Help us define the Pareto front of small models for MNIST classification. Frugal AI.
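A hypothetical entry for such an accuracy-versus-size study: a tiny MNIST CNN with its parameter count, which is one axis of the Pareto front. The architecture is illustrative, not taken from the repository.

```python
# Small MNIST classifier and its parameter count (the model-size axis of the
# accuracy-vs-size trade-off).
import torch.nn as nn

tiny_mnist = nn.Sequential(
    nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 28x28 -> 14x14
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),
)
print(sum(p.numel() for p in tiny_mnist.parameters()), "parameters")
```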
antonio-f/Phi-3-Vision
Phi-3-Vision model test - running locally
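A minimal local-inference sketch with Hugging Face transformers, assuming the microsoft/Phi-3-vision-128k-instruct checkpoint; the prompt template and processor call should be double-checked against the model card, and the image path is a placeholder.

```python
# Load Phi-3-Vision locally via transformers (requires trust_remote_code for
# the custom model code; device_map="auto" needs accelerate installed).
import torch
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "microsoft/Phi-3-vision-128k-instruct"  # assumed checkpoint id
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto", trust_remote_code=True
)
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)

image = Image.open("example.jpg")  # placeholder image path
prompt = "<|user|>\n<|image_1|>\nDescribe this image.<|end|>\n<|assistant|>\n"
inputs = processor(prompt, [image], return_tensors="pt").to(model.device)

out = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(out, skip_special_tokens=True)[0])
```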