Pinned Repositories
LLaMA-BitNet
LLaMA-BitNet is a repository that lets users train their own BitNet models built on the LLaMA 2 model, inspired by the paper "The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits".
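As a rough illustration of the 1.58-bit idea referenced above (a sketch of the paper's absmean quantization, not code from this repository), each weight is scaled by the mean absolute value of the weight matrix and rounded to the ternary set {-1, 0, +1}:

```python
import numpy as np

def quantize_ternary(w, eps=1e-8):
    """Absmean ternary quantization, as described in the BitNet b1.58 paper."""
    gamma = np.mean(np.abs(w)) + eps   # per-matrix scale (eps avoids division by zero)
    q = np.clip(np.round(w / gamma), -1, 1)  # round to {-1, 0, +1}
    return q, gamma

w = np.array([[0.4, -1.2, 0.05], [2.0, -0.3, 0.9]])
q, gamma = quantize_ternary(w)
# q contains only values in {-1.0, 0.0, 1.0}
```

Three weight states give log2(3) ≈ 1.58 bits per weight, which is where the paper's title comes from.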
lit-llama
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
sc2-benchmark
[TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
supervised-compression
[WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
andreamigliorati's Repositories
andreamigliorati doesn't have any repositories yet.