ilyalasy's Stars
Hannibal046/xRAG
[NeurIPS 2024] Source code for xRAG: Extreme Context Compression for Retrieval-augmented Generation with One Token
ArthurConmy/Automatic-Circuit-Discovery
pHaeusler/tinycatstories
ahans30/goldfish-loss
[NeurIPS 2024] Goldfish Loss: Mitigating Memorization in Generative LLMs
Adamdad/KnowledgeFactor
[ECCV2022] Factorizing Knowledge in Neural Networks
pratyushasharma/laser
The Truth Is In There: Improving Reasoning in Language Models with Layer-Selective Rank Reduction
togethercomputer/MoA
Together Mixture-Of-Agents (MoA) – 65.1% on AlpacaEval with OSS models
ItzCrazyKns/Perplexica
Perplexica is an AI-powered search engine and an open-source alternative to Perplexity AI
openai/automated-interpretability
OSU-NLP-Group/GrokkedTransformer
Code for NeurIPS'24 paper 'Grokked Transformers are Implicit Reasoners: A Mechanistic Journey to the Edge of Generalization'
apartresearch/Neuron2Graph
Tools for exploring Transformer neuron behaviour, including input pruning and diversification.
flok/pydualsense
Control your DualSense controller with Python
EleutherAI/sae
Sparse autoencoders
zjunlp/AutoKG
LLMs for Knowledge Graph Construction and Reasoning: Recent Capabilities and Future Opportunities
taufeeque9/codebook-features
Sparse and discrete interpretability tool for neural networks
SJTU-IPADS/PowerInfer
High-speed Large Language Model Serving on PCs with Consumer-grade GPUs
neelnanda-io/1L-Sparse-Autoencoder
jbloomAus/SAELens
Training Sparse Autoencoders on Language Models
laekov/fastmoe
A fast MoE implementation for PyTorch
Doubiiu/ToonCrafter
[SIGGRAPH Asia 2024, Journal Track] ToonCrafter: Generative Cartoon Interpolation
RobertCsordas/moeut
pengHTYX/Era3D
huggingface/trl
Train transformer language models with reinforcement learning.
ndif-team/nnsight
The nnsight package enables interpreting and manipulating the internals of deep learning models.
saprmarks/dictionary_learning
saprmarks/feature-circuits
vllm-project/vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
EleutherAI/lm-evaluation-harness
A framework for few-shot evaluation of language models.
bigscience-workshop/lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
mlech26l/ncps
PyTorch and TensorFlow implementation of NCP, LTC, and CfC wired neural models
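Several of the starred repos (EleutherAI/sae, jbloomAus/SAELens, saprmarks/dictionary_learning) implement sparse autoencoders for interpretability. As a rough orientation, here is a minimal NumPy sketch of the core idea: an overcomplete ReLU encoder plus a linear decoder, trained against reconstruction error with an L1 sparsity penalty. All names and hyperparameters below are illustrative, not taken from any of these libraries, which add tied/normalized decoders, L1 schedules, and dead-feature resampling on top of this skeleton.

```python
import numpy as np

# Illustrative sparse-autoencoder forward pass (hypothetical dimensions).
rng = np.random.default_rng(0)

d_model, d_hidden = 16, 64          # hidden dictionary is overcomplete
W_enc = rng.normal(0, 0.1, (d_model, d_hidden))
b_enc = np.zeros(d_hidden)
W_dec = rng.normal(0, 0.1, (d_hidden, d_model))
b_dec = np.zeros(d_model)

def encode(x):
    # ReLU encoder: most features land at exactly zero -> sparse code
    return np.maximum(x @ W_enc + b_enc, 0.0)

def decode(h):
    # Linear decoder reconstructs the activation from active features
    return h @ W_dec + b_dec

x = rng.normal(size=(8, d_model))   # stand-in for model activations
h = encode(x)
x_hat = decode(h)

recon_err = np.mean((x - x_hat) ** 2)   # reconstruction term
l1 = np.abs(h).mean()                   # sparsity penalty term
loss = recon_err + 1e-3 * l1            # combined training objective
```

In the real libraries this objective is minimized over large batches of residual-stream activations, and the learned dictionary rows are then inspected as candidate interpretable features.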