avocardio's Stars
lucidrains/vit-pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
karpathy/llama2.c
Inference Llama 2 in one file of pure C
primer/css
The CSS design system that powers GitHub
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features from various papers
facebookresearch/jepa
PyTorch code and models for V-JEPA self-supervised learning from video.
lucidrains/vector-quantize-pytorch
Vector (and Scalar) Quantization, in PyTorch
meagmohit/EEG-Datasets
A list of all public EEG-datasets
neuraloperator/neuraloperator
Learning in infinite dimension with neural operators.
epfLLM/meditron
Meditron is a suite of open-source medical Large Language Models (LLMs).
eliahuhorwitz/Academic-project-page-template
A project page template for academic papers. Demo at https://eliahuhorwitz.github.io/Academic-project-page-template/
TransformerLensOrg/TransformerLens
A library for mechanistic interpretability of GPT-style language models
keras-team/keras-core
A multi-backend implementation of the Keras API, with support for TensorFlow, JAX, and PyTorch.
martinshkreli/models
stock market models - have fun
abacaj/fine-tune-mistral
Fine-tune Mistral-7B on 3090s, A100s, H100s
enjalot/latent-scope
A scientific instrument for investigating latent spaces
tatp22/multidim-positional-encoding
An implementation of 1D, 2D, and 3D positional encoding in PyTorch and TensorFlow
lucidrains/ema-pytorch
A simple way to keep track of an Exponential Moving Average (EMA) version of your PyTorch model
openlists/ElectrophysiologyData
A list of openly available datasets in (mostly human) electrophysiology.
gaasher/I-JEPA
Implementation of I-JEPA from "Self-Supervised Learning from Images with a Joint-Embedding Predictive Architecture"
distillpub/post--misread-tsne
How to Use t-SNE Effectively
greydanus/mnist1d
A 1D analogue of the MNIST dataset for measuring spatial biases and answering Science of Deep Learning questions.
neelnanda-io/1L-Sparse-Autoencoder
lucidrains/complex-valued-transformer
Implementation of the transformer proposed in "Building Blocks for a Complex-Valued Transformer Architecture"
YimingQiao/Blitzcrank
[VLDB'24] Blitzcrank compresses in-memory, row-store OLTP databases. It introduces a new entropy-coding algorithm called Delayed Coding.
Convert-Group/pairing-functions
A collection of pairing functions.
Algomancer/VCReg
Minimal implementation of VCReg (2024) for collapse prevention.
SlDo/Neuralink
The homepage of Neuralink
michaela10c/neuralink-patents
An explanation of patents from Neuralink in PPT format (some including Python Jupyter notebooks). For educational purposes only.
RAZZULLIX/ghost
A dictionary-based compression algorithm
marlens123/autoSAM_pond_segmentation
Adapt AutoSAM to segment Arctic thermal infrared images into melt ponds, sea ice, and ocean.