Pinned Repositories
BiSHop
[ICML 2024] BiSHop: Bi-Directional Cellular Learning for Tabular Data with Generalized Sparse Modern Hopfield Model
DNABERT
[Bioinformatics] DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome
DNABERT_2
[ICLR 2024] DNABERT-2: Efficient Foundation Model and Benchmark for Multi-Species Genome
DNABERT_S
DNABERT_S: Learning Species-Aware DNA Embedding with Genome Foundation Models
NonparametricHopfield
Nonparametric Modern Hopfield Models
OutEffHop
[ICML 2024] Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
SparseModernHopfield
[NeurIPS 2023] On Sparse Modern Hopfield Model
STanHop
[ICLR 2024] STanHop: Sparse Tandem Hopfield Model for Memory-Enhanced Time Series Prediction
UHop
[ICML 2024] Uniform Memory Retrieval with Larger Capacity for Modern Hopfield Models