dimmu's Stars
microsoft/BitNet
Official inference framework for 1-bit LLMs
xjdr-alt/entropix
Entropy Based Sampling and Parallel CoT Decoding
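The core idea named in the description is adapting the sampling strategy to the entropy of the next-token distribution. A minimal, generic sketch of entropy-aware sampling (not entropix's actual sampler; the threshold and temperature values are illustrative assumptions):

```python
import torch
import torch.nn.functional as F

def entropy_aware_sample(logits: torch.Tensor,
                         entropy_threshold: float = 3.0,  # assumed value, not from entropix
                         low_temp: float = 0.3,
                         high_temp: float = 1.2) -> torch.Tensor:
    """Sample a token, raising temperature when the model is uncertain.

    logits: (vocab_size,) unnormalized next-token scores.
    """
    probs = F.softmax(logits, dim=-1)
    # Shannon entropy of the next-token distribution (in nats).
    entropy = -(probs * torch.log(probs + 1e-12)).sum()
    # High entropy -> model is uncertain -> sample more exploratively;
    # low entropy -> model is confident -> sample sharply.
    temp = high_temp if entropy > entropy_threshold else low_temp
    return torch.multinomial(F.softmax(logits / temp, dim=-1), num_samples=1)
```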
alexdobin/STAR
RNA-seq aligner
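A typical STAR workflow builds a genome index once and then aligns reads against it. A sketch using Python's subprocess module (paths, thread counts, and the read-length-derived --sjdbOverhang value are placeholders; the STAR flags themselves are standard):

```python
import subprocess

# 1) Build the genome index (run once per genome/annotation pair).
subprocess.run([
    "STAR", "--runMode", "genomeGenerate",
    "--genomeDir", "star_index",            # placeholder output directory
    "--genomeFastaFiles", "genome.fa",      # placeholder reference FASTA
    "--sjdbGTFfile", "annotation.gtf",      # placeholder gene annotation
    "--sjdbOverhang", "100",                # read length - 1
    "--runThreadN", "8",
], check=True)

# 2) Align paired-end reads into a coordinate-sorted BAM with gene counts.
subprocess.run([
    "STAR", "--genomeDir", "star_index",
    "--readFilesIn", "sample_R1.fastq.gz", "sample_R2.fastq.gz",
    "--readFilesCommand", "zcat",
    "--outSAMtype", "BAM", "SortedByCoordinate",
    "--quantMode", "GeneCounts",
    "--outFileNamePrefix", "sample_",
    "--runThreadN", "8",
], check=True)
```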
SysSec-KAIST/LTESniffer
An Open-source LTE Downlink/Uplink Eavesdropper
zml/zml
High-performance AI inference stack. Built for production. @ziglang / @openxla / MLIR / @bazelbuild
feizc/FluxMusic
Text-to-Music Generation with Rectified Flow Transformers
chaidiscovery/chai-lab
Chai-1, SOTA model for biomolecular structure prediction
PaddlePaddle/PaddleHelix
Bio-Computing Platform Featuring Large-Scale Representation Learning and Multi-Task Deep Learning (the "PaddleHelix" / 螺旋桨 bio-computing toolkit)
sokrypton/ColabDesign
Making Protein Design accessible to all via Google Colab!
felafax/felafax
Felafax is building AI infra for non-NVIDIA GPUs
google-deepmind/searchless_chess
Grandmaster-Level Chess Without Search
mims-harvard/PrimeKG
Precision Medicine Knowledge Graph (PrimeKG)
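PrimeKG is distributed as a flat edge list, so it can be explored directly with pandas. A sketch assuming the file name `kg.csv` and the node/relation column names used in the PrimeKG release (treat both as assumptions to verify against the repository):

```python
import pandas as pd

# Load the knowledge-graph edge list (file name and column names assumed).
kg = pd.read_csv("kg.csv", low_memory=False)

# Each row is an edge: source node (x_*), relation, target node (y_*).
print(kg["relation"].value_counts().head())

# Example query: edges touching disease nodes whose name matches "asthma".
disease_edges = kg[(kg["x_type"] == "disease")
                   & (kg["x_name"].str.contains("asthma", case=False))]
print(disease_edges[["x_name", "relation", "y_type", "y_name"]].head())
```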
pwwang/scplotter
An R package built on plotthis that provides functions for easy and efficient visualization of single-cell sequencing data.
google-research/talk-like-a-graph
nuwandavek/karpathify
chandar-lab/AMPLIFY
ScaledFoundations/MatMamba
Code and pretrained models for the paper: "MatMamba: A Matryoshka State Space Model"
joshuapjacob/crypto-volatility-surface
A dashboard to visualize cryptocurrency implied volatility surfaces constructed with option data from Binance.
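The core computation behind any implied volatility surface is inverting the Black-Scholes price for each quoted option and then interpolating over strike and expiry. A generic sketch of the inversion step (not the dashboard's code; spot, strike, and price values are illustrative):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def bs_call_price(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_vol(price, S, K, T, r=0.0):
    """Invert Black-Scholes for sigma via Brent's method."""
    return brentq(lambda sigma: bs_call_price(S, K, T, r, sigma) - price, 1e-4, 5.0)

# One (strike, expiry, mid-price) quote -> one point on the surface.
spot = 65_000.0  # illustrative BTC spot
print(implied_vol(price=3_200.0, S=spot, K=70_000.0, T=30 / 365))
```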
zyushun/hessian-spectrum
Code for the paper: Why Transformers Need Adam: A Hessian Perspective
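The standard way to probe a network's Hessian spectrum without materializing the Hessian is Hessian-vector products combined with power iteration (or Lanczos). A simplified, generic PyTorch illustration of that technique (not the paper's code):

```python
import torch

def top_hessian_eigenvalue(loss, params, iters=20):
    """Estimate the largest Hessian eigenvalue via power iteration on HVPs."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    v = [torch.randn_like(p) for p in params]
    for _ in range(iters):
        # Hessian-vector product: differentiate <grad, v> w.r.t. the parameters.
        gv = sum((g * vi).sum() for g, vi in zip(grads, v))
        hv = torch.autograd.grad(gv, params, retain_graph=True)
        norm = torch.sqrt(sum((h ** 2).sum() for h in hv))
        v = [h / norm for h in hv]
    gv = sum((g * vi).sum() for g, vi in zip(grads, v))
    hv = torch.autograd.grad(gv, params, retain_graph=True)
    # Rayleigh quotient with the unit-norm final vector.
    return sum((h * vi).sum() for h, vi in zip(hv, v)).item()

# Usage: compute a loss on a batch, then
# lam = top_hessian_eigenvalue(loss, list(model.parameters()))
```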
nf-core/rnavar
GATK4 RNA variant calling pipeline
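Like other nf-core pipelines, rnavar is launched through Nextflow with a samplesheet and the standard nf-core parameters. A sketch via subprocess (the profile, genome key, and paths are placeholders; check the pipeline docs for the full parameter set):

```python
import subprocess

subprocess.run([
    "nextflow", "run", "nf-core/rnavar",
    "-profile", "docker",              # or singularity/conda, per your setup
    "--input", "samplesheet.csv",      # placeholder nf-core samplesheet
    "--outdir", "results",
    "--genome", "GRCh38",              # iGenomes key; assumed to be supported
], check=True)
```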
DeepGraphLearning/GearBind
Pretrainable geometric graph neural network for antibody affinity maturation
grimme-lab/MindlessGen
Mindless molecule generator in a Python package.
PKUliujl/GeoSeqBuilder
CuiMiao-HIT/miniSNV
eralp85/Linear-Algebra-Gilbert-Strang
opallab/positional_attention
Source code for the paper "Positional Attention: Out-of-Distribution Generalization and Expressivity for Neural Algorithmic Reasoning"
amudide/switch_sae
Efficient Dictionary Learning with Switch Sparse Autoencoders (SAEs)
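As a rough illustration of the switch-SAE idea named in the title (a router sends each activation to one of several expert sparse autoencoders), here is a generic PyTorch sketch; it is not the repository's implementation, and the layer sizes and sparsity penalty are arbitrary:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchSAE(nn.Module):
    """Route each activation to a single expert SAE (top-1 routing)."""

    def __init__(self, d_model=512, dict_per_expert=2048, n_experts=8):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.encoders = nn.ModuleList(nn.Linear(d_model, dict_per_expert) for _ in range(n_experts))
        self.decoders = nn.ModuleList(nn.Linear(dict_per_expert, d_model) for _ in range(n_experts))

    def forward(self, x):                        # x: (batch, d_model)
        gate = F.softmax(self.router(x), dim=-1)
        expert_idx = gate.argmax(dim=-1)         # top-1 expert per activation
        recon = torch.zeros_like(x)
        l1 = x.new_zeros(())
        for e in range(len(self.encoders)):
            mask = expert_idx == e
            if mask.any():
                z = F.relu(self.encoders[e](x[mask]))            # sparse codes
                recon[mask] = gate[mask, e:e + 1] * self.decoders[e](z)
                l1 = l1 + z.abs().sum()
        # Reconstruction + sparsity objective, as in standard SAE training.
        loss = F.mse_loss(recon, x) + 1e-3 * l1 / x.shape[0]
        return recon, loss

# Usage: sae = SwitchSAE(); recon, loss = sae(torch.randn(16, 512)); loss.backward()
```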
Armilius/PISTE
ni-lab/finetuning-enformer
Code for fine-tuning sequence-to-expression models on personal genome and transcriptome data to improve individual-specific gene expression predictions.
piercelab/tcr_docking_angle