sifanexisted's Stars
lucidrains/vit-pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
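The core preprocessing step behind ViT is splitting an image into non-overlapping patches and linearly projecting each one into a token. A minimal NumPy sketch of that step (not the repo's actual code; the projection matrix here is random where a real model learns it):

```python
import numpy as np

def patchify(img, patch=4):
    """Split an (H, W, C) image into flattened non-overlapping patches."""
    H, W, C = img.shape
    assert H % patch == 0 and W % patch == 0
    p = img.reshape(H // patch, patch, W // patch, patch, C)
    # reorder so each patch's pixels are contiguous, then flatten per patch
    return p.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * C)

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8, 3))
tokens = patchify(img, patch=4)            # (4, 48): four patches of 4*4*3 values
proj = rng.standard_normal((48, 16))       # stand-in for the learned linear projection
embeddings = tokens @ proj
print(embeddings.shape)  # (4, 16)
```

The resulting token sequence (plus position embeddings and a class token) is what the transformer encoder consumes.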
dair-ai/ml-visuals
🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing.
facebookresearch/mae
PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377
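MAE's key trick is masking a large random fraction of patch tokens (75% in the paper) and encoding only the visible ones. A minimal NumPy sketch of the masking step, with illustrative names (the repo's implementation differs in detail):

```python
import numpy as np

def random_mask(tokens, mask_ratio=0.75, rng=None):
    """Keep a random subset of tokens; return kept tokens and their indices."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = tokens.shape[0]
    n_keep = int(n * (1 - mask_ratio))
    keep = np.sort(rng.permutation(n)[:n_keep])  # sorted indices of visible patches
    return tokens[keep], keep

tokens = np.arange(16, dtype=float).reshape(16, 1)  # 16 dummy patch tokens
visible, keep_idx = random_mask(tokens, mask_ratio=0.75)
print(visible.shape)  # (4, 1): only 25% of patches reach the encoder
```

The kept indices are retained so the decoder can restore patch order and insert mask tokens at the missing positions.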
facebookresearch/schedule_free
Schedule-Free Optimization in PyTorch
lucidrains/flamingo-pytorch
Implementation of 🦩 Flamingo, a state-of-the-art few-shot visual question answering attention network from DeepMind, in PyTorch
lucidrains/perceiver-pytorch
Implementation of Perceiver, General Perception with Iterative Attention, in PyTorch
google-research/vmoe
Pang-Yatian/Point-MAE
[ECCV2022] Masked Autoencoders for Point Cloud Self-supervised Learning
SpeedyWeather/SpeedyWeather.jl
Play atmospheric modelling like it's LEGO.
krasserm/perceiver-io
A PyTorch implementation of Perceiver, Perceiver IO and Perceiver AR with PyTorch Lightning scripts for distributed training
marcoamonteiro/pi-GAN
Nek5000/Nek5000
our classic
google-deepmind/torax
TORAX: Tokamak transport simulation in JAX
deepmodeling/jax-fem
Differentiable Finite Element Method with JAX
PaddlePaddle/PaddleScience
PaddleScience is an SDK and library for developing AI-driven scientific computing applications based on PaddlePaddle.
Nek5000/nekRS
our next-generation fast and scalable CFD code
naver-ai/rope-vit
[ECCV 2024] Official PyTorch implementation of RoPE-ViT "Rotary Position Embedding for Vision Transformer"
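Rotary position embedding rotates each pair of feature dimensions by an angle proportional to the token's position, so that query–key dot products depend only on relative position. A minimal NumPy sketch of the rotation (illustrative, not the repo's code):

```python
import numpy as np

def rope(x, positions, base=10000.0):
    """Apply rotary position embedding along the last dim of x (dim must be even)."""
    d = x.shape[-1]
    inv_freq = base ** (-np.arange(0, d, 2) / d)     # one frequency per dim pair
    angles = positions[:, None] * inv_freq[None, :]  # (n, d/2) rotation angles
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin             # 2D rotation of each pair
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

q = np.array([[0.3, -1.2, 0.7, 0.5]])
print(rope(q, np.array([2.0])).shape)  # (1, 4)
```

The relative-position property: rotating q at position m and k at position n gives a dot product that depends only on n - m, which is what makes RoPE attractive for attention (and, per this paper, for ViTs with 2D extensions).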
PredictiveIntelligenceLab/jaxpi
neuraloperator/Geo-FNO
Geometry-Aware Fourier Neural Operator (Geo-FNO)
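The Fourier Neural Operator family (which Geo-FNO extends to irregular geometries) is built around a spectral convolution: transform to Fourier space, keep only the low modes, multiply them by learned complex weights, and transform back. A minimal 1D NumPy sketch under those assumptions (weights here are random stand-ins for learned parameters):

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """One Fourier layer: FFT, truncate to n_modes, apply learned multipliers, inverse FFT."""
    u_hat = np.fft.rfft(u)                          # complex spectrum, length n//2 + 1
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights   # learned per-mode complex weights
    return np.fft.irfft(out_hat, n=u.shape[0])      # back to physical space

rng = np.random.default_rng(0)
u = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)  # would be learned in training
v = spectral_conv_1d(u, w, n_modes=8)
print(v.shape)  # (64,)
```

Geo-FNO's contribution is learning a deformation from the irregular physical domain to a regular latent grid where this FFT-based layer applies.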
echowve/meshGraphNets_pytorch
PyTorch implementation of "Learning Mesh-Based Simulation with Graph Networks"
f0uriest/interpax
Interpolation and function approximation with JAX
camlab-ethz/poseidon
Code for the paper "Poseidon: Efficient Foundation Models for PDEs"
ml-jku/UPT
Code for the paper Universal Physics Transformers
mlpc-ucsd/ViTGAN
HaoZhongkai/DPOT
Code for "DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training"
thu-ml/DPOT
Code for "DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training"
LouisSerrano/coral
PredictiveIntelligenceLab/cvit
felix-lyx/prose
PROSE: Predicting Multiple Operators and Symbolic Expressions
junfeng-chen/position_induced_transformer
PyTorch implementation of the Position-induced Transformer for operator learning in partial differential equations