skeletondyh's Stars
onnx/onnx
Open standard for machine learning interoperability
google-deepmind/deepmind-research
This repository contains implementations and illustrative code to accompany DeepMind publications
optuna/optuna
A hyperparameter optimization framework
huggingface/accelerate
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
matplotlib/cheatsheets
Official Matplotlib cheat sheets
probml/pyprobml
Python code for the "Probabilistic Machine Learning" book by Kevin Murphy
facebookresearch/metaseq
Repo for external large-scale work
apache/iotdb
Apache IoTDB
cvxpy/cvxpy
A Python-embedded modeling language for convex optimization problems.
alpa-projects/alpa
Training and serving large-scale neural networks with auto parallelization.
facebookresearch/pycls
Codebase for Image Classification Research, written in PyTorch.
facebookresearch/DomainBed
DomainBed is a suite to test domain generalization algorithms
microsoft/mup
Maximal Update Parametrization (µP)
RobertTLange/evosax
Evolution Strategies in JAX 🦎
tensorflow/kfac
An implementation of KFAC for TensorFlow
google-research/diffstride
TF/Keras code for DiffStride, a pooling layer with learnable strides.
ruocwang/darts-pt
[ICLR2021 Outstanding Paper] Rethinking Architecture Selection in Differentiable NAS
machinelearningnuremberg/WellTunedSimpleNets
[NeurIPS 2021] Well-tuned Simple Nets Excel on Tabular Datasets
nec-research/tf-imle
TensorFlow implementation and notebooks for Implicit Maximum Likelihood Estimation
PrincetonLIPS/RandomizedAutomaticDifferentiation
Experiment code for "Randomized Automatic Differentiation"
google-research/jax-influence
pomonam/Self-Tuning-Networks
PyTorch implementation of "STNs" and "Delta-STNs".
QUVA-Lab/COMBO
IST-DASLab/WoodFisher
Code accompanying the NeurIPS 2020 paper: WoodFisher (Singh & Alistarh, 2020)
squaresLab/VarCLR
VarCLR: Variable Semantic Representation Pre-training via Contrastive Learning
ondrejbohdal/evograd
Official PyTorch implementation of "EvoGrad: Efficient Gradient-Based Meta-Learning and Hyperparameter Optimization"
jparkerholder/PB2
Code for the Population-Based Bandits Algorithm, presented at NeurIPS 2020.
polo5/FDS
Gradient-based Hyperparameter Optimization Over Long Horizons
gpeyre/2021-NonCvxPro
prolearner/onlineLTL
Python implementation of online learning-to-learn for non-smooth algorithms