adam
There are 114 repositories under the adam topic.
LiyuanLucasLiu/RAdam
On the Variance of the Adaptive Learning Rate and Beyond
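A minimal sketch of the rectification idea from the RAdam paper (hyperparameters and the rho_t > 4 threshold follow the paper's Algorithm 2; this is not the repo's own code):

```python
import numpy as np

def radam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam update (sketch). Falls back to a plain momentum step while the
    variance of the adaptive learning rate is still intractable (rho_t <= 4)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    rho_inf = 2 / (1 - beta2) - 1
    rho_t = rho_inf - 2 * t * beta2**t / (1 - beta2**t)
    if rho_t > 4:  # variance is tractable: use the rectified adaptive step
        v_hat = np.sqrt(v / (1 - beta2**t))
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        theta = theta - lr * r_t * m_hat / (v_hat + eps)
    else:          # warmup phase: un-adapted momentum update
        theta = theta - lr * m_hat
    return theta, m, v
```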
Malinskiy/adam
Adam (or adm) is a coroutine-friendly Android Debug Bridge client written in Kotlin
Tony-Y/pytorch_warmup
Learning Rate Warmup in PyTorch
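The general idea behind warmup, shown as a hand-rolled linear schedule with PyTorch's LambdaLR (the repo provides its own warmup API; warmup_steps here is an illustrative value):

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

warmup_steps = 1000  # assumed value for illustration

# Scale the learning rate linearly from ~0 to its base value over warmup_steps,
# then hold it constant; chain with a decay schedule as needed.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda step: min(1.0, (step + 1) / warmup_steps))

for step in range(5000):
    optimizer.step()      # loss.backward() omitted in this sketch
    scheduler.step()
```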
5hirish/adam_qas
ADAM - A Question Answering System. Inspired by IBM Watson
CyberZHG/keras-radam
RAdam implemented in Keras & TensorFlow
GSORF/Visual-GPS-SLAM
This is a repo for my master's thesis research on the fusion of Visual SLAM and GPS. It contains the research paper, code, and other interesting data.
FujiNetWIFI/fujinet-firmware
8-bit systems to ESP32 WiFi Multifunction Firmware
Nasdin/ReinforcementLearning-AtariGame
PyTorch LSTM RNN for reinforcement learning to play Atari games from OpenAI Universe, using Google DeepMind's Asynchronous Advantage Actor-Critic (A3C) algorithm, which is far more efficient than DQN and supersedes it. Can play many games.
AnicetNgrt/jiro-nn
A Deep Learning and preprocessing framework in Rust with support for CPU and GPU.
GLambard/AdamW_Keras
AdamW optimizer for Keras
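The distinction AdamW draws, as a sketch (not the repo's code): with a plain L2 penalty the decay term is folded into the gradient and rescaled by Adam's adaptive denominator, whereas AdamW applies decoupled weight decay to the parameters directly.

```python
import numpy as np

def adamw_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One AdamW update (sketch): the moments see only the raw gradient,
    and weight decay acts on the parameters outside the adaptive rescaling."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v
```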
lucidrains/adam-atan2-pytorch
Implementation of the Adam-atan2 optimizer proposed by Google DeepMind, in PyTorch
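The core trick, as a hedged sketch (the paper also introduces scale factors, omitted here): the division m̂ / (√v̂ + ε) is replaced with atan2(m̂, √v̂), which stays bounded, handles v̂ = 0 gracefully, and removes the ε hyperparameter.

```python
import numpy as np

def adam_atan2_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999):
    """One Adam-atan2 style update (sketch); no epsilon is needed because
    arctan2 is well defined even when the second moment is zero."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w = w - lr * np.arctan2(m_hat, np.sqrt(v_hat))
    return w, m, v
```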
bentrevett/a-tour-of-pytorch-optimizers
A tour of different optimization algorithms in PyTorch.
YanaiEliyahu/AdasOptimizer
ADAS is short for Adaptive Step Size: an optimizer that, unlike optimizers that merely normalize the derivative, fine-tunes the step size itself, aiming to make step-size scheduling obsolete and achieve state-of-the-art training performance.
davda54/ada-hessian
Easy-to-use AdaHessian optimizer (PyTorch)
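AdaHessian's second moment is built from a stochastic estimate of the Hessian diagonal; below is a minimal sketch of that Hutchinson-style estimator in PyTorch (not the repo's own API):

```python
import torch

def hutchinson_hessian_diag(loss, params):
    """Unbiased estimate of diag(H) via E[z * (H z)] with Rademacher z."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    zs = [torch.randint_like(p, high=2) * 2.0 - 1.0 for p in params]  # entries in {-1, +1}
    hzs = torch.autograd.grad(grads, params, grad_outputs=zs)         # Hessian-vector products
    return [z * hz for z, hz in zip(zs, hzs)]
```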
Mrpatekful/swats
Unofficial PyTorch implementation of SWATS (Switching from Adam to SGD).
nengwp/Lion-vs-Adam
Lion and Adam optimization comparison
polyfem/polysolve
Easy-to-use linear and non-linear solver
shreyansh26/ML-Optimizers-JAX
Toy implementations of some popular ML optimizers using Python/JAX
uclaml/Padam
Partially Adaptive Momentum Estimation method in the paper "Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks" (accepted by IJCAI 2020)
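The partially adaptive update, as a sketch based on the Padam paper (not the repo's code): the exponent on the second-moment denominator is a tunable p ∈ (0, 1/2], with p = 1/2 recovering AMSGrad and p → 0 approaching SGD with momentum.

```python
import numpy as np

def padam_step(w, grad, m, v, v_max, lr=1e-1, beta1=0.9, beta2=0.999,
               p=0.125, eps=1e-8):
    """One Padam update (sketch); p = 1/8 is the value suggested in the paper,
    and eps is added only for numerical safety."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    v_max = np.maximum(v_max, v)          # AMSGrad-style running maximum
    w = w - lr * m / (v_max**p + eps)     # partially adaptive denominator
    return w, m, v, v_max
```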
zmyzheng/Neural-Networks-and-Deep-Learning
Deep learning projects including applications (face recognition, neural style transfer, autonomous driving, sign language reading, music generation, translation, speech recognition and NLP) and theories (CNNs, RNNs, LSTM, Adam, Dropout, BatchNorm, Xavier/He initialization, hyperparameter tuning, regularization, optimization, Residual Networks). Deep Learning Specialization by Andrew Ng, deeplearning.ai
JRC1995/DemonRangerOptimizer
Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad with AdaMod, Gradient Centralization, Lookahead, iterate averaging, and decoupled Weight Decay
harshraj11584/Paper-Implementation-Overview-Gradient-Descent-Optimization-Sebastian-Ruder
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
angetato/Optimizers-for-Tensorflow
Adam, NAdam and AAdam optimizers
lonePatient/NovoGrad-pytorch
PyTorch implementation of the NovoGrad optimizer
pre-eth/adam
ADAM is an actively developed CSPRNG inspired by ISAAC64
mknbv/adashift
AdaShift optimizer implementation in PyTorch
hiroyuki-kasai/SimpleDeepNetToolbox
Simple MATLAB toolbox for deep learning networks: Version 1.0.3
RConsortium/submissions-pilot3-adam
Development repo for pilot3 submission to FDA - ADaM
B612-Asteroid-Institute/adam_home
ADAM python client and notebooks
atgenomix/deepvariant-on-spark
DeepVariant-on-Spark is a germline short variant calling pipeline that runs Google DeepVariant on Apache Spark at scale.
echocat/adam
Add-on which enhances all user profiles of Confluence. It also adds an advanced people directory. The whole add-on is configurable via XML, can be localized, supports Velocity templates, and supports view and edit restrictions.
nducthang/Optimization-DeepLearning-Vietnamese
Optimization methods in deep learning explained in Vietnamese, including gradient descent, momentum, NAG, AdaGrad, Adadelta, RMSProp, Adam, Adamax, Nadam, and AMSGrad.
OptimalFoundation/awesome-optimizers
Literature survey of convex optimizers and optimization methods for deep learning; made especially for optimization researchers with ❤️
B612-Asteroid-Institute/astrodynamics
Orbit propagation, orbit determination, and analysis code
amirrezarajabi/Neural-Network-implementation-from-scratch
Implementation of a neural network from scratch using only NumPy (Conv, FC, MaxPool, optimizers, and activation functions)
tak27/adam
This is an implementation of "Adam: A Method for Stochastic Optimization".
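For reference, the update from that paper, which most of the repos above build on (a minimal sketch with the standard defaults):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: bias-corrected first and second moment estimates
    scale the step elementwise (Kingma & Ba, 2015)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```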