adam-optimizer
There are 392 repositories under the adam-optimizer topic.
LiyuanLucasLiu/RAdam
On the Variance of the Adaptive Learning Rate and Beyond
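The rectification this paper proposes applies the adaptive step only once the variance of the adaptive learning rate becomes tractable, falling back to an SGD-with-momentum-style step early in training. A minimal NumPy sketch of a single RAdam update following the published algorithm (variable names are illustrative, not taken from the repo):

```python
import numpy as np

def radam_step(theta, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam update. theta: params, g: gradient, m/v: moment buffers, t: step (1-based)."""
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)                            # bias-corrected first moment
    rho_inf = 2 / (1 - beta2) - 1                         # maximum length of the approximated SMA
    rho_t = rho_inf - 2 * t * beta2**t / (1 - beta2**t)
    if rho_t > 4:   # variance is tractable: apply the rectified adaptive step
        v_hat = np.sqrt(v / (1 - beta2**t))
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        theta = theta - lr * r_t * m_hat / (v_hat + eps)
    else:           # otherwise take an un-adapted, momentum-only step
        theta = theta - lr * m_hat
    return theta, m, v
```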
parasdahal/deepnet
Deep learning library in plain NumPy.
SirRob1997/Crowded-Valley---Results
This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers"
tirtharajdash/CS-F425_Deep-Learning
CS F425 Deep Learning course at BITS Pilani (Goa Campus)
YanaiEliyahu/AdasOptimizer
AdaS is short for Adaptive Step Size. Unlike other optimizers, which merely normalize the derivative, it fine-tunes the step size itself, aiming to make step-size scheduling obsolete and achieving state-of-the-art training performance.
nengwp/Lion-vs-Adam
A comparison of the Lion and Adam optimizers.
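For context on what such a comparison covers: the two update rules differ mainly in how the step direction is formed. Adam divides a momentum estimate by a second-moment estimate, while Lion takes only the sign of an interpolated momentum. A hedged side-by-side sketch (generic implementations, not the repo's code):

```python
import numpy as np

def adam_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g**2
    m_hat = m / (1 - b1**t)                  # bias-corrected first moment
    v_hat = v / (1 - b2**t)                  # bias-corrected second moment
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def lion_step(theta, g, m, lr=1e-4, b1=0.9, b2=0.99, wd=0.0):
    update = np.sign(b1 * m + (1 - b1) * g)  # sign of interpolated momentum
    m = b2 * m + (1 - b2) * g                # momentum tracked with a second beta
    return theta - lr * (update + wd * theta), m
```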
aditya9211/SVHN-CNN
Classifying the Google Street View House Numbers (SVHN) dataset with a CNN.
v-iashin/CS231n
PyTorch/TensorFlow solutions for Stanford's CS231n: "CNNs for Visual Recognition"
yashkant/padam-tensorflow
Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
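Padam's single change to Adam is a partial exponent p on the second-moment denominator: p = 1/2 recovers an AMSGrad-style Adam, while p approaching 0 behaves like SGD with momentum, and the paper explores values around 1/8. A minimal sketch of the update, assuming the AMSGrad max-trick used in the paper:

```python
import numpy as np

def padam_step(theta, g, m, v, v_max, t, lr=0.1, b1=0.9, b2=0.999, p=0.125, eps=1e-8):
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g**2
    v_max = np.maximum(v_max, v)                  # AMSGrad: monotone second moment
    return theta - lr * m / (v_max**p + eps), m, v, v_max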
sagarvegad/Adam-optimizer
The Adam optimizer implemented in Python.
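For reference, the canonical Adam update (Kingma & Ba, 2015) that such an implementation follows: exponential moving averages of the gradient and its square, bias-corrected, with the step scaled by their ratio. A generic NumPy sketch, not the repository's code:

```python
import numpy as np

def adam(grad_fn, theta, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, steps=1000):
    """Minimize a function given its gradient via Adam."""
    m = np.zeros_like(theta)   # first-moment (mean) estimate
    v = np.zeros_like(theta)   # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad_fn(theta)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)   # bias correction for zero initialization
        v_hat = v / (1 - beta2**t)
        theta -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta

# Usage: minimize f(x) = ||x||^2, whose gradient is 2x
x = adam(lambda x: 2 * x, np.array([3.0, -2.0]), lr=0.1)  # -> approximately [0, 0]
```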
shreyansh26/ML-Optimizers-JAX
Toy implementations of some popular ML optimizers using Python/JAX
Arko98/Gradient-Descent-Algorithms
A collection of various gradient descent algorithms implemented in Python from scratch
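Collections like this one typically include vanilla gradient descent, momentum, and Nesterov momentum, which differ only in how a velocity term is folded into the step. A hedged sketch of those three basic updates (function names are illustrative):

```python
import numpy as np

def gd_step(theta, g, lr=0.01):
    return theta - lr * g                  # vanilla gradient descent

def momentum_step(theta, g, vel, lr=0.01, mu=0.9):
    vel = mu * vel - lr * g                # accumulate an exponentially decaying velocity
    return theta + vel, vel

def nesterov_step(theta, grad_fn, vel, lr=0.01, mu=0.9):
    g = grad_fn(theta + mu * vel)          # evaluate the gradient at the look-ahead point
    vel = mu * vel - lr * g
    return theta + vel, vel
```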
VivianoRiccardo/Learning-Lab-C-Library
This library provides a set of functionalities for different types of deep learning (and ML) algorithms in C.
aromanro/MachineLearning
From linear regression towards neural networks...
rdspring1/Count-Sketch-Optimizers
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
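The idea in this line of work is to store auxiliary optimizer state (such as a second-moment buffer) in a count-sketch table far smaller than the parameter vector, trading exactness for memory. A toy illustration of a count-sketch under assumed hashing details, not the repository's implementation:

```python
import numpy as np

class CountSketchMoment:
    """Toy count-sketch for a compressed optimizer buffer (illustrative only)."""
    def __init__(self, n_params, width, depth=3, seed=0):
        rng = np.random.default_rng(seed)
        self.table = np.zeros((depth, width))
        self.idx = rng.integers(0, width, size=(depth, n_params))    # bucket hashes
        self.sign = rng.choice([-1.0, 1.0], size=(depth, n_params))  # sign hashes

    def update(self, delta):
        # Fold a dense per-parameter update into the compressed table.
        for d in range(self.table.shape[0]):
            np.add.at(self.table[d], self.idx[d], self.sign[d] * delta)

    def query(self):
        # Median across rows de-biases hash collisions.
        rows = np.arange(self.table.shape[0])[:, None]
        est = self.sign * self.table[rows, self.idx]
        return np.median(est, axis=0)
```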
RudreshVeerkhare/CustomXGBoost
Modified XGBoost implementation from scratch with NumPy, using Adam and RMSProp optimizers.
yaricom/TimeSeriesLearning
The project aimed to implement deep NN/RNN-based solutions to develop flexible methods that can adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.
harshraj11584/Paper-Implementation-Overview-Gradient-Descent-Optimization-Sebastian-Ruder
[Python] [arXiv/cs] Implementations from the paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder.
Janus-Shiau/lookahead_tensorflow
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for TensorFlow.
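Lookahead wraps any inner optimizer: the "fast" weights take k ordinary steps, then the "slow" weights move a fraction alpha toward them and the fast weights restart from there. A framework-agnostic NumPy sketch of the wrapper logic (the repo itself targets TensorFlow):

```python
import numpy as np

def lookahead(theta0, inner_step, total_steps, k=5, alpha=0.5):
    """inner_step(theta) -> theta after one fast-optimizer update."""
    slow = theta0.copy()
    fast = theta0.copy()
    for t in range(1, total_steps + 1):
        fast = inner_step(fast)                  # k steps forward ...
        if t % k == 0:
            slow = slow + alpha * (fast - slow)  # ... 1 step back toward the fast weights
            fast = slow.copy()                   # restart the fast weights at slow
    return slow
```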
theroyakash/Adam
Implementation of the Adam optimization algorithm using NumPy.
thetechdude124/Adam-Optimization-From-Scratch
📈 Implementing the Adam optimizer from the ground up with PyTorch and comparing its performance on six 3-D objective functions (each progressively harder to optimize) against SGD, AdaGrad, and RMSProp.
declanoller/haskell-vae
Learning about Haskell with Variational Autoencoders
jaepil/geometric-adam
A Ray Tracing-Inspired Approach to Neural Network Optimization
SSQ/Coursera-Ng-Improving-Deep-Neural-Networks-Hyperparameter-tuning-Regularization-and-Optimization
Short description for quick search
YashArote/gradient-descent-visualizer
A fast, interactive tool to visualize how different gradient descent algorithms (vanilla gradient descent, Momentum, RMSprop, Adam, etc.) navigate complex loss surfaces in real time.
Gunale0926/Grams
Grams: Gradient Descent with Adaptive Momentum Scaling (ICLR 2025 Workshop)
harshalmittal4/Hypergradient_variants
Improved hypergradient optimizers for ML, providing better generalization and faster convergence.
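Hypergradient descent (Baydin et al.) treats the learning rate itself as a parameter and descends the loss with respect to it, which for SGD reduces to a dot product of consecutive gradients. A minimal sketch of the plain SGD variant, under those stated assumptions:

```python
import numpy as np

def sgd_hd(grad_fn, theta, lr=0.01, beta=1e-4, steps=1000):
    """SGD with hypergradient learning-rate adaptation."""
    g_prev = np.zeros_like(theta)
    for _ in range(steps):
        g = grad_fn(theta)
        lr += beta * np.dot(g, g_prev)  # hypergradient step: d(loss)/d(lr) = -g . g_prev
        theta -= lr * g
        g_prev = g
    return theta
```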
kcg2015/DDPG_numpy_only
Implementation of DDPG with NumPy only (without TensorFlow).
OptimalFoundation/nadir
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
shashank1623/Plant-disease-Detection
Plant disease detection using a convolutional neural network. The model can predict diseases of plants such as potato, tomato, and bell pepper, with more to come in an upcoming version.
NinaadRao/Multilabel-Image-Classification-using-Contractive-Autoencoder
Implementing a contractive autoencoder to encode cloud images and using that encoding for multi-label image classification.
rsmath/Digit-Recognizer
A project I made to practice my newfound neural network knowledge: I used Python and NumPy to train a network to recognize MNIST images, with Adam and mini-batch gradient descent implemented.
thieu1995/MetaPerceptron
MetaPerceptron: A Standardized Framework For Metaheuristic-Driven Multi-layer Perceptron Optimization
AdamYuan/SimpleNN
A simple neural network.
Niranjankumar-c/GradientDescent_Implementation
Implementations of different variants of gradient descent in Python using NumPy.
Emperor-WS/PyEmber
An Educational Framework Based on PyTorch for Deep Learning Education and Exploration