mini-batch-gradient-descent
There are 69 repositories under the mini-batch-gradient-descent topic.
gyrdym/ml_algo
Machine learning algorithms in Dart programming language
aditya9211/SVHN-CNN
Classifying the Google Street View House Numbers (SVHN) dataset with a CNN
bhattbhavesh91/gradient-descent-variants
My implementation of Batch, Stochastic & Mini-Batch Gradient Descent Algorithm using Python
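For reference, a minimal sketch of the mini-batch variant such implementations cover, assuming NumPy arrays `X` (features) and `y` (targets) and a mean-squared-error linear model; the function name and defaults are illustrative, not this repository's actual API:

```python
import numpy as np

def mini_batch_gd(X, y, lr=0.01, batch_size=32, epochs=100):
    """Minimal mini-batch gradient descent for linear regression (MSE loss)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        idx = np.random.permutation(n)           # reshuffle once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            err = Xb @ w + b - yb                # residuals on this mini-batch
            w -= lr * (Xb.T @ err) / len(batch)  # averaged gradient w.r.t. w
            b -= lr * err.mean()                 # averaged gradient w.r.t. b
    return w, b
```

Averaging the gradient over each shuffled batch is what places this between full-batch and purely stochastic updates.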
SSQ/Coursera-Ng-Improving-Deep-Neural-Networks-Hyperparameter-tuning-Regularization-and-Optimization
Course materials for Coursera's Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization (deeplearning.ai), organized for quick search
SSQ/Coursera-UW-Machine-Learning-Classification
Notebooks for Coursera's University of Washington Machine Learning: Classification course, organized for quick search
Coursera-Students-Community/Deep-Learning-Specialization
Coursera - Deep Learning Specialization - deeplearning.ai
ChanchalKumarMaji/Coursera-Deep-Learning-Specialization-deeplearning.ai
[Coursera] Deep Learning Specialization (deeplearning.ai)
DunittMonagas/Improving-Deep-Neural-Networks-Hyperparameter-tuning-Regularization-and-Optimization
Course: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization. Second course of the Deep Learning specialization. This repository contains all the solved exercises. https://www.coursera.org/learn/neural-networks-deep-learning
aditya9211/MNIST-Classification-with-NeuralNet
MNIST handwritten digit classification using a 3-layer neural network (98.7% accuracy)
kaustubholpadkar/Predicting-House-Price-using-Multivariate-Linear-Regression
Predicting House Price from Size and Number of Bedrooms using Multivariate Linear Regression in Python from scratch
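A minimal sketch of that approach, assuming house size and bedroom count arrive as NumPy arrays; standardizing the two features keeps the gradient steps balanced. Names and hyperparameters here are hypothetical, not taken from the repository:

```python
import numpy as np

def fit_house_prices(sizes, bedrooms, prices, lr=0.1, epochs=500):
    """Batch gradient descent on two standardized features (illustrative only)."""
    X = np.column_stack([sizes, bedrooms]).astype(float)
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    X = (X - mu) / sigma                       # put both features on the same scale
    w, b = np.zeros(2), 0.0
    for _ in range(epochs):
        err = X @ w + b - prices               # residuals over the full training set
        w -= lr * (X.T @ err) / len(prices)
        b -= lr * err.mean()
    return w, b, mu, sigma                     # mu/sigma are needed to scale new inputs
```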
mahendranandi/Optimization-Algorithm
For learning, visualizing, and understanding optimization techniques and algorithms.
MoinDalvs/Gradient_Descent_For_beginners
A complete, in-depth walkthrough of gradient descent for beginners
NvsYashwanth/machinelearningmaster
All about machine learning
SawyerAlston/MNIST-NN-Pure-Math
A "from-scratch" 2-layer neural network for MNIST classification built in pure NumPy, featuring mini-batch gradient descent, momentum, L2 regularization, and evaluation tools — no ML libraries used.
flowstateeng/Coursera-Deep-Learning
A five-course specialization covering the foundations of Deep Learning, from building CNNs, RNNs & LSTMs to choosing model configurations and techniques such as Adam, Dropout, BatchNorm, and Xavier/He initialization.
longtng/Stochastic-Gradient-Descent
The laboratory from CLOUDS Course at EURECOM
lucassa3/PEGASOS-SVM-CLASSIFIER
Implementation of a support vector machine classifier using the primal estimated sub-gradient solver (Pegasos) in C++ and CUDA for NVIDIA GPUs
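The repository itself is C++/CUDA; as a language-neutral reference, here is a short Python sketch of the mini-batch Pegasos update it is based on, assuming labels in {-1, +1}. Function and variable names are illustrative:

```python
import numpy as np

def pegasos(X, y, lam=0.01, batch_size=32, iters=1000):
    """Mini-batch Pegasos: primal estimated sub-gradient solver for a linear SVM."""
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, iters + 1):
        eta = 1.0 / (lam * t)                        # Pegasos step-size schedule
        batch = np.random.choice(n, batch_size, replace=False)
        Xb, yb = X[batch], y[batch]
        viol = yb * (Xb @ w) < 1                     # margin violators in the batch
        subgrad = lam * w - (yb[viol][:, None] * Xb[viol]).sum(axis=0) / batch_size
        w -= eta * subgrad
        # optional projection onto the ball of radius 1/sqrt(lam)
        w *= min(1.0, 1.0 / (np.sqrt(lam) * np.linalg.norm(w) + 1e-12))
    return w
```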
mstrand1/Jax-logistic-regression
Logistic regression using JAX to support GPU acceleration
annieyan/Stochastic_dual_coordinate_ascent
Classify the MNIST dataset using ridge regression, optimizing with SGD, stochastic dual coordinate ascent, and mini-batching
chen-bowen/Deep_Neural_Networks
This project explores TensorFlow and tests the effects of regularization and mini-batch training on the performance of deep neural networks
DeepraMazumder/Social-Network-Ads-Prediction-Analysis
A Machine Learning project to predict user interactions with social network ads using demographic data to optimize ad targeting
ehtisham-sadiq/Numerical-optimization-algorithms-in-Machine-Learning
Exploring and Implementing Numerical Optimization Algorithms in Machine Learning, with Python code and mathematical insights.
hager51/Numerical-Optimization
Numerical Optimization for Machine Learning & Data Science
itsmesatwik/deep-learning
Various methods for Deep Learning, SGD and Neural Networks.
jnclink/Custom-NN
Custom implementation of a neural network from scratch using Python
lam1aa/Intent-classification-with-FNN
Implementation of a Feedforward Neural Network for intent classification using only Python and NumPy. The model classifies user intents from text input using the Sonos NLU Benchmark dataset.
mmaric27/BasicDNN
Generic L-layer 'straight in Python' fully connected Neural Network implementation using numpy.
niaj-a/Machine-Learning
Your all-in-one Machine Learning resource, spanning from-scratch implementations, ensemble learning, and real-world model tuning. This repository is a complete collection of 25+ essential ML algorithms written in clean, beginner-friendly Jupyter Notebooks. Each algorithm is explained with intuitive theory, visualizations, and hands-on implementation.
nisheethjaiswal/ROLLING-DOWN-A-CROWDED-VALLEY-OF-OPTIMIZERS-DEVELOPMENTS-FROM-SGD
Deep Learning Optimizers
Sayed-Hossein-Hosseini/Mountaineers_in_Search_of_the_Global_Minimum
Two mountaineers search for the global minimum of a cost function using different approaches. One represents Stochastic Gradient Descent, taking small, random steps, while the other follows Batch Gradient Descent, making precise moves after full evaluation. This analogy illustrates key optimization strategies in machine learning.
shulavkarki/ANN-Scratch-In-MnsitDigit
An ANN classifier built from scratch to classify MNIST digits.
SRI-PRIYAN/MachineLearning
Implementing ML Algorithms using Python and comparing with Standard Library functions
tejaswidabas123/albalone-age-prediction
🐚 Abalone Age Prediction: Dive into Data, Surf on Insights! 📊 Unleash the power of predictive analytics on abalone age estimation! From meticulous data exploration to a showdown of optimization methods, this repo is your gateway to accurate age predictions from physical measurements using PySpark. 🌊🔮
WHMHammer/robust-mini-batch-gradient-descent
Robust Mini-batch Gradient Descent models
Raafat-Nagy/Implementations_of_ML_and_DL_Optimizers
This repository provides implementations of numerical optimization algorithms for machine learning and deep learning. It includes clear explanations, mathematical formulas, Python code, and visualizations to help understand the behavior of each optimizer.
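As an example of the kind of optimizer such a collection typically covers, a single Adam update step can be sketched as follows (an illustrative sketch, not code from the repository):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2    # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1**t)               # bias-corrected moments
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```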