stochastic-gradient-descent

There are 394 repositories under the stochastic-gradient-descent topic.

  • Suji04/ML_from_Scratch

    Implementation of basic ML algorithms from scratch in Python...

    Language: Jupyter Notebook
  • je-suis-tm/machine-learning

    Python machine learning applications in image processing, recommender system, matrix completion, netflix problem and algorithm implementations including Co-clustering, Funk SVD, SVD++, Non-negative Matrix Factorization, Koren Neighborhood Model, Koren Integrated Model, Dawid-Skene, Platt-Burges, Expectation Maximization, Factor Analysis, ISTA, FISTA, ADMM, Gaussian Mixture Model, OPTICS, DBSCAN, Random Forest, Decision Tree, Support Vector Machine, Independent Component Analysis, Latent Semantic Indexing, Principal Component Analysis, Singular Value Decomposition, K Nearest Neighbors, K Means, Naïve Bayes Mixture Model, Gaussian Discriminant Analysis, Newton Method, Coordinate Descent, Gradient Descent, Elastic Net Regression, Ridge Regression, Lasso Regression, Least Squares, Logistic Regression, Linear Regression

    Language: Jupyter Notebook
  • hiroyuki-kasai/SGDLibrary

    MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20

    Language: MATLAB
  • gyrdym/ml_algo

    Machine learning algorithms in Dart programming language

    Language: Dart
  • lixilinx/psgd_torch

    Pytorch implementation of preconditioned stochastic gradient descent (affine group preconditioner, low-rank approximation preconditioner and more)

    Language: Python
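
    Preconditioned SGD scales the stochastic gradient by a matrix before each update. Below is a minimal NumPy sketch of where the preconditioner enters the update rule; PSGD itself fits the preconditioner online, so the fixed diagonal P here is purely illustrative:

    ```python
    import numpy as np

    def psgd_step(w, grad, P, lr=0.1):
        # The update direction is the preconditioned gradient P @ grad.
        return w - lr * (P @ grad)

    # Badly scaled quadratic loss 0.5 * w^T A w, whose exact gradient is A @ w.
    A = np.diag([100.0, 1.0])
    w = np.array([1.0, 1.0])
    P = np.linalg.inv(A)  # ideal preconditioner: the inverse curvature
    for _ in range(50):
        w = psgd_step(w, A @ w, P)
    print(w)  # both coordinates shrink at the same fast rate
    ```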
  • aditya9211/Blur-and-Clear-Classification

    Classifying blurred and clear images

    Language: Python
  • hiroyuki-kasai/RSOpt

    Riemannian stochastic optimization algorithms: Version 1.0.3

    Language: MATLAB
  • mynkpl1998/Deep-Learning-Optimization-Algorithms

    Visualization of various deep learning optimization algorithms using PyTorch automatic differentiation and optimizers.

    Language: Jupyter Notebook
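
    A minimal sketch of the pattern such visualizations follow: run different torch.optim optimizers on the same differentiable test function and record their trajectories. The toy function and settings below are assumptions, not the notebook's:

    ```python
    import torch

    def rosenbrock(p):
        # Classic 2-D test surface with a narrow curved valley.
        x, y = p[0], p[1]
        return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

    for name, make_opt in [("SGD", lambda p: torch.optim.SGD([p], lr=1e-4)),
                           ("Adam", lambda p: torch.optim.Adam([p], lr=1e-2))]:
        p = torch.tensor([-1.5, 2.0], requires_grad=True)
        opt, path = make_opt(p), []
        for _ in range(2000):
            opt.zero_grad()
            loss = rosenbrock(p)
            loss.backward()                  # gradient via automatic differentiation
            opt.step()
            path.append(p.detach().clone())  # trajectory to plot later
        print(name, p.detach().tolist(), float(loss))
    ```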
  • DylanMuir/fmin_adam

    Matlab implementation of the Adam stochastic gradient descent optimisation algorithm

    Language: MATLAB
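
    For reference, the Adam update this library implements (Kingma & Ba, 2015), written as a short NumPy sketch with the usual default hyper-parameters:

    ```python
    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        # Exponential moving averages of the gradient and its square...
        m = b1 * m + (1 - b1) * grad
        v = b2 * v + (1 - b2) * grad ** 2
        # ...with bias correction for the zero initialization (t counts from 1).
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v
    ```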
  • polyfem/polysolve

    Easy-to-use linear and non-linear solver

    Language: C++
  • mahdihosseini/RMSGD

    Exploiting Explainable Metrics for Augmented SGD [CVPR2022]

    Language: Python
  • ChunyuanLI/pSGLD

    AAAI & CVPR 2016: Preconditioned Stochastic Gradient Langevin Dynamics (pSGLD)

    Language: MATLAB
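
    The pSGLD update pairs an RMSProp-style diagonal preconditioner with Langevin noise scaled by the same preconditioner. A hedged NumPy sketch, omitting the paper's small correction term Γ(θ) for brevity:

    ```python
    import numpy as np

    def psgld_step(theta, grad_log_post, V, rng, lr=1e-3, alpha=0.99, eps=1e-5):
        # grad_log_post: a stochastic gradient of the log-posterior.
        # RMSProp-style moving average of the squared gradient.
        V = alpha * V + (1 - alpha) * grad_log_post ** 2
        G = 1.0 / (np.sqrt(V) + eps)  # diagonal preconditioner
        # Preconditioned gradient step plus matched Gaussian injection noise.
        noise = rng.normal(size=theta.shape) * np.sqrt(lr * G)
        return theta + 0.5 * lr * G * grad_log_post + noise, V
    ```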
  • qandeelabbassi/python-svm-sgd

    Python implementation of stochastic sub-gradient descent algorithm for SVM from scratch

    Language: Python
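
    The core of such an implementation is the hinge-loss sub-gradient. A generic Pegasos-style sketch (not this repository's exact code; labels assumed in {-1, +1}):

    ```python
    import numpy as np

    def svm_sgd(X, y, lam=0.01, epochs=100):
        n, d = X.shape
        w, t = np.zeros(d), 0
        for _ in range(epochs):
            for i in np.random.permutation(n):
                t += 1
                eta = 1.0 / (lam * t)       # decaying step size
                if y[i] * X[i].dot(w) < 1:  # hinge active: full sub-gradient
                    w = (1 - eta * lam) * w + eta * y[i] * X[i]
                else:                       # hinge flat: regularizer term only
                    w = (1 - eta * lam) * w
        return w
    ```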
  • hiroyuki-kasai/OLSTEC

    OnLine Low-rank Subspace tracking by TEnsor CP Decomposition in Matlab: Version 1.0.1

    Language: MATLAB
  • lixilinx/psgd_tf

    Tensorflow implementation of preconditioned stochastic gradient descent

    Language: Python
  • xcsf-dev/xcsf

    XCSF learning classifier system: rule-based online evolutionary machine learning

    Language: C
  • sibirbil/SMB

    Stochastic gradient descent with model building

    Language: Python
  • harshraj11584/Paper-Implementation-Overview-Gradient-Descent-Optimization-Sebastian-Ruder

    [Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder

    Language: Python
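
    As a taste of the variants the survey covers, classical momentum in a few lines of NumPy:

    ```python
    import numpy as np

    def momentum_step(w, grad, v, lr=0.01, mu=0.9):
        # Accumulate an exponentially decaying velocity, then move along it.
        v = mu * v - lr * grad
        return w + v, v
    ```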
  • gcampanella/pydata-london-2018

    Slides and notebooks for my tutorial at PyData London 2018

    Language: Jupyter Notebook
  • Vercaca/NN-Backpropagation

    Implementation of a neural network trained with backpropagation in Python

    Language: Python
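
    The essence of such a network, as a self-contained NumPy sketch; the architecture and hyper-parameters here are illustrative assumptions, not the repository's:

    ```python
    import numpy as np

    # One-hidden-layer network trained by backpropagation on XOR.
    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([[0], [1], [1], [0]], float)
    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
    sig = lambda z: 1 / (1 + np.exp(-z))

    lr = 0.5
    for _ in range(10000):
        h = sig(X @ W1 + b1)                 # forward pass
        out = sig(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)  # backward pass: chain rule
        d_h = (d_out @ W2.T) * h * (1 - h)   # through both sigmoid layers
        W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)  # gradient updates
        W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

    print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
    ```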
  • wajidarshad/LUPI-SVM

    SVM with Learning Using Privileged Information (LUPI) framework

    Language: Python
  • evarae/CNN_Tutorial

    Hi! Thanks for checking out my tutorial, where I walk you through coding a convolutional neural network in Java from scratch. After building a network for a university assignment, I decided to create a tutorial to (hopefully) help others do the same and improve my own understanding of neural networks.

    Language: Java
  • rafi007akhtar/Digit-Classifier

    A neural network that uses fundamental deep learning algorithms to identify handwritten digits from the MNIST dataset.

    Language: Jupyter Notebook
  • bhattbhavesh91/gradient-descent-variants

    My implementation of the batch, stochastic, and mini-batch gradient descent algorithms in Python

    Language: Jupyter Notebook
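
    The three variants differ only in how much data each update sees. A minimal NumPy sketch for linear least squares (the toy loss and function names are assumptions):

    ```python
    import numpy as np

    def batch_step(w, X, y, lr):
        # Full-batch gradient of the mean squared error.
        return w - lr * X.T @ (X @ w - y) / len(y)

    def sgd_step(w, X, y, lr):
        # One randomly chosen sample per update: cheap but noisy.
        i = np.random.randint(len(y))
        return w - lr * (X[i] @ w - y[i]) * X[i]

    def minibatch_step(w, X, y, lr, batch=32):
        # A random mini-batch trades gradient noise against cost per step.
        idx = np.random.choice(len(y), size=min(batch, len(y)), replace=False)
        Xb, yb = X[idx], y[idx]
        return w - lr * Xb.T @ (Xb @ w - yb) / len(yb)
    ```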
  • HegdeChaitra/Yelp-Recommendation-System

    Recommends restaurants to users based on the ratings they have given to restaurants

    Language: Jupyter Notebook
  • Cr4ckC4t/neural-network-from-scratch

    A basic neural network with backpropagation programmed from scratch in C++

    Language: C++
  • hiroyuki-kasai/SimpleDeepNetToolbox

    Simple MATLAB toolbox for deep learning network: Version 1.0.3

    Language: MATLAB
  • chenpf1025/SLN

    ICLR 2021: Noise against noise: stochastic label noise helps combat inherent label noise

    Language: Python
  • kcg2015/DDPG_numpy_only

    Implementation of DDPG with NumPy only (without TensorFlow)

    Language: Python
  • mmahesh/variants-of-rmsprop-and-adagrad

    SC-Adagrad, SC-RMSProp, and RMSProp algorithms for training deep networks, as proposed in the accompanying paper

    Language: Python
  • hpca-uji/PyDTNN

    PyDTNN - Python Distributed Training of Neural Networks

    Language: Python
  • saadlabyad/aslsd

    Parametric estimation of multivariate Hawkes processes with general kernels.

    Language: Python
  • Adamdad/Filter-Gradient-Decent

    In this paper, we propose Filter Gradient Descent (FGD), an efficient stochastic optimization algorithm that forms a consistent estimate of the local gradient by solving an adaptive filtering problem with different filter designs.

    Language: Python
  • ldv1/bbvi_spike_and_slab

    Black-box spike and slab variational inference, example with linear models

    Language: Python
  • ttungl/Deep-Learning

    Deep learning techniques implemented with Google TensorFlow: deep neural networks with a fully connected layer trained using SGD and ReLUs; regularization of a multi-layer network with ReLUs, L2 regularization, and dropout to prevent overfitting; convolutional neural networks (CNNs) with learning-rate decay and dropout; and recurrent neural networks (RNNs) for text and sequences using Long Short-Term Memory (LSTM) networks.

    Language: Jupyter Notebook