l1-regularization

There are 45 repositories under the l1-regularization topic.

  • foolwood/pytorch-slimming

Learning Efficient Convolutional Networks through Network Slimming, ICCV 2017.

    Language: Python
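
Network slimming works by adding an L1 penalty to the BatchNorm scale factors (gamma) during training, so that unimportant channels shrink toward zero and can later be pruned. A minimal PyTorch sketch of that idea (not the repository's actual code; `l1_lambda` is an illustrative hyperparameter):

```python
import torch
import torch.nn as nn

def add_bn_l1_subgradient(model: nn.Module, l1_lambda: float = 1e-4):
    """Add the subgradient of l1_lambda * |gamma| to every BatchNorm scale factor.

    Call after loss.backward() and before optimizer.step().
    """
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            # d/dgamma of l1_lambda * |gamma| is l1_lambda * sign(gamma)
            m.weight.grad.add_(l1_lambda * torch.sign(m.weight.data))
```

After sparsity training, channels whose gamma is near zero are removed and the slimmed network is fine-tuned.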
  • carnotresearch/cr-sparse

    Functional models and algorithms for sparse signal processing

    Language: Jupyter Notebook
  • rfeinman/pytorch-lasso

    L1-regularized least squares with PyTorch

    Language: Python
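
The underlying problem is the lasso, min_w 0.5‖Xw − y‖² + α‖w‖₁, typically solved with proximal methods built on soft-thresholding. A minimal ISTA sketch in plain PyTorch (illustrative only; `ista`, `alpha`, and `n_iter` are not the repository's API):

```python
import torch

def soft_threshold(z, t):
    # prox of t*||.||_1: shrink toward zero by t, zeroing small entries
    return torch.sign(z) * torch.clamp(z.abs() - t, min=0.0)

def ista(X, y, alpha=0.1, n_iter=200):
    """Minimize 0.5*||Xw - y||^2 + alpha*||w||_1 by proximal gradient (ISTA)."""
    w = torch.zeros(X.shape[1])
    step = 1.0 / torch.linalg.matrix_norm(X, ord=2) ** 2  # 1/L with L = ||X||_2^2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                           # gradient of the smooth part
        w = soft_threshold(w - step * grad, step * alpha)  # proximal step
    return w
```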
  • sandipanpaul21/Logistic-regression-in-python

The logistic regression technique in machine learning, with both theory and Python code. Covers assumptions, multi-class classification, regularization (L1 and L2), weight of evidence, and information value.

    Language: Jupyter Notebook
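
For reference, scikit-learn exposes L1-penalized logistic regression directly; the 'liblinear' and 'saga' solvers both support penalty='l1'. A minimal example on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: 20 features, only 5 informative
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)

# Smaller C means a stronger L1 penalty and a sparser coefficient vector
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print("nonzero coefficients:", (clf.coef_ != 0).sum())
```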
  • mansipatel2508/Network-Intrusion-Detection-with-Feature-Extraction-ML

Given information about a network connection, the model predicts whether the connection contains an intrusion. Binary classification of good and bad connections is extended to multi-class classification, with feature importance analysis as the most prominent component.

    Language: Jupyter Notebook
  • EliaFantini/Image-Reconstructor-FISTA-proximal-method-on-wavelets-transform

    An image reconstructor that applies the fast proximal gradient method (FISTA) to the wavelet transform of an image, using L1 and total variation (TV) regularization.

    Language: Python
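
FISTA accelerates the proximal gradient method with an extrapolation (momentum) step while keeping the same soft-thresholding prox for the L1 term. A minimal NumPy sketch on a generic problem min_x 0.5‖Ax − b‖² + λ‖x‖₁ (the repository applies this to wavelet coefficients of an image; `lam` and `n_iter` here are illustrative):

```python
import numpy as np

def fista(A, b, lam=0.1, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via FISTA."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        g = z - A.T @ (A @ z - b) / L          # gradient step at the extrapolated point
        x_new = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-thresholding prox
        t_new = (1 + np.sqrt(1 + 4 * t ** 2)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)                # momentum step
        x, t = x_new, t_new
    return x
```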
  • malena1906/Pruning-Weights-with-Biobjective-Optimization-Keras

Overparameterization and overfitting are common concerns when designing and training deep neural networks. Network pruning is an effective strategy for reducing network complexity, but it often suffers from time- and compute-intensive procedures to identify the most important connections and the best-performing hyperparameters. We suggest a pruning strategy that is fully integrated into the training process and requires only marginal extra computational cost. The method relies on unstructured weight pruning, reinterpreted as a multiobjective learning problem. A batchwise pruning strategy is compared across different optimization methods, one of which is a multiobjective optimization algorithm. Because that algorithm takes over the weighting of the objective functions, it greatly reduces the time-consuming hyperparameter search that every neural network training suffers from. Without any a priori training, post-training, or parameter fine-tuning, we achieve substantial reductions of the dense layers of two commonly used convolutional neural networks (CNNs) with only a marginal loss of performance. Our results empirically demonstrate that dense layers are overparameterized: with up to 98% of their edges removed, they provide almost the same results. This contradicts the view that retraining after pruning is of great importance, and it opens new insights into the use of multiobjective optimization techniques for machine learning in a Keras framework.

    The Stochastic Multi-Gradient Descent algorithm implementation in Python 3 is for use with Keras and is adapted from the paper by S. Liu and L. N. Vicente, "The stochastic multi-gradient algorithm for multi-objective optimization and its application to supervised machine learning". It is combined with weight-pruning strategies to reduce network complexity and inference time.

    Language: Python
  • JasonSloan/yolov8-prune

YOLOv8 pruning based on constraining BN-layer gamma values.

    Language: Python
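
After sparsity training, BN-gamma-based pruning typically ranks all gamma magnitudes and removes channels below a global threshold. A hypothetical sketch of that selection step (not the repository's actual pruning code; `prune_ratio` is illustrative):

```python
import torch
import torch.nn as nn

def bn_channel_masks(model: nn.Module, prune_ratio: float = 0.5):
    """Keep-masks for BN channels: True where |gamma| exceeds a global threshold."""
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    threshold = gammas.sort().values[int(len(gammas) * prune_ratio)]
    return {name: m.weight.data.abs() > threshold
            for name, m in model.named_modules() if isinstance(m, nn.BatchNorm2d)}
```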
  • jaydu1/SparsePortfolio

    High Dimensional Portfolio Selection with Cardinality Constraints

    Language: Python
  • Arijit-datascience/CNN_BatchNormalization_Regularization

    MNIST Digit Prediction using Batch Normalization, Group Normalization, Layer Normalization and L1-L2 Regularizations

    Language: Jupyter Notebook
  • DolbyUUU/regression_algorithm_implementation_python

Regression algorithm implementations from scratch in Python (OLS, LASSO, Ridge, robust regression).

    Language: Python
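
A from-scratch LASSO commonly uses coordinate descent: each coefficient is updated in turn by soft-thresholding its partial-residual correlation. A minimal NumPy sketch under the assumption of standardized columns (not the repository's implementation):

```python
import numpy as np

def lasso_cd(X, y, alpha=0.1, n_iter=100):
    """Minimize (1/(2n))*||y - Xw||^2 + alpha*||w||_1 by cyclic coordinate descent."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(d):
            r_j = y - X @ w + X[:, j] * w[j]   # residual with coordinate j removed
            rho = X[:, j] @ r_j
            w[j] = np.sign(rho) * max(abs(rho) - n * alpha, 0.0) / col_sq[j]
    return w
```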
  • ivannz/l1_tf

    A wrapper for L1 trend filtering via primal-dual algorithm by Kwangmoo Koh, Seung-Jean Kim, and Stephen Boyd

    Language: C
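
L1 trend filtering solves min_x 0.5‖y − x‖² + λ‖Dx‖₁, where D is the second-difference operator, producing a piecewise-linear trend estimate. A minimal CVXPY sketch of the same objective (the repository wraps a specialized primal-dual C solver instead; `lam` is illustrative):

```python
import cvxpy as cp
import numpy as np

y = np.cumsum(np.random.randn(200))    # noisy random-walk signal
lam = 50.0

x = cp.Variable(len(y))
# norm1 of the second differences encourages piecewise-linear solutions
objective = 0.5 * cp.sum_squares(y - x) + lam * cp.norm1(cp.diff(x, 2))
cp.Problem(cp.Minimize(objective)).solve()
trend = x.value                        # piecewise-linear trend estimate
```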
  • aliyzd95/Optimization-and-Regularization-from-scratch

    Implementation of optimization and regularization algorithms in deep neural networks from scratch

    Language: Python
  • Arek-KesizAbnousi/LASSO-vs-SCAD-vs-MCP_Estimators

Comparing three penalized least squares estimators: LASSO, SCAD, and MCP.

    Language: R
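
The three penalties differ in how they treat large coefficients: the LASSO penalty grows linearly forever, while SCAD and MCP flatten out, which reduces bias on large signals. Minimal NumPy definitions of the penalty functions on a scalar coefficient, for illustration (a = 3.7 and gamma = 3 are conventional defaults):

```python
import numpy as np

def lasso_pen(t, lam):
    return lam * np.abs(t)

def scad_pen(t, lam, a=3.7):
    t = np.abs(t)
    return np.where(t <= lam, lam * t,
           np.where(t <= a * lam, (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
                    lam**2 * (a + 1) / 2))       # constant beyond a*lam

def mcp_pen(t, lam, gamma=3.0):
    t = np.abs(t)
    return np.where(t <= gamma * lam, lam * t - t**2 / (2 * gamma),
                    gamma * lam**2 / 2)          # constant beyond gamma*lam
```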
  • federicoarenasl/Regularization-techniques-on-NNs

This study explores the different regularisation methods that can be used to address the problem of overfitting in a given neural network architecture, using the balanced EMNIST dataset.

    Language: Jupyter Notebook
  • giamuhammad/neuralnet_optimization_based_on_ga

Forecasting on the AirQuality UCI dataset with a conjugate-gradient artificial neural network, using L1-regularized feature selection and a genetic algorithm for parameter optimization.

    Language: Jupyter Notebook
  • zhangyongheng78/Mathematical-Machine-Learning-Algorithm-Implementations

    Mathematical machine learning algorithm implementations

    Language: Python
  • ayan-chattaraj/house_price_prediction

An assignment executed for a US-based housing company named Surprise Housing, in which a regression model with regularisation was used to predict the actual value of prospective properties and decide whether to invest in them.

    Language: Jupyter Notebook
  • darshil2848/House-Price-Prediction

    House Price Analysis and Sales Price Prediction

    Language: Jupyter Notebook
  • Manjurkreddy/EndtoendML

Using this ML file, anyone can develop a regression machine learning algorithm; nearly all common regression models are included. Use this file to pick the best machine learning algorithm. Cheers, Manju.

    Language: Jupyter Notebook
  • pjkresearch/Sorted-L1-Norm

A minimum working example of using the Sorted L1 Norm in a regression and mean-variance framework. The code is free to use for research purposes only, with proper citation; commercial use is strictly forbidden and the rights remain with the authors. For citation purposes, please refer to the JBF version: https://www.sciencedirect.com/science/article/abs/pii/S0378426619302614

    Language: MATLAB
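
The sorted L1 norm (the SLOPE penalty) with nonincreasing weights λ₁ ≥ … ≥ λ_d is Σᵢ λᵢ|x|₍ᵢ₎, pairing the largest weight with the largest coefficient in magnitude. A minimal NumPy evaluation, for illustration only:

```python
import numpy as np

def sorted_l1_norm(x, lam):
    """Sum of lam_i * |x|_(i), with both sequences sorted in decreasing order."""
    return np.sum(np.sort(np.abs(x))[::-1] * np.sort(lam)[::-1])

# The penalty is invariant to permutations of x: only sorted magnitudes matter
print(sorted_l1_norm(np.array([3.0, -1.0, 0.5]), np.array([1.0, 0.5, 0.25])))
```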
  • saisriramyerubandi/LogisticRegression

Classification using logistic regression built as a neural network model. The project also compares model performance when different regularization techniques are used.

    Language: Jupyter Notebook
  • SakshithReddyChintala/Multiclass_logistics_classification_pipeline

    Multiclass Logistic, Classification Pipeline, Cross Validation, Gradient Descent, Regularization

    Language: Jupyter Notebook
  • Tim907/oblivious_sketching_varreglogreg

    This is the accompanying code repository for the ICLR 2023 publication "Almost Linear Constant-Factor Sketching for 𝓁₁ and Logistic Regression" by Alexander Munteanu, Simon Omlor and David P. Woodruff.

    Language: Jupyter Notebook
  • barisgudul/Ridge_vs_Lasso_Analysis

    This project compares the effects of Ridge (L2) and Lasso (L1) regression models on clinical data.

    Language: Jupyter Notebook
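
The qualitative difference such a comparison highlights: Lasso sets coefficients exactly to zero while Ridge only shrinks them. A minimal scikit-learn illustration on synthetic data (not the repository's clinical dataset):

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = X[:, 0] * 3.0 + rng.normal(scale=0.5, size=100)  # only feature 0 matters

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)
print("ridge nonzeros:", (np.abs(ridge.coef_) > 1e-8).sum())  # typically all 10
print("lasso nonzeros:", (np.abs(lasso.coef_) > 1e-8).sum())  # typically 1
```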
  • DavidAlmagro/WineQuality_Regularization

    Predicting wine quality scores using regression models with L1 and L2 regularisation techniques

    Language: Jupyter Notebook
  • devosmitachatterjee2018/Statistical_Learning_for_Big_Data_Report12062020

The project encompasses the statistical analysis of high-dimensional data using different classification, feature selection, clustering, and dimension reduction techniques.

    Language: Python
  • FabrizioMusacchio/L1_and_L2_regularization

    This repository contains the code for the blog post on Understanding L1 and L2 regularization in machine learning. For further details, please refer to this post.

    Language: Python
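
In a training loop, the two penalties are usually added straight to the data loss: λ₁Σ|w| for L1 and λ₂Σw² for L2. A minimal PyTorch sketch with illustrative coefficients:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
criterion = nn.MSELoss()
x, y = torch.randn(32, 10), torch.randn(32, 1)

data_loss = criterion(model(x), y)
l1_penalty = sum(p.abs().sum() for p in model.parameters())   # sum of |w|
l2_penalty = sum(p.pow(2).sum() for p in model.parameters())  # sum of w^2
loss = data_loss + 1e-4 * l1_penalty + 1e-4 * l2_penalty
loss.backward()
```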
  • hwixley/EMNIST-NeuralNet-Regularisation-Experiments

A study of the problem of overfitting in deep neural networks, and how it can be detected and prevented, using the EMNIST dataset. This was done through experiments with depth and width, dropout, L1 & L2 regularization, and Maxout networks.

    Language: Jupyter Notebook
  • hwixley/MLP-coursework1-report

Machine Learning Practical, Coursework 1 Report: a study of the problem of overfitting in deep neural networks, and how it can be detected and prevented, using the EMNIST dataset. This was done through experiments with depth and width, dropout, L1 & L2 regularization, and Maxout networks.

    Language: TeX
  • jianninapinto/COVID-19-Detection-using-Multilayer-Perceptron-Neural-Network

    Used a Multilayer Perceptron (MLP) neural network to detect COVID-19 in lung scans.

    Language: Jupyter Notebook
  • saritha28/Logistic-Regression

    Implementation of Logistic Regression with L1 Regularization from scratch

    Language: Jupyter Notebook
  • tuhinaprasad28/Multiclass-Logistics-and-Classification-Pipelines

    Multiclass Logistic, Classification Pipeline, Cross Validation, Gradient Descent, Regularization

    Language: Jupyter Notebook
  • unnatibshah/Time-Series-Classification-Part2

Time Series Classification, Part 2: Binary and Multiclass Classification. An interesting task in machine learning is the classification of time series. In this problem, we classify human activities based on time series obtained from a Wireless Sensor Network.

    Language: Jupyter Notebook
  • yixiao-zeng/DKLasso

    This repository is for the development version of DKLasso and DKLasso+.

    Language: Jupyter Notebook