dense-layers
There are 13 repositories under the dense-layers topic.
malena1906/Pruning-Weights-with-Biobjective-Optimization-Keras
Overparameterization and overfitting are common concerns when designing and training deep neural networks. Network pruning is an effective strategy for reducing or limiting network complexity, but it often requires time- and compute-intensive procedures to identify the most important connections and the best-performing hyperparameters. We suggest a pruning strategy that is fully integrated into the training process and requires only marginal extra computational cost. The method relies on unstructured weight pruning, reinterpreted in a multiobjective learning approach. A batchwise pruning strategy is compared across different optimization methods, one of which is a multiobjective optimization algorithm. Because this algorithm takes over the choice of the weighting of the objective functions, it greatly reduces the time-consuming hyperparameter search that every neural network training suffers from. Without any a priori training, post-training, or parameter fine-tuning, we achieve large reductions of the dense layers of two commonly used convolutional neural networks (CNNs) with only a marginal loss of performance. Our results empirically demonstrate that dense layers are overparameterized: with up to 98% of their edges removed, they provide almost the same results. We challenge the view that retraining after pruning neural networks is of great importance, and we open new insights into the use of multiobjective optimization techniques in machine learning algorithms within a Keras framework. The Stochastic Multi-Gradient Descent Algorithm implementation in Python 3 is intended for use with Keras and is adapted from the paper by S. Liu and L. N. Vicente, "The stochastic multi-gradient algorithm for multi-objective optimization and its application to supervised machine learning". It is combined with weight pruning strategies to reduce network complexity and inference time.
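A minimal sketch, assuming TensorFlow/Keras and a 98% sparsity target, of what unstructured magnitude-based pruning of a dense layer can look like; the toy CNN and the layer name `dense_big` are illustrative assumptions, not the repository's code or the stochastic multi-gradient method itself:

```python
import numpy as np
from tensorflow import keras

# Small CNN with an intentionally large dense layer (illustrative, not the repo's model).
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(32, kernel_size=3, activation="relu"),
    keras.layers.MaxPooling2D(pool_size=2),
    keras.layers.Flatten(),
    keras.layers.Dense(512, activation="relu", name="dense_big"),
    keras.layers.Dense(10, activation="softmax"),
])

def prune_dense_layer(layer, sparsity=0.98):
    """Zero out the smallest-magnitude weights of a Dense layer (unstructured pruning)."""
    weights, biases = layer.get_weights()
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    layer.set_weights([weights * mask, biases])
    return mask  # reapply the mask after each training batch so pruned edges stay at zero

mask = prune_dense_layer(model.get_layer("dense_big"))
```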
swap-253/Twitter-US-Airline-Sentiment-Analysis
In this repository I have used six different NLP models to predict user sentiment from Twitter reviews of airlines. The dataset is Twitter US Airline Sentiment. The best models from ML and from DL have been deployed. The pipeline employs text preprocessing.
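A minimal sketch, not the repository's code, of the kind of tweet cleaning and dense-layer sentiment classifier such a pipeline might use; the example tweets, vocabulary size, and layer widths are illustrative assumptions:

```python
import re
from tensorflow import keras

def clean_tweet(text):
    """Lowercase, strip @mentions and URLs, keep letters only."""
    text = text.lower()
    text = re.sub(r"@\w+|https?://\S+", " ", text)
    return re.sub(r"[^a-z\s]", " ", text).strip()

tweets = ["@airline my flight was delayed again :(", "Great crew, smooth landing!"]

# TF-IDF features from the cleaned tweets.
vectorizer = keras.layers.TextVectorization(max_tokens=10_000, output_mode="tf_idf")
vectorizer.adapt([clean_tweet(t) for t in tweets])
features = vectorizer([clean_tweet(t) for t in tweets])

# Simple dense classifier over three sentiment classes.
model = keras.Sequential([
    keras.Input(shape=(features.shape[-1],)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dropout(0.3),
    keras.layers.Dense(3, activation="softmax"),  # negative / neutral / positive
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```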
copev313/Chatbot-Using-Deep-Learning
We build a chatbot using machine learning and natural language processing.
its-Kumar/signlang_project
Major Project in Final Year B.Tech (IT). Live Stream Sign Language Detection using Deep Learning.
jarred13/Predicting_Quality_of_Wine
Using data to help us choose high-quality wine
komfysach/fraud-detection
Fraud Classification using Deep Learning Techniques
sultanazhari/verify-a-person-s-age-by-the-face
A supermarket chain called Good Seed wanted to see if Data Science could help them comply with the law by ensuring that they did not sell age-restricted products to underage customers. My task was to build and evaluate a model to verify a person's age.
Kalyani011/NLP-Textual_Similarity
Implementation and Comparison of Multiclass Synonyms Equivalence Classifiers based on Textual Similarity Metrics using Keras
RobertRusev/NLP-FinHeadlines-MoodTracker
NLP-FinHeadlines-MoodTracker is an NLP project applying sentiment analysis to financial news headlines. It employs a combination of CNN and LSTM layers to predict sentiment (positive, negative, neutral). The model incorporates an embedding layer, 1D convolution, max pooling, a bidirectional LSTM, dropout, and a dense layer for sentiment classification.
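A minimal Keras sketch of the architecture described above (embedding, 1D convolution, max pooling, bidirectional LSTM, dropout, dense softmax over three classes); the vocabulary size, sequence length, and layer widths are illustrative assumptions, not the repository's exact settings:

```python
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(64,)),                               # tokenised headline of length 64
    keras.layers.Embedding(input_dim=20_000, output_dim=128),
    keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    keras.layers.MaxPooling1D(pool_size=2),
    keras.layers.Bidirectional(keras.layers.LSTM(64)),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(3, activation="softmax"),            # positive / negative / neutral
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```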
hmcalister/MNIST-Machine-Learning-Investigation
A beginner's investigation into the world of neural networks, using the MNIST image dataset
KatameRonin/AutoEncoders
Implementations of different types of AutoEncoders
Ninad077/Deep_Learning-Convolutional_Neural_Networks_CNN
Content: structure of a CNN, convolutional layer, pooling layer, fully connected (dense) layer, output layer, image classification, creating, compiling, and training the model over epochs, and testing the model with Gradio.
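A minimal Keras sketch of the layer sequence listed above (convolution, pooling, flatten, dense, output) for image classification; the input size, filter counts, and class count are illustrative assumptions:

```python
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(32, kernel_size=3, activation="relu"),   # convolutional layer
    keras.layers.MaxPooling2D(pool_size=2),                      # pooling layer
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),                  # fully connected / dense layer
    keras.layers.Dense(10, activation="softmax"),                # output layer
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=5, validation_split=0.1)   # training over epochs
```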