Deep-Learning-DBN-From-the-ground-up

This multi-file project is part of my M.Sc. dissertation thesis. The repository contains the full background theory of Deep Belief Networks and of machine learning with Artificial Neural Networks, together with the examples that put this theory to use.

Primary language: MATLAB · License: Other (NOASSERTION)

This M.Sc. thesis deals with the study and implementation of Deep Belief Networks, covering both their theoretical background and their practical realization. Our aim is to investigate and analyze the theoretical foundations of Deep Belief Networks, starting from machine learning theory in the field of Artificial Neural Networks and completing the implementation at the algorithmic level. The training procedure combines the Greedy Layer-Wise Unsupervised Pre-Training and Semi-Supervised Fine-Tuning techniques, which cover the initialization and optimization of the synaptic weights using a small part of the database's training patterns.

In studying Deep Belief Networks, we analyze all the methods that contribute to the structure of a deep learning network, detailing the individual techniques each one involves. In particular, we develop the theory of the Metropolis-Hastings, Gibbs Sampling, and Simulated Annealing techniques, as well as their origins. Starting from the unsupervised Hopfield network and applying the above techniques, we derive the Boltzmann Machine as its stochastic counterpart. By restricting the network's connectivity we obtain Restricted Boltzmann Machines, which capture high-order regularities of the probability density function of the input patterns; stacking them yields the final structure of the Deep Belief Network.

As a proof of concept, we develop a Deep Belief Network and evaluate it on the binarized MNIST database of handwritten digits. We also review the recent techniques that have been applied to this database, together with their reported recognition errors. Finally, we present the results of our algorithm, implemented in MATLAB, which achieves final in-sample and out-of-sample error rates of 0.19% and 1.7%, respectively.
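Since the abstract describes the training procedure only in prose, here is a minimal MATLAB sketch of the contrastive-divergence (CD-1) update that the Gibbs-sampling discussion leads to. This is not the thesis code: the function name, variable names, and shapes are assumptions made for illustration only.

```matlab
function [W, b, c] = rbm_cd1_update(W, b, c, v0, eta)
% One CD-1 update for a binary Restricted Boltzmann Machine (sketch).
%   W   : nVisible x nHidden weight matrix
%   b,c : 1 x nVisible and 1 x nHidden bias row vectors
%   v0  : nSamples x nVisible batch of binary training patterns
%   eta : learning rate

    % Positive phase: hidden probabilities and sampled states given the data.
    h0prob = sigm(bsxfun(@plus, v0 * W, c));       % P(h = 1 | v0)
    h0     = double(h0prob > rand(size(h0prob)));  % stochastic binary states

    % One Gibbs step: reconstruct the visibles, then recompute hidden probs.
    v1prob = sigm(bsxfun(@plus, h0 * W', b));      % P(v = 1 | h0)
    v1     = double(v1prob > rand(size(v1prob)));
    h1prob = sigm(bsxfun(@plus, v1 * W, c));       % P(h = 1 | v1)

    % Contrastive-divergence gradient estimates, averaged over the batch.
    n = size(v0, 1);
    W = W + eta * (v0' * h0prob - v1' * h1prob) / n;
    b = b + eta * mean(v0 - v1, 1);
    c = c + eta * mean(h0prob - h1prob, 1);
end

function y = sigm(x)
    y = 1 ./ (1 + exp(-x));  % logistic sigmoid
end
```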
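Building on the hypothetical function above, the greedy layer-wise pre-training of the stacked RBMs could then be sketched as follows, again with all sizes and names assumed for illustration (e.g. 784 visible units for 28x28 MNIST images):

```matlab
% X: nSamples x 784 matrix of binarized MNIST images (values in {0,1}).
layerSizes = [784 500 500];          % visible layer plus two hidden layers
data = X;
rbms = cell(1, numel(layerSizes) - 1);

for l = 1:numel(layerSizes) - 1
    nV = layerSizes(l);  nH = layerSizes(l + 1);
    W = 0.01 * randn(nV, nH);        % small random initial weights
    b = zeros(1, nV);  c = zeros(1, nH);

    for epoch = 1:10                 % a few CD-1 sweeps over the data
        [W, b, c] = rbm_cd1_update(W, b, c, data, 0.1);
    end
    rbms{l} = struct('W', W, 'b', b, 'c', c);

    % Propagate the data upward: the hidden probabilities of this RBM
    % become the "visible" training data for the next one.
    data = 1 ./ (1 + exp(-bsxfun(@plus, data * W, c)));
end
```

A practical run would use mini-batches, momentum, and weight decay, and would be followed by the semi-supervised fine-tuning stage mentioned above; this sketch only illustrates the greedy stacking scheme.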