Pinned Repositories
Deep-Learning-Pizza-Transformer
Welcome to Attention to Shakespeare! Designed a Pizza-structured Transformer that converts modern English into Shakespearean style, and compared it against other Seq2Seq attention models.
Generative-Neural-Networks
RBM + VAE + GAN
Operating-Systems-Pintos-Threads
Operating-Systems-Pintos-UserPrograms
Unsupervised-Learning-using-Autoencoders
K-Means and Gaussian Mixture Models (GMM) were used to cluster the Fashion-MNIST dataset. To address the large number of features, an autoencoder was used for dimensionality reduction: it compresses the images (a lossy compression), but the reduced dimensionality lets K-Means and GMM run far more efficiently. Results are compared across experiments with all three techniques.
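The pipeline can be sketched with scikit-learn alone: an MLP trained to reconstruct its own input acts as a shallow autoencoder, and its bottleneck activations become the reduced features. Everything here is illustrative, not the project's actual setup: the random 100×784 matrix stands in for flattened Fashion-MNIST images, and the 16-unit bottleneck and 5 clusters are arbitrary choices.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.random((100, 784))  # stand-in for flattened Fashion-MNIST images

# An MLP trained to reproduce its input is a shallow autoencoder;
# the 16-unit hidden layer is the lossy bottleneck.
ae = MLPRegressor(hidden_layer_sizes=(16,), activation="relu",
                  max_iter=50, random_state=0)
ae.fit(X, X)

# Recover the bottleneck codes from the learned encoder weights.
codes = np.maximum(0.0, X @ ae.coefs_[0] + ae.intercepts_[0])

# Cluster the 16-D codes instead of the 784-D raw pixels.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(codes)
gmm = GaussianMixture(n_components=5, covariance_type="diag",
                      random_state=0).fit_predict(codes)
print(codes.shape, km.shape, gmm.shape)
```

Clustering the low-dimensional codes is what makes K-Means and GMM tractable on image data; the price is whatever detail the bottleneck discards.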
Neural-Networks-MNIST-Digits
FFNN + CNN
Graphical-Models
Converted a Bayesian network into a moral graph, a triangulated graph, and a junction tree. Performed inference using variable elimination and belief propagation. Finally, used MCMC sampling to verify the joint probabilities.
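On a toy chain-structured network, variable elimination is just a sequence of sum-products. The numbers below are made up for illustration, and the brute-force joint plays the role the MCMC check played in the project:

```python
import numpy as np

# Toy Bayesian network A -> B -> C over binary variables.
pA = np.array([0.6, 0.4])                    # P(A)
pB_A = np.array([[0.7, 0.3],
                 [0.2, 0.8]])                # P(B|A), row = value of A
pC_B = np.array([[0.9, 0.1],
                 [0.5, 0.5]])                # P(C|B), row = value of B

# Variable elimination: sum out A first, then B, to get P(C).
pB = pA @ pB_A                               # eliminate A
pC = pB @ pC_B                               # eliminate B

# Full joint P(A,B,C) as an exact cross-check on the marginal.
joint = pA[:, None, None] * pB_A[:, :, None] * pC_B[None, :, :]
print(pC, joint.sum(axis=(0, 1)))
```

Eliminating variables one at a time keeps every intermediate factor small, which is exactly why it beats building the full joint on larger networks.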
Image-stitching-using-RANSAC-algorithm
Keypoints (SIFT or KAZE descriptors) were used to estimate the homography matrix with a custom RANSAC algorithm. Implemented in Python using OpenCV.
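A dependency-free sketch of the RANSAC stage (the project itself used OpenCV) might look like the following. The point set and homography are synthetic, and the 4-point solver is the standard textbook Direct Linear Transform, not necessarily the repo's implementation:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: fit H mapping src -> dst (4+ points)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def ransac_homography(src, dst, n_iter=500, thresh=2.0, seed=0):
    """Fit H to random 4-point samples; keep the fit with the most
    inliers under the reprojection-error threshold."""
    rng = np.random.default_rng(seed)
    ones = np.ones((len(src), 1))
    best_H, best_inliers = None, 0
    for _ in range(n_iter):
        idx = rng.choice(len(src), size=4, replace=False)
        H = homography_dlt(src[idx], dst[idx])
        proj = np.hstack([src, ones]) @ H.T
        proj = proj[:, :2] / proj[:, 2:3]
        n_in = int((np.linalg.norm(proj - dst, axis=1) < thresh).sum())
        if n_in > best_inliers:
            best_H, best_inliers = H, n_in
    return best_H, best_inliers

# Synthetic correspondences generated from a known homography.
true_H = np.array([[1.2, 0.1, 5.0],
                   [0.0, 0.9, -3.0],
                   [1e-3, 0.0, 1.0]])
src = np.random.default_rng(1).uniform(0, 100, (40, 2))
p = np.hstack([src, np.ones((40, 1))]) @ true_H.T
dst = p[:, :2] / p[:, 2:3]
H, n_inliers = ransac_homography(src, dst)
print(n_inliers)
```

In the real stitching pipeline, `src`/`dst` come from matched SIFT or KAZE keypoints, and the winning H warps one image into the other's frame.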
Bayesian-Decision-and-SVM
Classified MNIST digits using Bayesian decision theory, and implemented a Support Vector Machine with 3-fold cross-validation on the same dataset.
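The cross-validation setup reduces to a few lines in scikit-learn. The bundled 8×8 digits set stands in for full MNIST here, and the RBF kernel and C value are illustrative defaults, not necessarily the project's:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)   # 8x8 digits, a small MNIST stand-in
# 3-fold cross-validation of an RBF-kernel SVM.
scores = cross_val_score(SVC(kernel="rbf", C=1.0), X, y, cv=3)
print(scores.mean())
```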
Denoising-Image
Denoising an image using a median filter
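With SciPy the filter itself is one call; the flat gray image with 5% salt-and-pepper noise below is a synthetic test case, not the project's data:

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
clean = np.full((64, 64), 128, dtype=np.uint8)   # flat gray test image
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.05            # 5% impulse noise
noisy[mask] = rng.choice(np.array([0, 255], dtype=np.uint8),
                         size=int(mask.sum()))

denoised = median_filter(noisy, size=3)          # 3x3 median window

noisy_err = np.abs(noisy.astype(int) - clean).mean()
den_err = np.abs(denoised.astype(int) - clean).mean()
print(noisy_err, den_err)
```

The median is well suited to impulse noise because an isolated 0 or 255 in a 3×3 window never reaches the window's middle rank, so it is simply discarded.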
soumita0210's Repositories
soumita0210/Keywords-based-exploration-of-digital-library
soumita0210/Deep-Learning-Pizza-Transformer
soumita0210/Generative-Neural-Networks
soumita0210/Unsupervised-Learning-using-Autoencoders
soumita0210/Graphical-Models
soumita0210/Bayesian-Decision-and-SVM
soumita0210/Neural-Networks-MNIST-Digits
soumita0210/Reinforcement-Learning
Reinforcement learning is a major machine-learning paradigm alongside supervised and unsupervised learning. It is based on taking actions that maximize cumulative reward, teaching an automated agent to act based on its current state. In this project we are given an environment containing an agent and a goal, and implement three major tasks: Q-learning, policy determination, and the training process. The report's later sections detail the algorithm and implementation.
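The Q-learning update at the heart of such a project can be illustrated on a toy corridor environment; the environment, rewards, and hyperparameters here are invented for the sketch and are not the project's own:

```python
import numpy as np

# A 1-D corridor of 5 states; the goal (reward 1) is at the right end.
n_states, n_actions = 5, 2          # actions: 0 = left, 1 = right
alpha, gamma, eps = 0.5, 0.9, 0.1   # learning rate, discount, exploration
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

for episode in range(200):
    s = 0
    while s != n_states - 1:
        # Epsilon-greedy action selection over current Q estimates.
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s2 == n_states - 1 else 0.0
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
        Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
        s = s2

# Policy determination: act greedily with respect to the learned Q-table.
policy = Q.argmax(axis=1)
print(policy)
```

After training, the greedy policy points every non-terminal state toward the goal, which is exactly the "policy determination" step the description names.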
soumita0210/Operating-Systems-Pintos-UserPrograms
soumita0210/Operating-Systems-Pintos-Threads
soumita0210/Denoising-Image
soumita0210/K-Means-Segmentation
Segmenting an image using K-Means clustering
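Intensity-based segmentation amounts to clustering the flattened pixel values; the two-region synthetic image below is an illustrative stand-in for real input:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic image: dark left half (~40), bright right half (~200).
img = np.hstack([np.full((32, 16), 40.0), np.full((32, 16), 200.0)])
img += rng.normal(0.0, 5.0, img.shape)

# Cluster per-pixel intensities into k=2 segments, then reshape back.
km = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = km.fit_predict(img.reshape(-1, 1)).reshape(img.shape)
print(labels[0, 0], labels[0, -1])
```

For color images the same idea applies with (R, G, B) feature vectors per pixel instead of a single intensity.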
soumita0210/Image-stitching-using-RANSAC-algorithm
soumita0210/Template-Matching
soumita0210/Image-Filtering
Image filtering using high-pass and low-pass filters
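One common decomposition, not necessarily the repo's: a Gaussian blur as the low-pass filter and the residual as the high-pass component. The random image and sigma are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

img = np.random.default_rng(0).random((64, 64))

low = gaussian_filter(img, sigma=2.0)   # low-pass: keeps smooth structure
high = img - low                        # high-pass: keeps fine detail/edges

print(low.std(), img.std(), high.mean())
```

By construction `low + high` reproduces the original image exactly, which makes this split handy for sharpening (boost `high`) or denoising (attenuate it).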
soumita0210/Neural-Networks-on-MNIST-Fashion-Data
A neural network is a supervised classification model loosely inspired by the structure of the human brain: a network of interconnected nodes (neurons), where each edge feeds one node's output into the next node's input. Among the many architectures, the most widely used are feed-forward and recurrent neural networks. This project works mainly with various feed-forward networks, using the Fashion-MNIST dataset to compare results for a one-hidden-layer network, a multi-layer network, and a convolutional network.
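A one-hidden-layer feed-forward network of the kind compared in the project can be sketched with scikit-learn. The bundled 8×8 digits set stands in for Fashion-MNIST, and the layer size and iteration budget are illustrative choices:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)   # flattened images, like Fashion-MNIST
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# One hidden layer of 64 ReLU units; the multi-layer variant just adds
# sizes, e.g. hidden_layer_sizes=(128, 64).
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
clf.fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(acc)
```

The convolutional variant the project compares needs a deep-learning framework rather than scikit-learn, since `MLPClassifier` only builds fully connected layers.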
soumita0210/Logistic-Regression-WDBC-Dataset
Logistic regression is one of the most common classification algorithms. Regression in general maps an input vector to an output using a polynomial or basis function, and may be linear or multivariate depending on the number of features. Logistic regression determines the class to which an input vector belongs; when the classes are dichotomous (binary), it is called binary logistic regression. In this report, binary logistic regression is used to determine the malignancy status of patients in the Wisconsin Diagnostic Breast Cancer (WDBC) dataset.
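Conveniently, scikit-learn ships the WDBC data as `load_breast_cancer`, so the experiment fits in a few lines. Standardizing the features and `max_iter=1000` are routine choices for this sketch, not necessarily the report's settings:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)   # the WDBC dataset
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Binary logistic regression on standardized features:
# predicted class = benign/malignant, i.e. the malignancy status.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(Xtr, ytr)
acc = model.score(Xte, yte)
print(acc)
```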