Neil-Shah
A speech-programming enthusiast working with traditional signal processing and deep learning techniques
Gurugram, India
Pinned Repositories
GANs-for-Speech-Enhancement
A Generative Adversarial Network implemented for time-frequency based speech enhancement.
Classification-IRIS-
Classification results on the IRIS dataset are reported using 10-fold cross-validation. Among the 150 examples, some flowers are misclassified, as the accuracy results show. The code is implemented in MATLAB.
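The repo's code is in MATLAB; as a rough illustration of the same evaluation idea, here is a minimal pure-Python sketch of 10-fold cross-validation using a simple nearest-centroid classifier on synthetic two-class data (the classifier and data here are placeholders, not the repo's actual method):

```python
import random

def centroids(train):
    """Mean feature vector per class from (features, label) pairs."""
    sums, counts = {}, {}
    for x, y in train:
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(cents, x):
    """Label of the nearest centroid (squared Euclidean distance)."""
    return min(cents, key=lambda y: sum((a - b) ** 2
                                        for a, b in zip(cents[y], x)))

def kfold_accuracy(data, k=10, seed=0):
    """Average held-out accuracy over k folds."""
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test = folds[i]
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        cents = centroids(train)
        correct = sum(predict(cents, x) == y for x, y in test)
        accs.append(correct / len(test))
    return sum(accs) / k

# Synthetic stand-in for the 150-example IRIS data:
# class 0 clustered near (0, 0), class 1 near (3, 3).
rng = random.Random(1)
data = [([rng.gauss(0, 0.5), rng.gauss(0, 0.5)], 0) for _ in range(75)] + \
       [([rng.gauss(3, 0.5), rng.gauss(3, 0.5)], 1) for _ in range(75)]
print(f"10-fold accuracy: {kfold_accuracy(data):.2f}")
```

Each of the 10 folds is held out once while the model trains on the other 9, and the reported accuracy is the mean over folds, which is how per-fold misclassifications surface in the overall result.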
fairseq-exps
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
mnist-CNN
Trains a CNN on the MNIST dataset. Once the weights and activations are learned, summaries are written to the LOGDIR path (set in the code) and can be visualized in TensorBoard.
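The repo itself uses TensorFlow and tf.summary logging; as a dependency-free illustration of the core CNN building block, here is a minimal pure-Python sketch of a valid-mode 2-D convolution (a toy stand-in, not the repo's training code):

```python
def conv2d(image, kernel):
    """Valid-mode 2-D convolution (cross-correlation, as in most DL libraries)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            # Slide the kernel over the image and take the weighted sum.
            row.append(sum(image[r + i][c + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

# A 1x2 vertical-edge detector on a 4x4 image whose right half is bright.
image = [[0, 0, 1, 1]] * 4
kernel = [[-1, 1]]   # responds where intensity increases left-to-right
print(conv2d(image, kernel))  # → [[0, 1, 0], [0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

A CNN for MNIST stacks layers of such filters (with learned kernels) followed by nonlinearities and pooling; TensorBoard then visualizes the logged weights and activations per layer.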
Regression-linear-nonlinear
A single-layer neural network with two input neurons is trained to predict a linear equation y = ax + b, and a two-layer neural network with 4 input neurons, 9 hidden neurons, and 3 output neurons is trained for nonlinear prediction of 3 given nonlinear equations.
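The linear half of this idea can be sketched in pure Python: a single neuron y_hat = w*x + b trained by gradient descent on squared error to recover y = ax + b (the chosen a, b, learning rate, and data here are illustrative assumptions; the repo's 4-9-3 nonlinear network is not shown):

```python
# Ground-truth line to recover: y = 2x - 1.
a_true, b_true = 2.0, -1.0
data = [(x / 10.0, a_true * (x / 10.0) + b_true) for x in range(-20, 21)]

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    gw = gb = 0.0
    for x, y in data:
        err = (w * x + b) - y   # prediction error of the single neuron
        gw += err * x           # gradient of squared loss w.r.t. w
        gb += err               # gradient of squared loss w.r.t. b
    n = len(data)
    w -= lr * gw / n            # batch gradient-descent update
    b -= lr * gb / n
print(f"learned w={w:.2f}, b={b:.2f}")  # converges to roughly w=2, b=-1
```

With enough iterations the learned weight and bias approach the true slope and intercept; the nonlinear case replaces this single neuron with a hidden layer and a nonlinearity.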
speech-resynthesis-vocoder
An official reimplementation of the method described in the INTERSPEECH 2021 paper "Speech Resynthesis from Discrete Disentangled Self-Supervised Representations".
Tennis-Refactoring-Kata
Starting code for a Refactoring Code Kata on the Tennis rules
tts
P2E
StethoSpeech
Neil-Shah's Repositories
Neil-Shah/GANs-for-Speech-Enhancement
A Generative Adversarial Network implemented for time-frequency based speech enhancement.
Neil-Shah/fairseq-exps
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Neil-Shah/Tennis-Refactoring-Kata
Starting code for a Refactoring Code Kata on the Tennis rules
Neil-Shah/speech-resynthesis-vocoder
An official reimplementation of the method described in the INTERSPEECH 2021 paper "Speech Resynthesis from Discrete Disentangled Self-Supervised Representations".
Neil-Shah/Regression-linear-nonlinear
A single-layer neural network with two input neurons is trained to predict a linear equation y = ax + b, and a two-layer neural network with 4 input neurons, 9 hidden neurons, and 3 output neurons is trained for nonlinear prediction of 3 given nonlinear equations.
Neil-Shah/Classification-IRIS-
Classification results on the IRIS dataset are reported using 10-fold cross-validation. Among the 150 examples, some flowers are misclassified, as the accuracy results show. The code is implemented in MATLAB.
Neil-Shah/mnist-CNN
Trains a CNN on the MNIST dataset. Once the weights and activations are learned, summaries are written to the LOGDIR path (set in the code) and can be visualized in TensorBoard.