// For the latest (unstable) version, clone the dev branch.
- Implementing A Generic API For Creating Different Types Of Neural Networks
- Implementing Different Cost And Activation Functions
- Implementing Different Optimization Methods
- Implementing SVM
- Implementing K-means
- Activation Functions : relu, softmax, sigmoid, tanh
- Cost Functions : cross entropy, quadratic cost
- Optimizers : Gradient Descent (batch, mini-batch, and stochastic), RMSProp, Adam
- Network Types : Feed Forward
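As a quick illustration of the activation and cost functions listed above, here is a minimal NumPy sketch. The function names below are illustrative assumptions, not the library's actual API:

```python
import numpy as np

def relu(x):
    # max(0, x), applied element-wise
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # subtract the max for numerical stability before exponentiating
    e = np.exp(x - np.max(x))
    return e / e.sum()

def quadratic_cost(y_pred, y_true):
    # 1/2 * sum of squared errors
    return 0.5 * np.sum((y_pred - y_true) ** 2)

def cross_entropy_cost(y_pred, y_true):
    eps = 1e-12  # avoid log(0)
    return -np.sum(y_true * np.log(y_pred + eps))

def gradient_descent_step(w, grad, lr=0.01):
    # one vanilla gradient-descent update
    return w - lr * grad
```

The same building blocks compose into all three gradient-descent variants listed above: batch uses the full dataset per step, mini-batch a fixed-size subset, and stochastic a single sample.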
- Fixing All Bugs
- Convolutional and recurrent networks
- Weight Initializations
- Regularization Techniques
- Faster API Performance
- GPU Support
@abtExp
Any help or suggestions are appreciated.