
Neural-Network-Multilanguages

Implementations of gradient-descent feed-forward and recurrent neural networks in different languages, using only a vector / linear-algebra library in each.

Artificial neural networks are relatively easy if you really understand them!

Support

Ruby

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Python

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Javascript

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Go

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

C++

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Julia

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

PHP

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Instructions

  1. Go to any language folder.
  2. Run install.sh.
  3. Run the program.

Neural Network Architectures

  1. Feed-forward neural network to predict the Iris dataset.
  • 3 layers, including the input and output layers
  • the first 2 layers are squashed with the sigmoid function
  • the last layer is squashed with the softmax function
  • the loss function is cross-entropy
  2. Vanilla recurrent neural network to generate text.
  • 1 hidden layer
  • tanh as the activation function
  • combined softmax and cross-entropy derivative
  • sequence length = 15
  3. Vanilla recurrent neural network to predict the Tesla stock market.
  • 1 hidden layer
  • tanh as the activation function
  • mean squared error for the derivative
  • sequence length = 5
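As a sketch of architecture 1 using only NumPy as the vector library (the function names, hidden size, and batch here are illustrative, not taken from the repository): two sigmoid layers followed by a softmax output, scored with cross-entropy.

```python
import numpy as np

def sigmoid(x):
    # element-wise logistic squashing
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # subtract the row max for numerical stability
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # mean negative log-likelihood of the true class
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels] + 1e-12).mean()

rng = np.random.default_rng(0)
# Iris: 4 input features, 3 output classes; the hidden size 8 is arbitrary
W1 = rng.normal(scale=0.1, size=(4, 8))
W2 = rng.normal(scale=0.1, size=(8, 8))
W3 = rng.normal(scale=0.1, size=(8, 3))

def forward(X):
    h1 = sigmoid(X @ W1)   # first layer, sigmoid squashing
    h2 = sigmoid(h1 @ W2)  # second layer, sigmoid squashing
    return softmax(h2 @ W3)  # output layer, softmax over the 3 classes

X = rng.normal(size=(5, 4))  # a dummy batch of 5 samples
probs = forward(X)
loss = cross_entropy(probs, np.array([0, 1, 2, 1, 0]))
```

Training would add the backward pass: because softmax is paired with cross-entropy, the output-layer gradient collapses to `probs - one_hot(labels)`, which is why the READMEs mention that combination for the derivative.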

All implementations such as max(), mean(), softmax(), cross_entropy(), and sigmoid() are hand-coded; no other libraries are used.
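For instance, hand-coding those helpers in plain Python (this is a generic sketch, not the repository's exact code) needs nothing beyond the standard library:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    m = max(xs)  # shift by the max so exp() cannot overflow
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, label):
    # negative log-likelihood of the true class index
    return -math.log(probs[label] + 1e-12)
```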

Status

Will update overtime.

Warning

You will not see high accuracy in languages that do not natively use float64: during backpropagation the weight updates are very small, and float32 rounds them away.
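The effect is easy to demonstrate. A gradient update of 1e-8 applied to a weight of 1.0 is below float32's machine epsilon (about 1.2e-7), so the addition leaves the weight unchanged, while float64 registers it:

```python
import numpy as np

update = 1e-8  # a typical tiny backprop weight change

w64 = np.float64(1.0)
w32 = np.float32(1.0)

changed64 = (w64 + np.float64(update)) != w64
changed32 = (w32 + np.float32(update)) != w32
# changed64 is True: float64 keeps the update.
# changed32 is False: float32 rounds 1.0 + 1e-8 back to 1.0.
```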

Authors