Go Concurrent Neural Network
- Put MNISTTranslate, NeuroNet, and the Bash script and/or batch file into your GOROOT directory
- Download the MNIST Dataset
- Put the files into the ./MNISTTranslate directory
- Run the program
- Linux: ./BakkBash
- Windows: ./BakkBatch.cmd
- Adjust ./NeuroNet/Batchfile/Batch to your specifications (the keys are described below)
- Define what your network should look like (example in ./NeuroNet/Data/Networks/Numbers)
- The first time (and after any source changes) you need to build the binaries (see the consolidated sequence after this list)
- MNISTTranslate: go build ./MNISTTranslate/MNISTTranslate.go
- NeuroNet: go build ./NeuroNet/NeuroNet.go
- Run MNISTTranslate
- Put the translated files into your specified folders
- Standard Test folder: ./NeuroNet/Data/Test
- Standard Train folder: ./NeuroNet/Data/Train
- Run NeuroNet
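Put together, a first-time run on Linux could look like the following sequence. The commands are the ones listed above; the binary names produced by `go build` are an assumption:

```sh
go build ./MNISTTranslate/MNISTTranslate.go
go build ./NeuroNet/NeuroNet.go
./MNISTTranslate    # translate the downloaded MNIST files
# move the translated files into ./NeuroNet/Data/Train and ./NeuroNet/Data/Test,
# adjust ./NeuroNet/Batchfile/Batch, then start everything:
./BakkBash
```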
Each line in the batch file represents one action that the software will execute:
- ResultFile: → Path to the result output
- NetworkFile: → Path to the network creation file
- PersistenceFile: → Path to the persistence
- TrainFile: → Path to the training data
- TestFile: → Path to the test data
- PreProcessing: → None / MeanSubstraction / Proportional
- Parallel: → Number of goroutines
- WorkerBatch: → Batch size each worker processes before its results are merged (see the sketch after this list)
- LearningRate: → Rate of learning
- Lambda: → Lambda value for elastic net regularization
- MinWeight: → Minimum weight for connectome initialization
- MaxWeight: → Maximum weight for connectome initialization
- TargetColumnsStart: → First field that is a target value
- TargetColumnsEnd: → First field that isn’t a target value
- Train: → Train the network x times
- Test: → Run a test x times
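Put together, a batch file might look like the example below. The keys and the NetworkFile, TrainFile, and TestFile paths come from this README; all numeric values, the remaining paths, and the exact `Key: value` line format are illustrative assumptions:

```
ResultFile: ./NeuroNet/Data/Results/result.txt
NetworkFile: ./NeuroNet/Data/Networks/Numbers
PersistenceFile: ./NeuroNet/Data/Persistence/numbers.net
TrainFile: ./NeuroNet/Data/Train
TestFile: ./NeuroNet/Data/Test
PreProcessing: MeanSubstraction
Parallel: 4
WorkerBatch: 10
LearningRate: 0.01
Lambda: 0.5
MinWeight: -0.5
MaxWeight: 0.5
TargetColumnsStart: 0
TargetColumnsEnd: 10
Train: 60000
Test: 10000
```

Since Parallel and WorkerBatch drive the concurrency, here is a minimal Go sketch of the mechanism they describe: a fixed number of goroutines each accumulate results over a batch of samples before the partial results are merged. This illustrates the idea only and is not NeuroNet's actual implementation; all identifiers in it are hypothetical:

```go
package main

import (
	"fmt"
	"sync"
)

// gradient stands in for whatever per-connection updates a worker
// accumulates; a single float keeps the sketch short.
type gradient float64

func main() {
	parallel := 4    // Parallel: number of goroutines
	workerBatch := 8 // WorkerBatch: samples per worker before result merge

	samples := make(chan float64, 64)
	go func() { // feed some dummy training samples
		for i := 0; i < 64; i++ {
			samples <- float64(i)
		}
		close(samples)
	}()

	partials := make(chan gradient, parallel)
	var wg sync.WaitGroup
	for w := 0; w < parallel; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			var acc gradient
			n := 0
			for s := range samples {
				acc += gradient(s) // stand-in for backpropagation on one sample
				n++
				if n == workerBatch { // a full batch: hand it to the merger
					partials <- acc
					acc, n = 0, 0
				}
			}
			if n > 0 { // flush a final partial batch
				partials <- acc
			}
		}()
	}
	go func() { wg.Wait(); close(partials) }()

	// Merge the partial results; in a real trainer this is where the
	// shared connectome would be updated.
	var total gradient
	for p := range partials {
		total += p
	}
	fmt.Println("merged result:", total)
}
```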
Each line in the network creation file represents one layer in the network.
Schema: Activation,Neurons
Current activation functions:
- Identity
- Logistic
- TanH
- ReLU
- LeakyReLU
- ELU
- SoftMax
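For reference, the following Go sketch gives the usual definitions of these activations. It is illustrative only, not NeuroNet's code; the LeakyReLU slope of 0.01 and the ELU scale of 1.0 are common defaults assumed here:

```go
package main

import (
	"fmt"
	"math"
)

func identity(x float64) float64 { return x }
func logistic(x float64) float64 { return 1 / (1 + math.Exp(-x)) }
func tanH(x float64) float64     { return math.Tanh(x) }
func reLU(x float64) float64     { return math.Max(0, x) }

func leakyReLU(x float64) float64 {
	if x < 0 {
		return 0.01 * x // assumed slope; NeuroNet may use another constant
	}
	return x
}

func elu(x float64) float64 {
	if x < 0 {
		return math.Exp(x) - 1 // assumed scale of 1.0
	}
	return x
}

// softMax works on a whole layer rather than a single neuron.
func softMax(xs []float64) []float64 {
	max := xs[0]
	for _, x := range xs[1:] {
		if x > max {
			max = x
		}
	}
	out := make([]float64, len(xs))
	sum := 0.0
	for i, x := range xs {
		out[i] = math.Exp(x - max) // shift by the maximum for numerical stability
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

func main() {
	fmt.Println(identity(1), logistic(0), tanH(0), reLU(-1), leakyReLU(-1), elu(-1))
	fmt.Println(softMax([]float64{1, 2, 3}))
}
```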
In example "SoftMax, 10" (without quotation marks) creates one SoftMax layer with 10 neurons. Currently a bias neuron will be added to every but the last layer.