Simple MLP - NeuralNetwork Library For Microcontrollers
Nothing "import"-ant, just a simple library for implementing Neural Networks (NNs) easily and effectively on any Arduino board and other microcontrollers.
| NN Functions | Input Type (x) | Output Type (Y) | Action |
|---|---|---|---|
| *FeedForward(X) | 1D Float Array | 1D Float Array | "Feeds" the NN with the X input values and returns the Y output values, if needed. |
| *BackProp(x) | 1D Float Array | 1D Float Array | Tells the NN whether the Y outputs matched the expected values, then teaches it. |
Examples: Backpropagation_Single_Xor_Gate.ino
Understanding the Basics of a Neural Network:
[An image that explains some basic things]
Features (implemented / planned):
- [x] Optimized algorithm for less SRAM usage.
- [x] Only Sigmoid activation function.
- [x] Use of PROGMEM.
- [x] Simplicity.
- [ ] Maybe usage of external EEPROM.
- [ ] Precise properties, for many different needs.
- The ATtiny85 has no FPU, which makes floating-point math on it "difficult" for the SRAM.
- If you want to use "Serial" on an ATtiny85, click here (be careful: SoftwareSerial uses a lot of SRAM).
- If you get an error with 'POINTER_REGS', click here.
- Backprop math on an ATtiny85 doesn't work properly for some reason, though FeedForward math works! [...]
- I am not yet a professional programmer [...]
- Make sure you use 4-byte (32-bit) precision variables when training, because floats "...are stored as 32 bits (4 bytes) of information... get more precision by using a double (e.g. up to 15 digits); on the Arduino, double is the same size as float."
- Arduino Uno
- ATtiny85
| * | Example Files (.ino) | Explanation |
|---|---|---|
| 1 | Backpropagation_double_Xor | NeuralNetwork training of a 3-input XOR circuit and printing of its output(s) |
| 2 | Backpropagation_Single_Xor_Gate | NeuralNetwork training of an XOR gate and printing of its output(s) |
| 3 | FeedForward_double_Xor | Printing of the outputs of the pre-trained NN |
| 4 | FeedForward_double_Xor_PROGMEM | Printing of the outputs of the pre-trained NN, using weights and biases from ROM |
NN = NeuralNetwork , LR = Learning Rate
| NN Functions | Input Type (x),(z) | Output Type (Y) | Action |
|---|---|---|---|
| *FeedForward(*inputs) | 1D Float Array | 1D Float Array | "Feeds" the NN with the X input values and returns the Y output values, if needed. |
| *FeedForward(*inputs, IS_PROGMEM) | 1D Float Array, Boolean | 1D Float Array | Same as above, with weights and biases read from ROM (PROGMEM). |
| BackProp(*expected) | 1D Float Array | 1D Float Array | Tells the NN whether the Y outputs matched the expected values, then teaches it. |
| print(IS_PROGMEM) | Boolean | String | Serial-prints the weights and biases of the NN; print(true) prints from ROM. |
| NN.Layer[i].Sigmoid(&x) | Constant Float | Float | Returns the Sigmoid activation function's value 1/(1+e^(-x)). |
| NN -Constructors -Variables -Layer's Variables | Type | Explanation |
|---|---|---|
| NeuralNetwork(*_layer, &NumberOflayers) | const unsigned int, const unsigned int | Constructor |
| NeuralNetwork(*_layer, &NumberOflayers, &LRw, &LRb) | const unsigned int, const unsigned int, const float, const float | Constructor |
| NeuralNetwork(*_layer, *default_Weights, *default_Bias, &NumberOflayers) | const unsigned int, float, float, const unsigned int | Constructor |
| NeuralNetwork(*_layer, *default_Weights, *default_Bias, &NumberOflayers, NO_OUTPUTS) | const unsigned int, float, float, const unsigned int, bool | Constructor; NO_OUTPUTS clears outputs from RAM |
| NN.LearningRateOfWeights | float | - |
| NN.LearningRateOfBiases | float | - |
| NN.layers[i] | Layer* | - |
| NN.Layer[i].bias | float* | - |
| NN.Layer[i].outputs[j] | float* | - |
| NN.Layer[i].weights[j][l] | float** | - |
| NN.Layer[i].preLgamma[j] | float* | - |
| NN.layers[i]._numberOfInputs | unsigned int | Read-only |
| NN.layers[i]._numberOfOutputs | unsigned int | Read-only |
Interesting:

| Symbol | Meaning |
|---|---|
| NN. | Neural Network(s) |
| A. | Arduino etc. |
| - | Mostly .NET & Other |
| * | Maybe Interesting? |
Forgive me for my mistakes and maybe poor knowledge of C/C++; it is also my first time making a "normal" library [...]
I am also sorry for my randomness in some parts of the "Searches Across Internet" section.
An image that the "self" sees recalls a box you cannot see, an image you lose; a shadow that, even with light to see by, hears the words you say: [...] #i📁👁
I wish love and happiness to Everyone! <3