Baseline Network Model
johnmartinsson opened this issue · 0 comments
Implement a baseline convolutional neural network model.
"Figure 5 shows a visual representation of our neural network architecture. The network contains 5 convolutional layers, each followed by a max-pooling layer. We insert one dense layer before the final soft-max layer. The dense layer contains 1024 units and the soft-max layer 1000 units, generating a probability for each class. We use batch normalization before every convolutional layer and before the dense layer. The convolutional layers use a rectified linear (ReLU) activation function. Drop-out is used on the input layer (probability 0.2), on the dense layer (probability 0.4) and on the soft-max layer (probability 0.4). As a cost function we use the single-label categorical cross-entropy function (in the log domain)."
Architecture
- Dropout 20%
- BatchNormalization
- Convolution with 64 5x5 Kernels Stride Size 2x1
- ReLU Activation
- MaxPooling with 2x2 Kernels Stride Size 2x2
Four times:
- BatchNormalization
- Convolution Num. Filters = 64, 128, 256, 256
- Convolution Kernel Sizes = 5x5, 5x5, 5x5, 3x3
- Convolution Stride Size = 1x1
- ReLU Activation
- MaxPooling with 2x2 Kernels and Stride Size 2x2
Fully Connected
- Dropout(40%)
- Dense Layer with 1024 units
- Dropout(40%)
- SoftMax Layer with 19 units
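To sanity-check the layer dimensions above, here is a small pure-Python sketch that traces the feature-map shape through each convolution and pooling stage. The input spectrogram size (256x512) and the use of "same"-style padding (kernel size // 2) are assumptions, since neither is specified above:

```python
def conv2d_shape(h, w, k, stride):
    """Output height/width of a conv layer.

    Assumes "same"-style zero padding of k // 2 on each side
    (the padding scheme is not stated in the issue).
    """
    p = k // 2
    sh, sw = stride
    return (h + 2 * p - k) // sh + 1, (w + 2 * p - k) // sw + 1


def pool2d_shape(h, w):
    """Output height/width of max pooling with 2x2 kernels, stride 2x2."""
    return h // 2, w // 2


def trace(h, w):
    """Trace (name, channels, height, width) through the architecture."""
    shapes = [("input", 1, h, w)]

    # First block: 64 filters, 5x5 kernels, stride 2x1.
    h, w = conv2d_shape(h, w, 5, (2, 1))
    shapes.append(("conv1", 64, h, w))
    h, w = pool2d_shape(h, w)
    shapes.append(("pool1", 64, h, w))

    # Four repeated blocks: filters 64/128/256/256, kernels 5/5/5/3, stride 1x1.
    for i, (filters, k) in enumerate(
        [(64, 5), (128, 5), (256, 5), (256, 3)], start=2
    ):
        h, w = conv2d_shape(h, w, k, (1, 1))
        shapes.append((f"conv{i}", filters, h, w))
        h, w = pool2d_shape(h, w)
        shapes.append((f"pool{i}", filters, h, w))

    return shapes


if __name__ == "__main__":
    # Assumed input: a 256x512 single-channel spectrogram.
    for name, c, h, w in trace(256, 512):
        print(f"{name:>6}: {c:>3} x {h:>3} x {w:>3}")
    # Flattened size feeding the 1024-unit dense layer:
    _, c, h, w = trace(256, 512)[-1]
    print("flatten:", c * h * w)
```

Under these assumptions the final feature map is 256 x 4 x 16, so the dense layer would see a 16384-dimensional input before projecting to 1024 units and then to the 19-unit soft-max output.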