Toy Neural Networks from scratch
- The interface is simple by design, even simpler than Keras
- Implemented in pure Python and NumPy
- All operators are vectorized
- Easy to add new layers/operators (a sketch of the assumed interface follows this list)
- Layers: FullyConnected, BatchNorm, Dropout
- Activations: ReLU, Sigmoid, Tanh, LeakyReLU, ELU, Softmax
- Losses: L1 Loss, Cross Entropy, L2 Loss
- Optimizers: SGD, Momentum, Adam
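Adding an operator mostly means writing a vectorized forward and backward pass. Below is a minimal sketch of what a new activation (Swish, which is not in the list above) might look like; the class layout and method names are assumptions about the layer interface, not the project's actual API:

```python
import numpy as np

# Hypothetical new activation (Swish = x * sigmoid(x)).
# The base layout / method names below are assumptions about the
# project's layer interface, not its actual API.
class Swish:
    def __init__(self, name):
        self.name = name

    def forward(self, x):
        # Vectorized over the whole batch: no Python loops.
        self.s = 1.0 / (1.0 + np.exp(-x))
        self.x = x
        return x * self.s

    def backward(self, grad_output):
        # d/dx [x * sigmoid(x)] = s + x * s * (1 - s)
        return grad_output * (self.s + self.x * self.s * (1.0 - self.s))
```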
Example usage:

```python
import network
import parameter
# FullyConnected, BatchNorm, ReLU, CrossEntropyLoss and Momentum are assumed
# to be importable from the project's layer/loss/optimizer modules.

# build the model
model = network.Network()

# optional: define initializers for weights and biases
initializer = parameter.GaussianInitializer(std=0.1)
bias_initializer = parameter.ConstantInitializer(0.1)

# add layers
model.add(
    FullyConnected(
        name='fc1',
        in_feature=784,
        out_feature=512,
        weight_initializer=initializer,
        bias_initializer=bias_initializer))
model.add(BatchNorm(name='bn1', num_features=512))
model.add(ReLU(name='relu1'))

# define the loss function
model.add_loss(CrossEntropyLoss())

# define the optimizer
lr = 0.01  # learning rate (example value)
optimizer = Momentum(lr=lr, momentum=0.9)

# provide data
batch_images = ...
batch_labels = ...

# train: one forward/backward pass plus a parameter update
output, loss = model.forward(batch_images, batch_labels)
model.optimize(optimizer)

# switch to inference mode to freeze BatchNorm statistics and disable Dropout
model.test_mode()
```
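Training over a whole dataset just repeats the forward/optimize pair. Here is a sketch of a full loop, reusing `model` and `optimizer` from above; `iterate_batches` is a helper written here for illustration, and `train_images`/`train_labels` are assumed to be preloaded NumPy arrays:

```python
import numpy as np

def iterate_batches(images, labels, batch_size=64):
    # reshuffle each epoch, then yield mini-batch views
    order = np.random.permutation(len(images))
    for start in range(0, len(images), batch_size):
        idx = order[start:start + batch_size]
        yield images[idx], labels[idx]

for epoch in range(10):
    losses = []
    for batch_images, batch_labels in iterate_batches(train_images, train_labels):
        # forward() returns predictions and the batch loss;
        # optimize() then applies one optimizer update
        output, loss = model.forward(batch_images, batch_labels)
        model.optimize(optimizer)
        losses.append(loss)
    print('epoch %d, mean loss %.4f' % (epoch, np.mean(losses)))

# switch to inference mode before evaluating
model.test_mode()
```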
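For reference, the Momentum optimizer used above follows the classic update rule: accumulate a velocity, then step the weights by it. A standalone sketch, which may differ in detail from the project's implementation:

```python
import numpy as np

# Classic momentum update (illustrative; the project's implementation
# may differ in details such as dampening or weight decay):
#   v <- momentum * v - lr * grad
#   w <- w + v
def momentum_step(w, grad, v, lr=0.01, momentum=0.9):
    v = momentum * v - lr * grad
    return w + v, v

# usage on a toy parameter
w = np.zeros(3)
v = np.zeros(3)
grad = np.array([1.0, -2.0, 0.5])
w, v = momentum_step(w, grad, v)
```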
This project is licensed under the MIT License.