# lightnn

lightnn is a light deep learning framework (`light` meaning not much code) for study and for fun. Join us!
## Installation

Install from PyPI:

```shell
pip install lightnn
```

or install from a source checkout:

```shell
python setup.py install
```
## Package structure

```
lightnn
├── __init__.py
├── base
│   ├── __init__.py
│   ├── activations.py
│   ├── initializers.py
│   ├── losses.py
│   └── optimizers.py
├── examples
│   ├── NeuralNetwork.py
│   ├── __init__.py
│   ├── data
│   │   └── tiny_shakespeare.txt
│   ├── lm.py
│   └── mnist.py
├── layers
│   ├── __init__.py
│   ├── convolutional.py
│   ├── core.py
│   ├── layer.py
│   ├── pooling.py
│   └── recurrent.py
├── models
│   ├── __init__.py
│   └── models.py
├── ops.py
└── test
    ├── __init__.py
    ├── cnn_gradient_check.py
    ├── nn_gradient_check.py
    ├── rnn_gradient_check.py
    └── test_activators.py
```
## Models

- Sequential
- Model
## Activations

- identity (dense)
- sigmoid
- relu
- softmax
- tanh
- leaky relu
- elu
- selu
- thresholded relu
- softplus
- softsign
- hard sigmoid
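lightnn's implementations live in `base/activations.py`. As a reference for what the most common entries in the list above compute (a NumPy sketch of the standard formulas, not lightnn's exact code):

```python
import numpy as np

def sigmoid(x):
    # logistic function: squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # rectified linear unit: max(0, x) elementwise
    return np.maximum(0.0, x)

def softmax(x):
    # subtract the max first for numerical stability;
    # output is a probability distribution summing to 1
    e = np.exp(x - np.max(x))
    return e / e.sum()
```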
## Losses

- MeanSquareLoss
- BinaryCategoryLoss
- LogLikelihoodLoss
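The loss classes are defined in `base/losses.py`. The formulas behind the first two are standard; a minimal NumPy sketch (function names here are illustrative, not lightnn's class API):

```python
import numpy as np

def mean_square_loss(y_pred, y_true):
    # mean squared error, with the conventional 1/2 factor
    # so the gradient is simply (y_pred - y_true)
    return 0.5 * np.mean((y_pred - y_true) ** 2)

def binary_cross_entropy(y_pred, y_true, eps=1e-12):
    # clip predictions away from 0 and 1 to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred)
                    + (1.0 - y_true) * np.log(1.0 - y_pred))
```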
## Initializers

- xavier uniform initializer (glorot uniform initializer)
- default weight initializer
- large weight initializer
- orthogonal initializer
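The Xavier/Glorot uniform initializer draws weights from a uniform distribution whose limit depends on the layer's fan-in and fan-out, which keeps activation variance roughly constant across layers. A sketch of the standard rule (lightnn's own version is in `base/initializers.py`):

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng=None):
    # sample W ~ U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))
    rng = rng or np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))
```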
## Optimizers

- SGD
- Momentum
- RMSProp
- Adam
- Adagrad
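lightnn's optimizers are implemented in `base/optimizers.py`. As a sketch of the textbook update rules for the two simplest ones (not lightnn's exact interface):

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # plain gradient descent: w <- w - lr * grad
    return w - lr * grad

def momentum_step(w, grad, velocity, lr=0.01, mu=0.9):
    # classical momentum: accumulate a velocity, then step along it
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity
```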
## Layers

- FullyConnected (Dense)
- Conv2d
- MaxPooling
- AvgPooling
- Softmax
- Dropout
- Flatten
- RNN
- LSTM
- GRU
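The layer classes live under `layers/`. To illustrate the math behind two of them — the forward pass of a fully connected layer and a single vanilla RNN step — here is a NumPy sketch (shapes and names are illustrative, not lightnn's API):

```python
import numpy as np

def dense_forward(x, W, b, activation=np.tanh):
    # fully connected layer: y = activation(x W + b)
    # x: (batch, in_dim), W: (in_dim, out_dim), b: (out_dim,)
    return activation(x @ W + b)

def rnn_step(x_t, h_prev, Wxh, Whh, bh):
    # vanilla RNN cell: h_t = tanh(x_t Wxh + h_prev Whh + bh)
    # x_t: (batch, in_dim), h_prev: (batch, hidden_dim)
    return np.tanh(x_t @ Wxh + h_prev @ Whh + bh)
```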
## Examples

- MLP MNIST Classification
- CNN MNIST Classification
- RNN Language Model
- LSTM Language Model
- GRU Language Model
## References

- Keras: a popular deep learning framework built on TensorFlow and Theano.
- NumpyDL: a simple deep learning framework with manual gradients, written entirely in Python and NumPy. ([Warning] The backward pass of this project contains some errors.)
- paradox: a simple deep learning framework with a symbolic calculation system, lightweight for learning and for fun; written entirely in Python and NumPy.
- Bingtao Han's blogs: an easy way to get started with deep learning. ([Warning] The RNN part contains some calculation errors.)