Python scripts to generate prototxt files for Caffe, especially inception_v3, inception_v4, inception_resnet, and fractalnet.
The generated prototxts can be visualized with ethereon's Netscope.
Every model has a BN (batch normalization) version (some have only the BN version); BN is described in Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift.
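For reference, here is a minimal sketch of how such a prototxt can be generated with Caffe's Python NetSpec API, including the BatchNorm + Scale + ReLU pattern that the BN variants rely on. The helper name, layer sizes, and output filename are illustrative assumptions, not the exact code in these scripts.

```python
# A minimal sketch (not the exact code in this repo) of generating a prototxt
# with Caffe's NetSpec API. Caffe expresses batch normalization as a BatchNorm
# layer followed by a Scale layer with a learned bias.
import caffe
from caffe import layers as L

def conv_bn_relu(bottom, num_output, kernel_size, stride=1, pad=0):
    """Convolution -> BatchNorm -> Scale -> ReLU (illustrative helper)."""
    conv = L.Convolution(bottom, num_output=num_output, kernel_size=kernel_size,
                         stride=stride, pad=pad, weight_filler=dict(type='xavier'))
    bn = L.BatchNorm(conv, in_place=True)                # normalization statistics
    scale = L.Scale(bn, bias_term=True, in_place=True)   # learned scale and shift
    relu = L.ReLU(scale, in_place=True)
    return conv, bn, scale, relu

n = caffe.NetSpec()
n.data = L.Input(shape=dict(dim=[1, 3, 224, 224]))       # dummy input for illustration
n.conv1, n.conv1_bn, n.conv1_scale, n.conv1_relu = conv_bn_relu(
    n.data, 64, 7, stride=2, pad=3)

with open('example.prototxt', 'w') as f:                 # placeholder filename
    f.write(str(n.to_proto()))
```

Caffe splits batch normalization across two layers: BatchNorm computes the normalization statistics, and Scale with bias_term: true provides the learned scale and shift.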
-
LeNet-5 (lenet.py)
LeNet-5 was presented by Yann LeCun in Backpropagation Applied to Handwritten Zip Code Recognition.
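As a rough idea of what a generator script like lenet.py does, the sketch below builds a LeNet-style train net with NetSpec, following Caffe's own MNIST tutorial; the LMDB path, batch size, and output filename are placeholders and may not match this repo's script.

```python
# Minimal LeNet-style train-net generator in the spirit of Caffe's MNIST
# tutorial; the LMDB path, batch size, and filename are placeholders.
import caffe
from caffe import layers as L, params as P

def lenet(lmdb_path, batch_size):
    n = caffe.NetSpec()
    n.data, n.label = L.Data(batch_size=batch_size, backend=P.Data.LMDB,
                             source=lmdb_path, ntop=2,
                             transform_param=dict(scale=1. / 255))
    n.conv1 = L.Convolution(n.data, kernel_size=5, num_output=20,
                            weight_filler=dict(type='xavier'))
    n.pool1 = L.Pooling(n.conv1, kernel_size=2, stride=2, pool=P.Pooling.MAX)
    n.conv2 = L.Convolution(n.pool1, kernel_size=5, num_output=50,
                            weight_filler=dict(type='xavier'))
    n.pool2 = L.Pooling(n.conv2, kernel_size=2, stride=2, pool=P.Pooling.MAX)
    n.ip1 = L.InnerProduct(n.pool2, num_output=500, weight_filler=dict(type='xavier'))
    n.relu1 = L.ReLU(n.ip1, in_place=True)
    n.ip2 = L.InnerProduct(n.relu1, num_output=10, weight_filler=dict(type='xavier'))
    n.loss = L.SoftmaxWithLoss(n.ip2, n.label)
    return n.to_proto()

with open('lenet_train.prototxt', 'w') as f:
    f.write(str(lenet('mnist_train_lmdb', 64)))
```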
-
AlexNet (and CaffeNet, both in alexnet.py)
AlexNet was initially described in [ImageNet Classification with Deep Convolutional Neural Networks](http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf)
The CaffeNet implementation follows caffe/caffenet.py
-
Network in Network (nin.py)
The NIN model was described in Network In Network
-
Inception_v1 (inception_v1.py)
The Inception architecture was introduced in Going Deeper with Convolutions
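The building block that inception_v1.py has to emit is the Inception module: several parallel branches joined by a Concat layer. Below is an illustrative NetSpec sketch using the channel counts of the paper's inception(3a) module; the helper names are assumptions, not this repo's code.

```python
# Illustrative GoogLeNet-style Inception module (channel counts from the
# paper's inception(3a)); not the exact code emitted by inception_v1.py.
from caffe import layers as L, params as P

def conv_relu(bottom, num_output, kernel_size, pad=0):
    conv = L.Convolution(bottom, num_output=num_output, kernel_size=kernel_size,
                         pad=pad, weight_filler=dict(type='xavier'))
    return L.ReLU(conv, in_place=True)

def inception_module(bottom):
    b1 = conv_relu(bottom, 64, 1)                               # 1x1 branch
    b2 = conv_relu(conv_relu(bottom, 96, 1), 128, 3, pad=1)     # 1x1 reduce -> 3x3
    b3 = conv_relu(conv_relu(bottom, 16, 1), 32, 5, pad=2)      # 1x1 reduce -> 5x5
    pool = L.Pooling(bottom, kernel_size=3, stride=1, pad=1, pool=P.Pooling.MAX)
    b4 = conv_relu(pool, 32, 1)                                 # 3x3 pool -> 1x1 proj
    return L.Concat(b1, b2, b3, b4)                             # join along channels

# usage: n.inception_3a = inception_module(n.pool2)
```

Branch layers that are not assigned to the NetSpec receive auto-generated names in the emitted prototxt; a real generator would typically name every branch explicitly so the Netscope graph stays readable.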
-
VggNet (vggnet.py)
VGG was presented in Very Deep Convolutional Networks for Large-Scale Image Recognition
The implementations of vgg_11a, vgg_11a_bn, vgg_16c, and vgg_16c_bn are in vggnet.py
-
Inception_v3 (inception_v3.py)
Inception_v3 is an improved version of Inception_v1; the details are described in Rethinking the Inception Architecture for Computer Vision
-
Inception_v4 (inception_resnet.py)
Inception_v4 is a more uniform, simplified architecture with more Inception modules than Inception_v3; the details are described in Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning
-
Inception_resnet (inception_resnet.py)
Inception_resnet_v2 combines residual connections with the latest revised version of the Inception architecture. With a single model and a single crop, Inception_resnet_v2 achieves a top-5 error of 4.9% on the non-blacklisted subset of the ILSVRC 2012 validation set. The details are described in Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning
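The residual connection itself maps naturally onto Caffe layers: the block's branches are concatenated, projected back to the input's channel count by a linear 1x1 convolution, and added to the shortcut with an Eltwise SUM. The sketch below shows that pattern in a simplified two-branch form with illustrative channel counts, not the exact block from inception_resnet.py.

```python
# Simplified Inception-ResNet-style residual block: branches are concatenated,
# projected back to the input's channel count by a linear 1x1 convolution, and
# summed with the shortcut. Channel counts are illustrative only.
from caffe import layers as L, params as P

def conv_relu(bottom, num_output, kernel_size, pad=0):
    conv = L.Convolution(bottom, num_output=num_output, kernel_size=kernel_size,
                         pad=pad, weight_filler=dict(type='xavier'))
    return L.ReLU(conv, in_place=True)

def inception_resnet_block(bottom, in_channels):
    b1 = conv_relu(bottom, 32, 1)                               # 1x1 branch
    b2 = conv_relu(conv_relu(bottom, 32, 1), 32, 3, pad=1)      # 1x1 -> 3x3 branch
    mixed = L.Concat(b1, b2)
    # linear 1x1 convolution (no ReLU) to match the shortcut's channel count
    up = L.Convolution(mixed, num_output=in_channels, kernel_size=1,
                       weight_filler=dict(type='xavier'))
    add = L.Eltwise(bottom, up, operation=P.Eltwise.SUM)        # shortcut + branch
    return L.ReLU(add, in_place=True)
```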
-
ResNet
Coming soon ...
I would like to thank Yangqing Jia and the BVLC group for developing Caffe,
and all the authors of the CNN models listed above.