Simple tensorflow implementation of Selective Kernel Networks
If you want to see the original author's code, please refer to this github link.
Version 1.0 : SKNet block without groups and BN params.
Version 1.1 : Set the BN params and remove the fc-layer BN ops.
- Version 1.2 is coming soon.
- Python >= 3.6
- Tflearn >= 0.3.2
- Tensorflow >= 1.9.0
```python
import tensorflow as tf
import tensorflow.contrib.slim as slim
from SKNet import SKNet

...
conv1 = slim.conv2d(inputs, 64, [3, 3], scope='conv1')
conv2 = SKNet(conv1, 3, 2, is_training=True)
conv3 = slim.conv2d(conv2, 3, [3, 3], scope='out')
...
```
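For intuition, the SK block's fuse/select mechanism can be sketched outside TensorFlow. This is a minimal NumPy illustration, not the repository's implementation: the fc weights (`W_z`, `W_m`) are random stand-ins for learned parameters, and the two branch inputs stand in for the 3x3 and 5x5 conv outputs.

```python
import numpy as np

def softmax(x, axis=0):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sk_select(branches, z_dim=8, seed=0):
    """Selective-kernel fuse/select over pre-computed branch outputs.

    branches: list of M arrays, each (H, W, C).
    Returns the attention-weighted sum over branches.
    """
    rng = np.random.default_rng(seed)
    M = len(branches)
    C = branches[0].shape[-1]
    # Fuse: element-wise sum of branches, then global average pooling -> (C,)
    U = np.sum(branches, axis=0)
    s = U.mean(axis=(0, 1))
    # Squeeze to a compact descriptor z (fc + ReLU); random stand-in weights
    W_z = rng.standard_normal((C, z_dim)) * 0.1
    z = np.maximum(s @ W_z, 0.0)
    # Select: one fc per branch, softmax across branches per channel
    W_m = rng.standard_normal((M, z_dim, C)) * 0.1
    logits = np.stack([z @ W_m[m] for m in range(M)])  # (M, C)
    attn = softmax(logits, axis=0)                     # sums to 1 over branches
    # Broadcast the per-channel weights (C,) over each (H, W, C) branch
    return sum(a * b for a, b in zip(attn, branches))

x3 = np.ones((4, 4, 16))      # stand-in for the 3x3-branch output
x5 = 2 * np.ones((4, 4, 16))  # stand-in for the 5x5-branch output
out = sk_select([x3, x5])
print(out.shape)  # (4, 4, 16)
```

Because the attention weights sum to 1 across branches, the output is a per-channel convex combination of the branch feature maps.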
Training code
At training time, the BN layers' moving_mean and moving_var must be updated, so run the update ops together with the train step:
```python
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    # Ensures that we execute the update_ops before performing the train_step
    train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)
```
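What those update ops do is maintain exponential moving averages of the batch statistics. A minimal NumPy sketch of that update, assuming the common default momentum of 0.99 (the actual momentum depends on how the BN layer is configured):

```python
import numpy as np

def update_moving_stats(moving_mean, moving_var, batch, momentum=0.99):
    """Exponential-moving-average update that BN's update_ops perform.

    momentum=0.99 is an assumed default; the repo's BN layer may differ.
    """
    batch_mean = batch.mean(axis=0)
    batch_var = batch.var(axis=0)
    new_mean = momentum * moving_mean + (1.0 - momentum) * batch_mean
    new_var = momentum * moving_var + (1.0 - momentum) * batch_var
    return new_mean, new_var

mm, mv = np.zeros(4), np.ones(4)  # initial moving stats, 4 channels
batch = np.random.default_rng(0).standard_normal((32, 4)) + 5.0
mm, mv = update_moving_stats(mm, mv, batch)
print(mm)  # drifts toward the batch mean at rate 1 - momentum
```

At inference time these accumulated moving statistics replace the per-batch statistics, which is why skipping the update ops during training silently breaks evaluation.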
- Learn More :
- Github Issue : Easy to use batch norm layer
- Blog Tips : tf-batch_normalization
Any improvements or bug fixes are welcome.
Please open a pull request or an issue when you are done.