How to change the batch size dynamically?
chwangaa opened this issue · 7 comments
In pycaffe in particular: is there any way to change the batch_size dynamically, without modifying the prototxt file?
I noticed the reshape method, but apparently it does not really change the shape: after I run forward(), the shape is altered back to its original value.
Many thanks.
Basically I am trying to see the relationship between the batch size and the running time.
You need to reshape the input blob and then call reshape on the net, before calling forward.
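One caveat worth adding (an assumption about how this net is set up): if the input blob is produced by a Data layer reading from LMDB/LevelDB, that layer reshapes its top blob back to the prototxt's batch_size on every forward pass, which undoes a Python-side reshape. A deploy-style net whose input comes from a plain Input layer does not have this problem, because Python then owns the input blob. A minimal sketch (example dimensions):

```
layer {
  name: "data"
  type: "Input"
  top: "data"
  # the shape here is only the initial one; pycaffe can reshape it later
  input_param { shape: { dim: 1 dim: 3 dim: 227 dim: 227 } }
}
```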
I did the following:

```python
net.blobs['data'].reshape(BLOCK_SIZE, DIMENSION, HEIGHT, WIDTH)
net.blobs['label'].reshape(BLOCK_SIZE, )
net.reshape()
```

The shape of every layer changes correctly. However, as soon as I run forward, the shape will automatically change back :-(
This example may be helpful to you. See the code in forward_pass():

```python
for chunk in [caffe_images[x:x+batch_size] for x in xrange(0, len(caffe_images), batch_size)]:
    new_shape = (len(chunk),) + tuple(dims)
    if net.blobs['data'].data.shape != new_shape:
        net.blobs['data'].reshape(*new_shape)
    for index, image in enumerate(chunk):
        image_data = transformer.preprocess('data', image)
        net.blobs['data'].data[index] = image_data
    output = net.forward()[net.outputs[-1]]
```
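Stripped of the Caffe-specific calls, the batching pattern in that loop is plain list slicing (shown here with Python 3's range in place of xrange):

```python
def chunks(items, batch_size):
    # yield successive slices of at most batch_size items,
    # mirroring the list comprehension in forward_pass()
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

print(list(chunks(list(range(7)), 3)))  # → [[0, 1, 2], [3, 4, 5], [6]]
```

Note that the last chunk may be smaller than batch_size, which is exactly why the example reshapes the input blob to len(chunk) before each forward.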
Somehow I cannot manage to get it to work. I am not trying to classify here, so I used the method suggested above to change the batch size to 1000. However, the running time of forward() does not change at all (whereas when I modify the value in the prototxt file, the running time does get longer). Therefore I think the above method does not work.
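A quick way to check whether a reshape actually took effect is to time the forward pass at two batch sizes and verify that the larger one is slower. The harness below uses a NumPy matrix multiply as a stand-in for the net's forward pass (an assumption, since timing real Caffe code requires a built net); with pycaffe you would replace fake_forward with net.forward() after reshaping:

```python
import time
import numpy as np

def fake_forward(batch):
    # stand-in for a net's forward pass: one dense layer
    w = np.ones((512, 512), dtype=np.float32)
    return batch @ w

def time_forward(batch_size, reps=5):
    batch = np.ones((batch_size, 512), dtype=np.float32)
    fake_forward(batch)  # warm-up so allocation cost is excluded
    start = time.perf_counter()
    for _ in range(reps):
        fake_forward(batch)
    return time.perf_counter() - start

t_small = time_forward(8)
t_large = time_forward(2048)
# if the batch size really changed, the much larger batch must take longer
print(t_large > t_small)
```

If the measured time is identical at both batch sizes, as reported above, the reshape most likely never reached the net (for example because a Data layer restored its own batch_size on forward).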
Please discuss usage on caffe-users; I've answered in the thread Regarding Change BatchSize in python.
@chwangaa Did you get the answer?
> Basically I am trying to see the relationship between the batch size and the running time