BUG: Overlap breaks predictions on trained networks
mvonpohle opened this issue · 3 comments
When running single-band or multi-band predictions with the --overlap option, the output tiff is a grid of nodata or 1 values.
Example command:
classify --prob --config cloud_shadow.yaml band_reduced_cloud_shadow_model_bandorder2.SavedModel --outprefix bandorder2_scaleoroffset2_overlap128_withprob_ --overlap 128
Tested with networks that output 3 bands and 1 band
predict.py - Predictor._predict_array:
- When the tile size is set to 256 and the overlap is 10, the data being passed in has shape (246, 246, 8)
- The returned value retval is all 0s regardless of the data passed to _predict_array
- This is because the chunks fed to _model.predict_on_batch are blank
- This is because the chunks extracted by tf.image.extract_patches are blank: extract_patches is being asked for patches larger than the image itself.
image.shape
=(1,246,246,8)
tf.image.extract_patches(image, [1, net_input_shape[0], net_input_shape[1], 1], [1, net_output_shape[0], net_output_shape[1], 1], [1, 1, 1, 1], padding='VALID')
--> tf.image.extract_patches(image, [1, 256, 256, 1], [1, 256, 256, 1], [1, 1, 1, 1], padding='VALID')
chunks.shape
=(0,256,256,8)
These are empty chunks. Because padding='VALID', no patches are extracted; per the TensorFlow documentation: "If VALID, only patches which are fully contained in the input image are included." (See the reproduction snippet below.)
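For reference, a minimal standalone reproduction of the empty-patch behaviour (this is not code from predict.py; the random array just stands in for the (246, 246, 8) tile):

```python
import numpy as np
import tensorflow as tf

# A 246x246x8 tile, as produced when the 256-pixel tile size is shrunk by the overlap.
image = tf.constant(np.random.rand(1, 246, 246, 8), dtype=tf.float32)

patches = tf.image.extract_patches(image,
                                   sizes=[1, 256, 256, 1],
                                   strides=[1, 256, 256, 1],
                                   rates=[1, 1, 1, 1],
                                   padding='VALID')
print(patches.shape)  # (1, 0, 0, 524288): no 256x256 patch fits inside a 246x246 image

chunks = tf.reshape(patches, (-1, 256, 256, 8))
print(chunks.shape)   # (0, 256, 256, 8), matching the empty chunks above
```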
Is the issue that the data being passed is (246, 246, 8), or that extract_patches is trying to extract too large a patch (256, 256)? @bcoltin can you help clarify?
Hm, regardless of the tile size, if the network input size is fixed to 256x256, predict_array should get a 256x256 patch.
I think the fix may be, in predict.py line 248,
input_bounds.make_tile_rois_yx((block_size_y - offset_r, block_size_x - offset_c))
to
input_bounds.make_tile_rois_yx((block_size_y, block_size_x))
If you could also add an assert to predict_array, for when net_input_shape is not None, to check that net_input_shape matches the shape of data, I think that would be a good idea.
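Something along these lines, perhaps (a rough sketch only; it assumes _predict_array has local variables named data and net_input_shape as in the log above, and is not a verified patch against predict.py):

```python
# Inside Predictor._predict_array, before extracting patches (sketch):
if net_input_shape is not None:
    assert data.shape[0] == net_input_shape[0] and data.shape[1] == net_input_shape[1], \
        'Data shape %s does not match network input shape %s' % \
        (str(data.shape), str(net_input_shape))
```

With the overlap bug above, this would fail loudly on the (246, 246, 8) tile instead of silently producing empty predictions.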