sercant/mobile-segmentation

Errors with visualization

thepuremind opened this issue · 3 comments

First of all, thank you for the good work.

When I run visualize.py, the following error occurs:

File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/client/session.py", line 1407, in _call_tf_sessionrun
run_metadata)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Shape mismatch in tuple component 1. Expected [769,769,3], got [1024,2048,3]
[[{{node batch/padding_fifo_queue_enqueue}}]]

I tried resizing, but it did not help. When I change the crop size to [1024, 2048], the first image is processed successfully, but then the following error appears:

Traceback (most recent call last):
File "visualize.py", line 335, in
tf.app.run()
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/platform/app.py", line 125, in run
_sys.exit(main(argv))
File "visualize.py", line 320, in main
train_id_to_eval_id=train_id_to_eval_id)
File "visualize.py", line 183, in _process_batch
colormap_type=FLAGS.colormap_type)
File "/code/ShuffleNetV2/utils/save_annotation.py", line 33, in save_annotation
label, colormap_type)
File "/code/ShuffleNetV2/utils/get_dataset_colormap.py", line 389, in label_to_color_image
raise ValueError('label value too large.')
ValueError: label value too large.

Hello @thepuremind,

Could you try the following settings? It is better to use 1025 and 2049, as the model is much more accurate with odd crop dimensions (see the sketch after the command below). Also, colormap_type should be set to cityscapes. I will fix the default values in the script.

python visualize.py \
    --model_variant=shufflenet_v2 \
    --output_stride=16 \
    --vis_logdir=./vis_dir \
    --checkpoint_dir=./your_checkpoint_dir \
    --dataset=cityscapes \
    --dataset_dir=./your_dataset_dir \
    --colormap_type=cityscapes \
    --vis_crop_size=1025 \
    --vis_crop_size=2049
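
For reference, here is a minimal sketch (not taken from the repo) of the DeepLab-style convention I am assuming for the crop size, namely that it must cover the whole image and satisfy crop_size = k * output_stride + 1 for some integer k. With output_stride=16 and Cityscapes images of 1024x2048, that gives 1025 and 2049:

```python
# Sketch only: assumes the DeepLab-style crop-size convention
# crop_size = k * output_stride + 1, with the crop covering the full image.

def pick_vis_crop_size(image_dim, output_stride=16):
    """Smallest valid crop dimension that covers `image_dim`."""
    k = -(-image_dim // output_stride)          # ceil division
    return k * output_stride + 1

for dim in (1024, 2048):
    print(dim, '->', pick_vis_crop_size(dim))   # 1024 -> 1025, 2048 -> 2049
```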

Hello @thepuremind,

There was a bug in the ground-truth visualization for the ignore labels. Thank you for pointing it out. 1ca269e should fix the error.
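
The "label value too large" error comes from label_to_color_image receiving the Cityscapes ignore label (255), which lies outside the colormap. The actual fix is in 1ca269e; the following is only an illustrative sketch of the kind of handling needed, assuming 255 is the ignore label and the colormap covers the 19 Cityscapes train IDs:

```python
import numpy as np

IGNORE_LABEL = 255   # assumed Cityscapes ignore label
NUM_CLASSES = 19     # assumed number of train IDs covered by the colormap

def colorize_label(label, colormap):
    """Map a 2-D label array to RGB, painting ignore pixels black."""
    label = np.asarray(label)
    safe = np.where(label == IGNORE_LABEL, 0, label)   # temporary valid index
    colored = colormap[safe]                            # (H, W, 3) lookup
    colored[label == IGNORE_LABEL] = 0                  # ignore pixels -> black
    return colored

# Toy usage with a grayscale colormap (one gray level per class):
colormap = np.stack([np.arange(NUM_CLASSES) * 12] * 3, axis=1).astype(np.uint8)
label = np.array([[0, 5], [IGNORE_LABEL, 18]])
print(colorize_label(label, colormap))
```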

@sercant thanks!

Your comment helped solve this problem.