ValueError: all the input arrays must have same number of dimensions, but the array at index 0 has 2 dimension(s) and the array at index 1 has 1 dimension(s)
nalrasheed opened this issue · 2 comments
Hello,
Thank you for your prompt response.
Can you please help me with this error?
cnn size:5710784
ctc head size:5710784
enc size:2497024
cdec size:1604636
cenc size:2504192
Training:
binarize: False
/users/noufcs/.local/lib/python3.6/site-packages/torch/nn/functional.py:718: UserWarning: Named tensors and all their associated APIs are an experimental feature and subject to change. Please do not use them for anything important until they are released as stable. (Triggered internally at /pytorch/c10/core/TensorImpl.h:1156.)
return torch.max_pool2d(input, kernel_size, stride, padding, dilation, ceil_mode)
Testing at epoch 5
Traceback (most recent call last):
File "train_words.py", line 542, in <module>
evaluate_setting(setting, dataset, gpu_id)
File "train_words.py", line 451, in evaluate_setting
test_funcs.test(epoch, test_loader, models, setting_dict, gpu_id, logger)
File "/mydata/exp/Seq2Emb/test_funcs.py", line 41, in test
tdecs = np.concatenate(tdecs)
File "<__array_function__ internals>", line 6, in concatenate
ValueError: all the input arrays must have same number of dimensions, but the array at index 0 has 2 dimension(s) and the array at index 1 has 1 dimension(s)
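For reference, a minimal illustration of why the concatenation fails (the shapes below are only examples, not taken from the actual run): batches with more than one image keep two dimensions after squeeze(), while a batch of one collapses to a 1-D array, and np.concatenate refuses to mix the two.

```python
import numpy as np

# 2-D per-batch output, e.g. (batch, seq_len)
full_batch = np.zeros((8, 120))
# A batch of one image with its batch dimension squeezed away -> 1-D
last_batch = np.zeros((120,))

np.concatenate([full_batch, last_batch])
# ValueError: all the input arrays must have same number of dimensions ...
```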
From the error, I am just guessing that the dataloader is providing images and text to the network. Check whether all image crops have the same size. If they differ, try to bring them to the same size, either by resizing in the data loader or by manually resizing and saving all the image crops to disk; a minimal sketch follows below.
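A minimal sketch of resizing in the data loader with torchvision, assuming the crops are PIL images; the target size (64, 256) and the dataset class name are only placeholders.

```python
from torchvision import transforms

# Resize every crop to a fixed (height, width) before it reaches the network.
transform = transforms.Compose([
    transforms.Resize((64, 256)),  # example target size, adjust to the model
    transforms.ToTensor(),
])

# dataset = MyCropDataset(..., transform=transform)  # hypothetical dataset class
```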
Sorry for the late response. The concatenation itself should work without a problem for normal batches. The "guilty" operation is the squeeze on line 38 (tdec = o.argmax(2).permute(1, 0).cpu().numpy().squeeze()). If the batch size is greater than 1, this works perfectly, but if a batch contains only one image (and I suppose this is the case), squeeze() also drops the batch dimension, leaving a 1-D array that later breaks the concatenation. To fix this, you can substitute the squeeze call with reshape(img.size(0), -1).
Thank you for bringing this case to my attention.
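For anyone hitting the same problem, a self-contained sketch of the suggested change to line 38; o and img stand in for the decoder output and input batch from the test loop, and the shapes below are only illustrative.

```python
import torch

# Dummy decoder output for a batch of one image: (seq_len, batch, classes).
o = torch.randn(120, 1, 80)
img = torch.zeros(1, 1, 64, 256)  # the corresponding input batch

# Original line 38: squeeze() drops the batch dimension when batch size is 1.
tdec_squeezed = o.argmax(2).permute(1, 0).cpu().numpy().squeeze()
print(tdec_squeezed.shape)  # (120,)  -> 1-D, breaks np.concatenate later

# Suggested fix: reshape keeps an explicit batch dimension, so every batch
# yields a 2-D array of shape (batch, seq_len).
tdec = o.argmax(2).permute(1, 0).cpu().numpy().reshape(img.size(0), -1)
print(tdec.shape)  # (1, 120)
```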