When I trained the code on Windows, it reported the error: '>' not supported between instances of 'NoneType' and 'int'
WhiteJiang opened this issue · 8 comments
Traceback (most recent call last):
  File "D:/SiamMask/tools/train_siammask.py", line 296, in <module>
    main()
  File "D:/SiamMask/tools/train_siammask.py", line 170, in main
    train(train_loader, dist_model, optimizer, lr_scheduler, args.start_epoch, cfg)
  File "D:/SiamMask/tools/train_siammask.py", line 188, in train
    for iter, input in enumerate(train_loader):
  File "D:\Anaconda\envs\mask\lib\site-packages\torch\utils\data\dataloader.py", line 314, in __next__
    batch = self.collate_fn([self.dataset[i] for i in indices])
  File "D:\Anaconda\envs\mask\lib\site-packages\torch\utils\data\dataloader.py", line 314, in <listcomp>
    batch = self.collate_fn([self.dataset[i] for i in indices])
  File "D:\SiamMask\datasets\siam_mask_dataset.py", line 551, in __getitem__
    search_mask = (cv2.imread(search[2], 0) > 0).astype(np.float32)
TypeError: '>' not supported between instances of 'NoneType' and 'int'
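For context on what the traceback means: cv2.imread does not raise an exception when a file cannot be opened, it returns None, and comparing None with 0 is what produces this TypeError in Python 3. So the real problem is that the mask path passed to cv2.imread in siam_mask_dataset.py does not point to a readable image. A minimal sketch of a guarded version of the failing line, assuming search[2] is the mask file path used there (the load_mask helper name is made up for illustration):

import cv2
import numpy as np

def load_mask(mask_path):
    # cv2.imread returns None (without raising) when the file is missing or unreadable;
    # "None > 0" is exactly what triggers the TypeError shown in the traceback above.
    mask_img = cv2.imread(mask_path, 0)  # 0 = load as grayscale
    if mask_img is None:
        raise FileNotFoundError('Cannot read mask image: {}'.format(mask_path))
    return (mask_img > 0).astype(np.float32)

# In siam_mask_dataset.py the failing call corresponds roughly to:
# search_mask = load_mask(search[2])

A check like this does not fix the missing file, but it reports which path is broken instead of failing with an opaque TypeError.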
I have the same problem. Have you solved it?
I did not have this error when I ran it on a Linux system; maybe you should run it on Linux.
Thank you for the answer, but I am using Ubuntu and this problem still occurs.
I have the same problem.
Has anyone found a solution?
search_mask = (cv2.imread(search[2], 0) > 0).astype(np.float32)
TypeError: '>' not supported between instances of 'NoneType' and 'int'
I have the same problem
TypeError: '>' not supported between instances of 'NoneType' and 'int'
It is probably a dataset problem; the data should be reprocessed (re-cropped) according to the dataset preparation steps in SiamMask.
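One way to confirm this before re-cropping everything: walk the cropped dataset directory and try to read every mask file with cv2.imread; any path that comes back as None is a crop that is missing or corrupted. A rough standalone check, assuming the masks are stored as .png files under a crop directory (crop_dir and the glob pattern are placeholders for your own layout):

import glob
import os
import cv2

crop_dir = 'data/crop511'  # placeholder: point this at your cropped dataset
bad = []
for path in glob.glob(os.path.join(crop_dir, '**', '*.png'), recursive=True):
    if cv2.imread(path, 0) is None:  # None means OpenCV could not read this file
        bad.append(path)

print('{} unreadable mask files'.format(len(bad)))
for p in bad[:20]:  # print the first few offenders
    print(p)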
I have the same problem
TypeError: '>' not supported between instances of 'NoneType' and 'int'
It is probably a dataset problem; the data should be reprocessed (re-cropped) according to the dataset preparation steps in SiamMask.
Thanks, I cropped the dataset again and this problem disappeared.