sorenbouma/keras-oneshot

training own dataset

Opened this issue · 1 comment

I have no issue running load_data.py on the Omniglot dataset, and was able to run SiameseNet.ipynb.
However, it's only when I switch to my own dataset that I noticed you use np.stack, which errors out when the number of images in each folder differs from the next folder. Is this a requirement for the Siamese network to work properly? How can I get past this?

  File "load_data.py", line 64, in <module>
    X,y,c=loadimgs(train_folder)
  File "load_data.py", line 61, in loadimgs
    X = np.stack(X)
  File "C:\Users\Noel Tam\AppData\Local\conda\conda\envs\py36\lib\site-packages\numpy\core\shape_base.py", line 353, in stack
    raise ValueError('all input arrays must have the same shape')
ValueError: all input arrays must have the same shape
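For anyone hitting the same error: np.stack requires every array in the list to have an identical shape, so a per-class list with different image counts cannot be stacked into one (n_classes, n_examples, h, w) array. A minimal sketch of the failure and one possible workaround (trimming every class to the smallest image count; the class names here are made up for illustration):

```python
import numpy as np

# Ragged case: two "classes" with different numbers of 105x105 images.
class_a = np.zeros((20, 105, 105))  # 20 images
class_b = np.zeros((15, 105, 105))  # 15 images

try:
    X = np.stack([class_a, class_b])  # shapes (20, ...) vs (15, ...) differ
except ValueError as e:
    print(e)  # all input arrays must have the same shape

# One workaround: trim every class to the smallest image count,
# so the stacked array has shape (n_classes, n_min, h, w).
n_min = min(class_a.shape[0], class_b.shape[0])
X = np.stack([class_a[:n_min], class_b[:n_min]])
print(X.shape)  # (2, 15, 105, 105)
```

Trimming throws data away; alternatives are to oversample the smaller classes or to keep X as a plain Python list and draw pairs from it directly instead of stacking.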

load_data.py line 48:

  image = np.array(Image.fromarray(imageio.imread(image_path)).resize([300, 300]))

I use 300 by 300 pixels for my images (the default in the .ipynb file is 105 by 105), and because of a version problem in scipy I used Pillow (PIL) to read images.
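A small sketch of that loading step done with Pillow alone (no imageio round-trip), which also guarantees every returned array has the same shape so np.stack succeeds; the function name and grayscale conversion are assumptions, not part of the original loader:

```python
import numpy as np
from PIL import Image

def load_image(image_path, size=(105, 105)):
    # Read with Pillow (avoids the removed scipy.misc.imread) and resize,
    # so every loaded array has the same shape for np.stack.
    image = Image.open(image_path).convert("L")  # grayscale, like Omniglot
    return np.array(image.resize(size))

# Demo: write a 300x300 image to disk, then load it back at 105x105.
Image.fromarray(np.zeros((300, 300), dtype=np.uint8)).save("demo.png")
img = load_image("demo.png")
print(img.shape)  # (105, 105)
```

Note that Pillow's resize takes (width, height), while the resulting NumPy array is indexed (height, width); for square sizes like these the distinction doesn't matter.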