soroushmehr/sampleRNN_ICLR2017

Arbitrary integer index to training, validation and test arrays?

vonpost opened this issue · 0 comments

In /datasets/music/_2npy.py, on lines 33-35, we create the numpy files used as datasets, and the boundaries of each split are hardcoded:

np.save('music_train.npy', np_arr[:-2*256])
np.save('music_valid.npy', np_arr[-2*256:-256])
np.save('music_test.npy', np_arr[-256:])

The problem is that any dataset yielding an array with fewer than 512 entries makes np_arr[:-2*256] empty, so music_train.npy ends up containing an empty array.

The paper suggests an 88:6:6 partition for the three sets. Couldn't we do this with something like numpy.split, so that we get the correct proportions no matter the size of the array? Or am I missing something that requires the indices to be hardcoded as above?
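
For example, here's a rough sketch of a percentage-based split using numpy.split. The np_arr placeholder stands in for the array built earlier in _2npy.py, and the 0.88/0.06 fractions are just the ratio suggested in the paper:

import numpy as np

# Stand-in for the array built earlier in _2npy.py.
np_arr = np.zeros((300, 8000))  # a short dataset (fewer than 512 rows)

# Derive the split points from the array length (88% / 6% / 6%),
# so even a short dataset yields three non-empty partitions.
n = len(np_arr)
train_end = int(n * 0.88)
valid_end = train_end + int(n * 0.06)

# np.split with a list of indices returns np_arr[:train_end],
# np_arr[train_end:valid_end], and np_arr[valid_end:].
train, valid, test = np.split(np_arr, [train_end, valid_end])

np.save('music_train.npy', train)
np.save('music_valid.npy', valid)
np.save('music_test.npy', test)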