Setting number of training samples
xi-xi-xi-xi opened this issue · 4 comments
Can we set the number of training samples (e.g. 500 samples per class) in this code?
Also, did the authors of the models give you their code to refer to? If you have it, could you share it with us? Thanks.
Yes, if you put an integer after --training_sample, the toolbox will use that number of samples, e.g. --training_sample 500.
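For illustration only, here is a minimal sketch (not the toolbox's actual code) of what an integer per-class training count amounts to: drawing a fixed number of labelled pixels per class from a ground-truth array and masking them out of the test set. The function name, arguments, and the NumPy-based approach are assumptions for the example.

```python
import numpy as np

def split_fixed_count(gt, samples_per_class, ignored_label=0, seed=0):
    """Return train/test label arrays with `samples_per_class` pixels per class.

    Illustrative sketch only; the toolbox's own splitting code may differ.
    """
    rng = np.random.default_rng(seed)
    train_gt = np.zeros_like(gt)   # will hold only the selected training pixels
    test_gt = np.copy(gt)          # everything not selected stays in the test set
    for c in np.unique(gt):
        if c == ignored_label:
            continue
        indices = np.flatnonzero(gt == c)   # all pixels of class c
        rng.shuffle(indices)
        chosen = indices[:samples_per_class]
        train_gt.flat[chosen] = c
        test_gt.flat[chosen] = ignored_label
    return train_gt, test_gt

# Example: keep 500 labelled pixels per class for training
# train_gt, test_gt = split_fixed_count(gt, 500)
```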
Regarding your second question: no, the authors of the models did not share their original code with us. All models here are good-faith reimplementations using the hyperparameters from the original papers (when provided).
After I put an integer after --training_sample, execution hung for a long time on the following messages:
[Errno 99] Cannot assign requested address
on_close() takes 1 positional argument but 3 were given