ktaebum/AttentionedDeepPaint

Error in training (num_samples = 0)


Traceback (most recent call last):
  File "train.py", line 77, in <module>
    main(parser.parse_args())
  File "train.py", line 43, in main
    batch_size=args.batch_size,
  File "G:\PYTHON\python3.5.4\lib\site-packages\torch\utils\data\dataloader.py", line 176, in __init__
    sampler = RandomSampler(dataset)
  File "G:\PYTHON\python3.5.4\lib\site-packages\torch\utils\data\sampler.py", line 66, in __init__
    "value, but got num_samples={}".format(self.num_samples))
ValueError: num_samples should be a positive integer value, but got num_samples=0
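For context, the error is raised while the DataLoader is being constructed: with shuffling enabled it builds a RandomSampler over the dataset, and RandomSampler rejects a dataset of length 0. A minimal sketch in plain PyTorch (not this repository's code) that reproduces the same message:

```python
# Minimal reproduction with plain PyTorch (not this repo's code):
# a DataLoader with shuffle=True builds a RandomSampler, which rejects
# a dataset of length 0 with exactly this ValueError.
import torch
from torch.utils.data import DataLoader, TensorDataset

empty_dataset = TensorDataset(torch.empty(0, 3))  # zero samples

try:
    DataLoader(empty_dataset, batch_size=4, shuffle=True)
except ValueError as err:
    print(err)  # num_samples should be a positive integer value, but got num_samples=0
```

So the dataset that train.py builds from the configured data path is coming back empty.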

torch version: 1.1.0
OS: Windows 10

Should I specify some value in train.py?

Yes, you need to pass the required command-line arguments when running train.py; you can refer to train.sh for an example invocation.
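In particular, check that the data directory the script ends up reading actually contains images; if that path argument is missing or wrong, the dataset has zero samples and the DataLoader fails as shown above. A quick hypothetical sanity check (the directory path and file extensions below are assumptions, not taken from train.py):

```python
# Hypothetical sanity check - the path and extensions are assumptions;
# substitute whatever directory your train.sh / train.py arguments point at.
import os

data_dir = "./data/train"
images = [f for f in os.listdir(data_dir)
          if f.lower().endswith((".png", ".jpg", ".jpeg"))]
print("found {} training images in {}".format(len(images), data_dir))
assert images, "empty dataset -> DataLoader will see num_samples=0"
```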