Couldn't obtain all predicted values when using trained models
XuanrZhang opened this issue · 3 comments
Hi @kellycochran,
When I follow the 0_generate_predictions_for_other_notebooks.ipynb script to get predicted values using trained models,
I don't receive all of the predicted values, only a multiple of the batch_size. Are there any changes I can make to obtain all predicted values?
For example, I have 7009 sequences in total, but I only got 7000 predicted values as output (with batch_size 1000). How can I obtain the last 9 values as well?
Thanks in advance.
Best,
Xuan
Hi Xuan,
One way to do this would be to set the batch_size to 1, but at the cost of efficiency. You could also modify the code inside ValGenerator
so that it loads the full dataset, including any partial batches:
- Modify the line `self.steps = lines_in_file // batchsize` inside `get_steps(..)` to be `self.steps = math.ceil(lines_in_file / batch_size)`, and `import math` elsewhere.
- Modify the expression `(batch_index + 1) * self.batchsize` inside `__getitem__(..)` to instead be `min(len(self.coords), (batch_index + 1) * self.batchsize)`. This way, there will not be an error if the final batch is smaller than batch_size.
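In case it helps to see both changes together, here is a minimal sketch of a generator with the two modifications applied. The real ValGenerator loads sequence data from a file, so the class below is a simplified stand-in; the attribute and method names (`coords`, `get_steps`, `__getitem__`) follow the description above, but everything else is assumed for illustration.

```python
import math

class ValGenerator:
    # Simplified stand-in for the real ValGenerator; it only tracks
    # indices, not actual sequence data.
    def __init__(self, coords, batch_size):
        self.coords = coords          # one entry per input sequence
        self.batch_size = batch_size
        self.steps = self.get_steps(len(coords), batch_size)

    def get_steps(self, lines_in_file, batch_size):
        # Change 1: ceil instead of floor division, so the final
        # partial batch is counted as a step.
        return math.ceil(lines_in_file / batch_size)

    def __len__(self):
        return self.steps

    def __getitem__(self, batch_index):
        start = batch_index * self.batch_size
        # Change 2: clamp the end index so the last batch can be
        # smaller than batch_size without indexing past the data.
        end = min(len(self.coords), (batch_index + 1) * self.batch_size)
        return self.coords[start:end]
```

With 7009 sequences and a batch_size of 1000, this yields 8 batches, and the final batch contains the remaining 9 items.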
Hope that helps!
Kelly
Hi @kellycochran,
Thanks a lot for your reply! I will try to modify ValGenerator.
One more question here: if I set the batch_size to 1 to make predictions, does that change the predicted values?
Best,
Xuan
A batch_size of 1 will not change the predicted values.