Automated batch size in inference
Closed this issue · 1 comment
wanghan-iapcm commented
Currently the batch size used in inference is 1, which makes inference very inefficient.
One may utilize the AutoBatchSize class
in deepmd-kit to provide automated batch-size selection and make inference more efficient.
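A minimal sketch of the automated batch-size idea, not the actual deepmd-kit AutoBatchSize API: grow the batch size while evaluation succeeds, and back off when an out-of-memory error is hit. The names `forward`, `run_inference`, and `is_oom_error` are illustrative assumptions for this example.

```python
def is_oom_error(e: RuntimeError) -> bool:
    # Heuristic: treat any RuntimeError mentioning memory as OOM.
    return "out of memory" in str(e).lower()

def run_inference(forward, data, initial_batch_size=1024, factor=2.0):
    batch_size = initial_batch_size
    results = []
    start = 0
    while start < len(data):
        try:
            # Try the current batch; on success, advance and grow the
            # batch so it converges to the largest size that fits.
            results.append(forward(data[start : start + batch_size]))
            start += batch_size
            batch_size = int(batch_size * factor)
        except RuntimeError as e:
            if not is_oom_error(e):
                raise  # re-raise anything that is not an OOM failure
            # Out of memory: halve the batch size and retry this chunk.
            batch_size = max(1, batch_size // 2)
    return results
```

Compared with a fixed batch size of 1, this amortizes per-call overhead over large batches while still degrading gracefully on memory-limited devices.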
njzjz commented
See also the PyTorch implementation.
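For a PyTorch backend, the OOM check can inspect the exception type directly; a minimal sketch (the helper name `is_oom_error` is again illustrative, not necessarily what deepmd-kit uses):

```python
import torch

def is_oom_error(e: Exception) -> bool:
    # PyTorch >= 1.13 raises torch.cuda.OutOfMemoryError (a
    # RuntimeError subclass); older versions raise a plain
    # RuntimeError whose message mentions "CUDA out of memory".
    if isinstance(e, getattr(torch.cuda, "OutOfMemoryError", ())):
        return True
    return isinstance(e, RuntimeError) and "CUDA out of memory" in str(e)
```

After catching an OOM, calling `torch.cuda.empty_cache()` before retrying releases PyTorch's cached allocator blocks, which makes the smaller retry batch more likely to fit.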