dptech-corp/deepmd-pytorch

Automated batch size in inference

Closed this issue · 1 comment

Currently the batch size used in inference is 1, which makes inference very inefficient.

One could use the AutoBatchSize utility from deepmd-kit to select the batch size automatically and make inference more efficient. A sketch of the idea is given below.
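
The sketch below only illustrates the mechanism behind automated batch-size selection (grow the batch while inference succeeds, back off on out-of-memory errors); the class and method names (`SimpleAutoBatchSize`, `execute_all`, `fn`) are hypothetical and are not the actual deepmd-kit `AutoBatchSize` API, whose signature may differ.

```python
# Illustrative sketch of OOM-adaptive batched inference in PyTorch.
# Names are hypothetical; deepmd-kit's AutoBatchSize provides a similar
# mechanism but its interface may differ.
import torch


class SimpleAutoBatchSize:
    """Grow the batch size while inference succeeds; halve it on OOM."""

    def __init__(self, initial_batch_size: int = 1024, factor: float = 2.0):
        self.current_batch_size = initial_batch_size
        self.factor = factor

    def execute_all(self, fn, data: torch.Tensor) -> torch.Tensor:
        """Run `fn` over `data` in chunks, adapting the chunk size to memory."""
        results = []
        start = 0
        while start < data.shape[0]:
            end = min(start + self.current_batch_size, data.shape[0])
            try:
                with torch.no_grad():
                    results.append(fn(data[start:end]))
            except RuntimeError as e:
                # Catch CUDA out-of-memory errors; re-raise anything else.
                if "out of memory" not in str(e).lower():
                    raise
                torch.cuda.empty_cache()
                # Back off: halve the batch size and retry this chunk.
                self.current_batch_size = max(1, self.current_batch_size // 2)
                continue
            start = end
            # Tentatively grow the batch size for the next chunk.
            self.current_batch_size = int(self.current_batch_size * self.factor)
        return torch.cat(results, dim=0)
```

For example, `SimpleAutoBatchSize().execute_all(model, frames)` would evaluate a model over all frames in large chunks instead of one at a time, which is the efficiency gain this issue asks for.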