Query about speed of inference on semantic segmentation.
Opened this issue · 3 comments
zwbx commented
Hi, thanks for your nice work.
I find inference with the semantic segmentation code a little slow: the speed is less than 1 image/s on a single V100.
Is this normal, or is there something I can do about it?
Thanks :-)
daeunni commented
Same here
daeunni commented
@zwbx Hi, did you figure out what the main reason is?
I think it is because CoTTA applies multi-scale augmentation (scale factors like [0.5, 1.0, 1.5, ...]) when running inference on each image.
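To illustrate why this dominates the runtime: with multi-scale plus horizontal-flip test-time augmentation, every image needs one forward pass per (scale, flip) combination instead of a single pass. Below is a minimal sketch of that pattern; the dummy model, the scale list, and the nearest-neighbour resize helper are all stand-ins for illustration, not CoTTA's actual code.

```python
import numpy as np

SCALES = [0.5, 0.75, 1.0, 1.25, 1.5, 1.75]  # assumed scale factors

def resize_to(arr, nh, nw):
    """Nearest-neighbour resize of a 2-D array (placeholder for real interpolation)."""
    h, w = arr.shape
    rows = np.arange(nh) * h // nh
    cols = np.arange(nw) * w // nw
    return arr[rows][:, cols]

class DummyModel:
    """Stand-in segmentation model that counts its forward passes."""
    def __init__(self, n_classes=2):
        self.n_classes = n_classes
        self.forward_calls = 0

    def __call__(self, image):
        self.forward_calls += 1
        h, w = image.shape
        return np.random.rand(self.n_classes, h, w)  # fake per-pixel logits

def multiscale_predict(model, image, scales=SCALES, flip=True):
    """Average logits over all (scale, flip) variants, then take argmax.

    Cost: len(scales) * (2 if flip else 1) forward passes per image,
    versus a single pass without test-time augmentation.
    """
    h, w = image.shape
    accum = np.zeros((model.n_classes, h, w))
    for s in scales:
        nh, nw = max(1, int(h * s)), max(1, int(w * s))
        scaled = resize_to(image, nh, nw)
        variants = [scaled, scaled[:, ::-1]] if flip else [scaled]
        for i, inp in enumerate(variants):
            logits = model(inp)
            if i == 1:
                logits = logits[:, :, ::-1]  # undo the horizontal flip
            # resize each class map back to the original resolution
            accum += np.stack([resize_to(c, h, w) for c in logits])
    return accum.argmax(axis=0)

model = DummyModel()
pred = multiscale_predict(model, np.random.rand(64, 64))
print(model.forward_calls)  # 12 forward passes for one image (6 scales x 2 flips)
```

With 6 scales and flipping, each image costs 12 forward passes (some at 1.5-1.75x resolution, which are even more expensive than the base pass), so per-image latency an order of magnitude above single-pass inference is expected.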
zwbx commented
Yes, that explains it. There is nothing wrong, then.