thu-coai/DA-Transformer

Can the output model be transformed to ONNX format?


dlkht commented

As the title says.

Not supported yet.
I have a plan to achieve quick and effective deployment, but it is not my highest priority (I am still pursuing higher quality). PRs are welcome.

dlkht commented

Thanks for your answer.
Can the inference model also run on the Android platform?

@dlkht I am afraid that is not in my future plans.

I want to mention that NAT is mainly designed to exploit the parallel computing ability of GPUs, so it may not bring as large a speedup on CPUs or mobile devices. (See Table 1 in https://arxiv.org/pdf/2205.10577.pdf.)