About the quantization of recurrent neural networks
Closed this issue · 2 comments
larenzhang commented
This is nice work benchmarking quantization methods for various CNN architectures. The recurrent neural network is another mainstream architecture, widely used as a time-series model on edge devices. Quantization can also be applied to recurrent networks such as LSTMs and GRUs. I am curious whether there is any plan to benchmark the quantization of recurrent neural networks.
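For concreteness, here is a minimal, hypothetical sketch (plain Python, not from this benchmark's codebase) of uniform affine weight quantization, the basic operation such a benchmark would apply to each weight matrix of an LSTM or GRU just as it does for a CNN layer; the function name and parameters are illustrative assumptions:

```python
# Illustrative sketch of uniform (affine) quantization of a weight vector.
# The same per-tensor scheme applies to LSTM/GRU weight matrices, flattened.
# Names here are hypothetical, not taken from any particular library.

def quantize(weights, num_bits=8):
    """Map float weights to integers in [0, 2**num_bits - 1], then dequantize."""
    qmin, qmax = 0, 2 ** num_bits - 1
    w_min, w_max = min(weights), max(weights)
    # Scale maps the float range onto the integer grid; guard against a
    # degenerate all-equal tensor where the range collapses to zero.
    scale = (w_max - w_min) / (qmax - qmin) or 1.0
    zero_point = round(qmin - w_min / scale)
    # Quantize: scale, shift, round, and clamp to the integer range.
    q = [max(qmin, min(qmax, round(w / scale) + zero_point)) for w in weights]
    # Dequantize back to floats; the round-trip error is at most ~scale.
    dq = [(v - zero_point) * scale for v in q]
    return q, dq

q, dq = quantize([-1.0, -0.5, 0.0, 0.5, 1.0], num_bits=8)
```

Benchmarking RNN quantization would then measure how this round-trip error, accumulated across recurrent time steps, affects task accuracy.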
Looking forward to your response.
Best wishes!
wimh966 commented
Hi, we do not currently plan to support RNNs. However, we do plan to support transformer models, which are also popular in the NLP field.
github-actions commented
This issue has not received any updates in 120 days. Please reply to this issue if it is still unresolved!