cattendu opened this issue 3 years ago · 1 comment
Hi, this is awesome work! Are there any plans to complete the implementation so that inference can run on CPU only?
Hi @cattendu,
Thank you for your interest in our work. As of now, we have no plans to support CPU-only inference. However, pull requests adding new features are always welcome.
Thank you