cpu inference
AK391 opened this issue · 2 comments
Thanks for releasing the code. Is inference on CPU going to be very slow or not possible?
Hi @AK391, thanks for your interest. CPU inference is possible, but it is much slower than on a GPU.
The installation for CPU usage is the same, though: if tensorflow-gpu does not detect a GPU, it will fall back to the CPU.
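For reference, you can check what TensorFlow detects before starting the optimization. A minimal sketch, assuming a TF 2.x install (older versions expose `tf.test.is_gpu_available()` instead):

```python
import tensorflow as tf

# List the GPUs TensorFlow can actually use.
gpus = tf.config.list_physical_devices("GPU")
if gpus:
    print(f"Found {len(gpus)} GPU(s): {gpus}")
else:
    print("No GPU detected - the optimization will run on the CPU.")
```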
I just installed it locally and started the optimization process on CPU. I will report the duration here once it has finished.
We also provide a Colab notebook where you can use a GPU; the same goes for the drawing app.
It took close to 3 hours on the CPU (an Intel(R) Xeon(R) E5-2630) to run this example stylization. The example uses a resolution of 1024; using 512 will likely cut the duration in half. But we really recommend using a GPU. Hope that helps!
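If you want to reproduce a CPU-only timing on a machine that does have a GPU, one option is to hide the CUDA devices from TensorFlow before importing it. A minimal sketch (the `CUDA_VISIBLE_DEVICES` variable is standard CUDA, not specific to this repo):

```python
import os

# Hide all CUDA devices so TensorFlow falls back to the CPU.
# Must be set before TensorFlow is imported.
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

import tensorflow as tf

assert not tf.config.list_physical_devices("GPU")
print("Running on CPU only.")
```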