In browser inference support
Closed this issue · 1 comment
AmitMY commented
Thanks for this toolkit,
Is there any setting for which one would be able to create a model file that is executable inside the browser?
(For TensorFlow repositories, this would be Keras Layers or tflite models, for example.)
salvacarrion commented
This library handles the preprocessing, training, and evaluation for you, without imposing any specific architecture. Therefore, you should be able to use the model of your choice, such as a "lite-model4browser" (as long as it is a PyTorch model).
P.S. Keep in mind that it is at a very early stage of development (1 month old).