tf.loadGraphModel not caching model to /tmp after downloading
Closed this issue · 0 comments
ntedgi commented
In EfficientNetModel, add a cache layer for models that have already been downloaded.
https://js.tensorflow.org/api/latest/#loadGraphModel
Each time the model is loaded, it is downloaded again from the remote host. Model sizes range from 26 to 170 MB, so it would be better to cache the downloaded model in the /tmp directory and then load it from there using a file URI.