cache
clewis7 opened this issue · 1 comment
clewis7 commented
Similar to the mesmerize-core cache, but store recently used models based on the params passed to `infer()` for a dataframe, so the model does not have to be constantly reloaded to do inference. We would then only need to update the normalization for each video.
kushalkolar commented
Due to the issues with mesmerize-core cache on windows, what about this cache design instead:
- A global `dict` that stores the models as something like `{"path": <model object>}`
- If `cache[desired-path]` is `None`, set the value in the `dict`
- Since this is only going to be used in `infer()`, just implement it directly in that extension function instead of making a decorator.
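A minimal sketch of this design (names like `load_model` and `get_model` are illustrative, not actual project API; real code would deserialize the model file at `path`):

```python
# Global dict keyed on model path, as proposed: {"path": <model object>}
MODEL_CACHE: dict = {}


def load_model(path: str):
    # Placeholder loader; the real implementation would load the
    # model weights from disk.
    return object()


def get_model(path: str):
    """Return the cached model for `path`, loading it on a cache miss."""
    if MODEL_CACHE.get(path) is None:
        MODEL_CACHE[path] = load_model(path)
    return MODEL_CACHE[path]
```

Inside `infer()`, the extension function would just call `get_model(path)` instead of loading the model each time; repeated calls with the same path return the same object.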