hantman-lab/animal-soup

cache

clewis7 opened this issue · 1 comment

Similar to the mesmerize-core cache, but store recently used models based on the params passed to `infer()` for a dataframe, so the model does not have to be constantly reloaded to do inference. We would then only need to update the normalization for each video.

Due to the issues with mesmerize-core cache on windows, what about this cache design instead:

  • A global dict that stores the models as something like `{"path": <model object>}`
  • If the desired path is not already a key in the cache, load the model and store it; otherwise return the cached object
  • Since this is only going to be used in `infer()`, just implement it directly in that extension function instead of making a decorator.
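A minimal sketch of the dict-based cache described above. `MODEL_CACHE`, `get_model`, and `load_model` are hypothetical names for illustration; `load_model` stands in for whatever expensive deserialization the real `infer()` extension would do.

```python
from typing import Any, Dict

# global dict mapping model path -> loaded model object
MODEL_CACHE: Dict[str, Any] = {}


def load_model(path: str) -> Any:
    """Placeholder for the real (expensive) model load from disk."""
    return object()


def get_model(path: str) -> Any:
    """Return the cached model for `path`, loading it on a cache miss."""
    if path not in MODEL_CACHE:
        MODEL_CACHE[path] = load_model(path)
    return MODEL_CACHE[path]
```

Inside `infer()`, the extension would call `get_model(path)` to fetch the (possibly cached) model and then apply the per-video normalization, avoiding a reload for every call with the same params.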