Cache processing results
asiniy opened this issue · 3 comments
Currently, all processed files (especially images) are regenerated on every request.
That's not a problem if you only have a couple of images to render on a page, but it becomes one when a page needs 20-30 images generated simultaneously: they are returned one after another, even on a powerful VPS.
In my opinion, Exfile should have a caching ability, something like:
config :exfile, Exfile,
  cache: %{
    size: "10GB",
    path: "/tmp/exfile-other" # default is "/tmp/exfile-cache-#{env}"
  }
All processed images would be stored under path, each marked with its last access time. A supervisor process would monitor the total size of the path folder; when it exceeds the limit, it would delete the oldest unused images.
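As a minimal sketch of that idea (the module name Exfile.CacheSweeper, the options, and the sweep interval are all hypothetical; Exfile has no such module today), a GenServer could periodically scan the cache folder and evict least-recently-accessed files until the folder fits the limit:

# Hypothetical eviction process; not part of Exfile.
defmodule Exfile.CacheSweeper do
  use GenServer

  @sweep_interval :timer.minutes(5)

  def start_link(opts) do
    GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  end

  @impl true
  def init(opts) do
    state = %{
      path: Keyword.fetch!(opts, :path),
      max_bytes: Keyword.fetch!(opts, :max_bytes)
    }

    schedule_sweep()
    {:ok, state}
  end

  @impl true
  def handle_info(:sweep, state) do
    sweep(state.path, state.max_bytes)
    schedule_sweep()
    {:noreply, state}
  end

  defp schedule_sweep, do: Process.send_after(self(), :sweep, @sweep_interval)

  # Delete least-recently-accessed files until the folder fits the limit.
  defp sweep(path, max_bytes) do
    files =
      path
      |> Path.join("*")
      |> Path.wildcard()
      |> Enum.filter(&File.regular?/1)
      |> Enum.map(fn file ->
        %File.Stat{size: size, atime: atime} = File.stat!(file, time: :posix)
        {file, size, atime}
      end)
      |> Enum.sort_by(fn {_file, _size, atime} -> atime end)

    total = Enum.reduce(files, 0, fn {_file, size, _atime}, acc -> acc + size end)

    Enum.reduce_while(files, total, fn {file, size, _atime}, acc ->
      if acc > max_bytes do
        File.rm(file)
        {:cont, acc - size}
      else
        {:halt, acc}
      end
    end)
  end
end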
@keichan34 WDYT?
Sorry about the late reply on this.
Since files are fingerprinted and the processing inputs are part of the URL, the requests can be cached by a normal HTTP cache (nginx and/or any pull-based CDN like CloudFront, CloudFlare, Fastly, etc.). I don't think caching outputs is in scope for Exfile when an HTTP cache can do it just as well.
@keichan34 interesting idea! Let me test it; I'll respond in ~1 week.
@asiniy you could also put nginx in front of the Elixir app and cache Exfile responses with it.
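For example, a minimal nginx proxy_cache sketch (the /attachments mount path, upstream port, cache path, sizes, and TTLs are assumptions; adjust them to your deployment):

# Hypothetical nginx cache in front of an Exfile app listening on port 4000.
proxy_cache_path /var/cache/nginx/exfile levels=1:2 keys_zone=exfile_cache:10m
                 max_size=10g inactive=7d use_temp_path=off;

server {
  listen 80;

  location /attachments/ {
    proxy_cache exfile_cache;
    proxy_cache_valid 200 7d;
    add_header X-Cache-Status $upstream_cache_status;
    proxy_pass http://127.0.0.1:4000;
  }
}

Because processed URLs are fingerprinted, cached responses never go stale, so long TTLs are safe here.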