Request-level caching
smiley opened this issue · 10 comments
As more and more issues with making this object-oriented become apparent, a suggested solution would be to cache the actual API calls: if the exact same call is made before a specified time has passed, a cached copy of the API response should be returned instead.
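To make the idea concrete, here is a minimal sketch of what such a request-level TTL cache could look like. All names here (RequestCache, get, put) are hypothetical, not part of the project's actual code:

```python
import time

class RequestCache:
    """Hypothetical sketch: cache raw API responses keyed by request,
    expiring entries after a fixed time-to-live."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (timestamp, response)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        timestamp, response = entry
        if time.time() - timestamp > self.ttl:
            # Entry is older than the TTL: drop it and report a miss.
            del self._store[key]
            return None
        return response

    def put(self, key, response):
        self._store[key] = (time.time(), response)
```

A caller would check get() before issuing the real API request and put() the raw response on a miss, so every object built from the same endpoint shares one cached payload.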
Certainly better than Singletons and two-way binding things...
Hi, sorry, I just started looking at this project. What about using https://github.com/mailgun/expiringdict for caching the requests?
That's actually what I'm currently using for caching object properties. What I proposed in this issue is to move caching from properties to requests, so I could cache the raw data used by many objects rather than each object's finished, parsed product.
I would really like this. :) Or is there a workaround to simulate this behavior?
What about using something like https://github.com/reclosedev/requests-cache ?
@zoidbergwill I'll have a look at it. Thanks!
I'll happily make a PR with it, if you like?
You can, but from the looks of it I'll have to rework requests-cache's storage into something memory-based, since it uses SQLite. (And this caching will be temporary, so something as heavy as a persistent database is overkill.)
Yeah, there are easier ways to do it without the library, if you want in-memory caching.
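For instance, one library-free approach (a hypothetical sketch, not code from this project) is a small TTL decorator built on the standard library's functools.lru_cache, using a coarse time bucket so entries expire roughly every ttl_seconds:

```python
import functools
import time

def ttl_cache(ttl_seconds, maxsize=128):
    """Hypothetical sketch: memoize a function in memory, invalidating
    results roughly every ttl_seconds by folding a time bucket into
    the cache key."""
    def decorator(fn):
        @functools.lru_cache(maxsize=maxsize)
        def cached(bucket, *args):
            # 'bucket' changes every ttl_seconds, forcing a fresh call.
            return fn(*args)

        @functools.wraps(fn)
        def wrapper(*args):
            return cached(time.time() // ttl_seconds, *args)
        return wrapper
    return decorator
```

This only works for hashable arguments and expires on bucket boundaries rather than exactly ttl_seconds after each call, but for short-lived request caching that is usually good enough.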
That makes sense since it doesn't really need to store it between instances.
SQLite should be able to run in memory (filename = ":memory:"); unfortunately, they seem to force any name entered to become "XXX.sqlite" (even though the docstring for DbDict says not to):
self.keys_map = DbDict(location + extension, 'urls')
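For reference, plain sqlite3 from the standard library does accept the special ":memory:" filename directly and never touches disk; it's only requests-cache's filename handling that gets in the way. A quick illustration (table and column names here are made up):

```python
import sqlite3

# ":memory:" creates a purely in-memory database -- no file on disk,
# and everything vanishes when the connection is closed.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE urls (key TEXT PRIMARY KEY, response TEXT)")
con.execute("INSERT INTO urls VALUES (?, ?)", ("/api/user", "cached body"))
row = con.execute(
    "SELECT response FROM urls WHERE key = ?", ("/api/user",)
).fetchone()
```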
Regardless, @smiley, they do have an in-memory cache (the base one), so that might work for your needs:
https://github.com/reclosedev/requests-cache/blob/master/requests_cache/backends/base.py
and it should be accessible via requests_cache.install_cache('', backend='base').
Can I help with this in any way? Testing for example?