tkem/cachetools

Consider adding async cache decorators

Closed this issue · 8 comments

tkem commented

See e.g. #192, #172, #137, #92.

For starters, there's https://github.com/tkem/cachetools/tree/wip/async, and thankfully @hephex has created https://github.com/hephex/asyncache (#112).

However, since cachetools now only supports Python >= 3.5, maybe it's time to integrate this.

khorn commented

👍

Since I wasn't satisfied with other libraries' APIs and was already using cachetools successfully for the sync part of my codebase, I implemented a solution to this ticket in the linked branch.

It takes the approach of having one set of decorators (`@cached` and `@cachedmethod`) that works with both `def` and `async def`, instead of increasing the library's public API surface.

Let me know if an approach with different decorators would be preferred (I'm happy to contribute the code for that as well).

tkem commented

Sorry for the late response, please see my comments in #234!

Hey, hello.
If I understood your answer in the (now abandoned) PR above correctly, the only problem with the proposed solution was that the synchronous and asynchronous decorators should be separated to avoid ambiguity?
If that's the case, wouldn't it be enough to take the code from chrisglass, separate the decorators (which should be easy, since the code is already split by an `if inspect.iscoroutinefunction(func)`) and drop the pytest tests?

I was very disappointed to see that this lib doesn't support async yet, so I'm willing to spend a few hours on a PR if needed, but I wouldn't want to head in the wrong direction from the start :)

There are multiple Cache implementations that use Redis. Simply inspecting the decorated function is not enough, because the cache operations themselves could still insert a blocking call in the middle of the async code.

In this case, the current `cache[k] = v` would connect and send data to Redis.

For full async support, the Cache itself should support async operations.

`await cache.set(k, v)`

Hi all. I did a bit of research on available caching solutions in the Python ecosystem, and cachetools seems to have the widest support of those I found. The only missing bit is async support. I would be OK with using asyncache, but that one doesn't implement an async `cachedmethod`. And if I were to consider adding it, I'd much rather write a PR for proper async support in cachetools instead.

With the increasing popularity of FastAPI, which makes async dead easy to use, I think the need for async caching will only grow over time. There seem to be several people keen to add the support. So may I ask you @tkem, what are the constraints for having an implementation approved?

From what I gathered:

  • No additional testing dependency
  • Explicit async package or explicit async decorators (to prevent bike-shedding, could you please make an executive decision and tell us which one, and what the name should be? I care more about the support than the name, TBH)

Did I miss anything?

tkem commented

@radeklat: Thanks for your interest in contributing to cachetools! Due to time and resource constraints, I finally decided I have to concentrate on maintaining cachetools "core functionality". This means that anything that could be implemented in a separate package, especially new decorators and cache implementations, will not be added to cachetools anytime soon.

So I'd suggest for improved async support you either try and team up with @hephex to work on https://github.com/hephex/asyncache, or, if that doesn't work out for some reason, start your own async cache project.

Please note that I added a Related Projects section to the README, so if you start some project based on or extending cachetools, please drop me a line or a PR so I can add this, too.

Thank you @tkem for the quick response and the clarification. That works for me 👍