long2ice/fastapi-cache

Dogpile race solving / cache key lock & wait, maybe with aiocache?

antont opened this issue · 1 comment

antont commented

Hi, has anyone solved 'dogpiling' with fastapi-cache, or thought about how to? The dogpile.cache lib describes the problem nicely: https://dogpilecache.sqlalchemy.org/en/latest/

Briefly, with an initially empty cache, the problem occurs when multiple requests arrive for the same resource: processing of the first request has started but is not complete yet, so the result is not in the cache and the later requests start fetching the same resource in parallel. It would be better for the later requests to wait for the first operation to complete and then return the cached result.

The dogpile.cache lib is not async, but the aiocache lib is, and it has an OptimisticLock mechanism that would support this use case: https://aiocache.aio-libs.org/en/latest/locking.html#aiocache.lock.OptimisticLock
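
Roughly, going by the aiocache locking docs, the mechanism looks like this (get_expensive_value is just a placeholder for the real work, not something from the lib):

from aiocache import Cache
from aiocache.lock import OptimisticLock, OptimisticLockError

cache = Cache(Cache.REDIS)

async def fill_key(key: str):
    # The "lock" records the key's current token; the cas() write below only
    # succeeds if nobody else has modified the key in the meantime.
    async with OptimisticLock(cache, key) as lock:
        value = await get_expensive_value()  # placeholder for the expensive operation
        try:
            await lock.cas(value)
        except OptimisticLockError:
            # Another coroutine/process wrote the key first; take its value instead.
            value = await cache.get(key)
    return value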

Their tests include test_locking_dogpile, using RedLock, https://github.com/aio-libs/aiocache/blob/master/tests/acceptance/test_lock.py#L55

I'm thinking of trying that out in a fastapi-cache cache decorator, so I'm just curious whether people have thought about this or maybe have some solutions out there already.

antont commented

I added a simple initial AiocacheRedisBackend and am using it in our customized decorator. It works fine to solve dogpiling in our tests so far.

Basically, I just put the cache decorator code within:
async with RedLock(backend.cache, cache_key, lease=30):
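
To illustrate, a rough sketch of the wrapped flow (compute_response stands in for calling the decorated endpoint, and backend / cache_key are whatever the decorator already has at hand):

from aiocache.lock import RedLock

async def get_or_compute(backend, cache_key: str):
    # The first caller acquires the lock and computes the value; concurrent
    # callers for the same key wait in RedLock until it is released (or the
    # 30 s lease expires) and then find the value already cached.
    async with RedLock(backend.cache, cache_key, lease=30):
        cached = await backend.get(cache_key)
        if cached is not None:
            return cached
        result = await compute_response()  # placeholder for the decorated call
        await backend.set(cache_key, result, expire=60)
        return result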

And below is the code for the simple backend impl.

The locking could also be done within the backend class, either in the default get() or in some special get-with-lock method, but I did it this way for now (so the lock import in there is unused).

from typing import Optional
from fastapi_cache.backends import Backend
from aiocache.lock import RedLock  # OptimisticLock, OptimisticLockError
from aiocache import Cache, RedisCache

class AiocacheRedisBackend(Backend):
    """fastapi-cache backend that delegates to an aiocache RedisCache."""

    def __init__(self):
        self.cache: RedisCache = Cache(Cache.REDIS)

    async def get(self, key: str) -> Optional[str]:
        return await self.cache.get(key)

    async def set(self, key: str, value: str, expire: Optional[int] = None) -> None:
        # aiocache uses ttl (seconds) for expiration, not redis-py's ex
        await self.cache.set(key, value, ttl=expire)

    async def clear(self, namespace: Optional[str] = None, key: Optional[str] = None) -> int:
        if key is not None:
            # delete() returns the number of keys removed
            return await self.cache.delete(key)
        return await self.cache.clear(namespace=namespace)
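
For completeness, hooking it up is presumably just the usual init call (prefix is arbitrary; our customized decorator isn't shown here):

from fastapi import FastAPI
from fastapi_cache import FastAPICache

app = FastAPI()

@app.on_event("startup")
async def startup():
    # Register the aiocache-backed backend with fastapi-cache
    FastAPICache.init(AiocacheRedisBackend(), prefix="fastapi-cache")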