long2ice/fastapi-cache

when request is present, cache is disabled

schwannden opened this issue · 2 comments

Due to this line:

When request is present, cache is disabled. Why do we do so? Isn't it quite normal for a user to want to customize the key builder so that it takes in the request object?

I can create a PR to fix this, but I'm wondering why it was designed this way in the first place.
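For context, a custom key builder that uses the request is straightforward to write. Here is a sketch, assuming the call signature the decorator uses later in this thread (`func`, namespace, plus `request`/`response`/`args`/`kwargs` keywords); the name `request_key_builder` is illustrative, not part of the library:

```python
from typing import Any, Callable, Dict, Optional, Tuple

def request_key_builder(
    func: Callable,
    namespace: str = "",
    *,
    request: Optional[Any] = None,
    response: Optional[Any] = None,
    args: Tuple = (),
    kwargs: Optional[Dict] = None,
) -> str:
    # Base key: namespace plus the wrapped function's name.
    parts = [namespace, func.__name__]
    # When a request object is available, mix in method and URL so
    # different query strings get different cache entries.
    if request is not None:
        parts += [request.method, str(request.url)]
    return ":".join(parts)
```

When no request is passed (a plain function call), the builder falls back to a key derived from the function alone.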

Because @cache can be applied to a general function or method, like this:

(screenshot: @cache applied to a plain function)

When the request is present, it will run here:

await backend.set(cache_key, coder.encode(ret), expire or FastAPICache.get_expire())
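To make the point concrete, here is a toy stand-in (not the real fastapi-cache implementation) showing that a caching decorator applied to a plain async function has no request to work with, so any request handling inside the decorator must be optional:

```python
import asyncio
from typing import Any, Callable, Dict, Optional

_store: Dict[str, Any] = {}  # toy stand-in for the cache backend
calls = {"count": 0}         # track how often the wrapped function runs

def cache(func: Callable) -> Callable:
    """Toy @cache: works whether or not a request object exists."""
    async def wrapper(*args: Any, request: Optional[Any] = None, **kwargs: Any) -> Any:
        key = f"{func.__name__}:{args}:{sorted(kwargs.items())}"
        if key not in _store:
            _store[key] = await func(*args, **kwargs)
        return _store[key]
    return wrapper

@cache
async def expensive(n: int) -> int:
    calls["count"] += 1
    return n * n  # imagine a slow computation here

# Called directly as a plain function, with no Request anywhere:
first = asyncio.run(expensive(4))
second = asyncio.run(expensive(4))  # served from _store, func not re-run
```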

Pretty sure this was never an issue, but perhaps the refactoring work done in #139 makes it easier to understand when caching is applied and when not.

The request and response objects remain optional, meaning the original methods can be called directly without having to pass in a request or response object unless those methods specifically need them.

This is what the decorator does; wherever these steps reference the request or response, those parts are skipped if no request or response is present:

  • if caching is explicitly disabled (when FastAPICache.get_enabled() returns False, or the request has a Cache-Control header with the value no-store or no-cache, or the request method is not GET), then the method is called directly and the result is returned.
  • if there was no cached value in the backend for the computed cache key, the method is called and the result is stored in the backend. Headers are added to the response to provide information a browser can use to avoid receiving the response body again while the cached data is still valid.
  • otherwise, the cached data is used to produce the returned value. If the browser sent an If-None-Match header with a value that matches the cached result's ETag, a 304 Not Modified response is returned with no body, saving bandwidth.

Specifically, this is what this looks like:

copy_kwargs = kwargs.copy()
request: Optional[Request] = copy_kwargs.pop(request_param.name, None)  # type: ignore[assignment]
response: Optional[Response] = copy_kwargs.pop(response_param.name, None)  # type: ignore[assignment]

if _uncacheable(request):
    return await ensure_async_func(*args, **kwargs)

prefix = FastAPICache.get_prefix()
coder = coder or FastAPICache.get_coder()
expire = expire or FastAPICache.get_expire()
key_builder = key_builder or FastAPICache.get_key_builder()
backend = FastAPICache.get_backend()
cache_status_header = FastAPICache.get_cache_status_header()

cache_key = key_builder(
    func,
    f"{prefix}:{namespace}",
    request=request,
    response=response,
    args=args,
    kwargs=copy_kwargs,
)
if isawaitable(cache_key):
    cache_key = await cache_key
assert isinstance(cache_key, str)

try:
    ttl, cached = await backend.get_with_ttl(cache_key)
except Exception:
    logger.warning(
        f"Error retrieving cache key '{cache_key}' from backend:", exc_info=True
    )
    ttl, cached = 0, None

if cached is None:  # cache miss
    result = await ensure_async_func(*args, **kwargs)
    to_cache = coder.encode(result)
    try:
        await backend.set(cache_key, to_cache, expire)
    except Exception:
        logger.warning(
            f"Error setting cache key '{cache_key}' in backend:", exc_info=True
        )
    if response:
        response.headers.update(
            {
                "Cache-Control": f"max-age={expire}",
                "ETag": f"W/{hash(to_cache)}",
                cache_status_header: "MISS",
            }
        )
else:  # cache hit
    if response:
        etag = f"W/{hash(cached)}"
        response.headers.update(
            {
                "Cache-Control": f"max-age={ttl}",
                "ETag": etag,
                cache_status_header: "HIT",
            }
        )
        if_none_match = request and request.headers.get("if-none-match")
        if if_none_match == etag:
            response.status_code = HTTP_304_NOT_MODIFIED
            return response
    result = cast(R, coder.decode_as_type(cached, type_=return_type))

return result
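The cache-hit branch enables conditional requests: on revalidation the browser sends back the ETag it received, and a matching value yields an empty 304 instead of the full body. A self-contained sketch of just that comparison (the function name is illustrative; like the decorator above, it derives a weak ETag from hash() of the cached payload):

```python
from typing import Optional, Tuple

HTTP_304_NOT_MODIFIED = 304

def serve_from_cache(cached: bytes, if_none_match: Optional[str]) -> Tuple[int, bytes]:
    """Mimic the hit branch: (status, body) for a possibly-conditional request."""
    etag = f"W/{hash(cached)}"  # weak ETag, same scheme as the decorator
    if if_none_match == etag:
        return HTTP_304_NOT_MODIFIED, b""  # client's copy is still valid
    return 200, cached

payload = b'{"items": []}'
status, body = serve_from_cache(payload, None)        # first request: full body
revalidated = serve_from_cache(payload, f"W/{hash(payload)}")  # 304, empty body
```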