jamesplease/bestfetch

Caching

jamesplease opened this issue · 2 comments

At this point in time, I'm thinking I won't add response caching to this lib, but what would it look like if I did?

  1. a new option alongside dedupe, fetchPolicy. This determines how a request interacts with the cache
  2. it would somehow need to communicate that a cached response is being returned. I don't think Promises are powerful enough to support that without modifying the response. I guess I could do res.fromCache = true, which isn't much worse than the res.data that I'm already doing (see the sketch after this list).
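A rough sketch of how that could look from the caller's side. The fetchPolicy values and the fromCache flag are hypothetical here, taken from the ideas above rather than from a shipped API:

```js
import { bestfetch } from 'bestfetch';

// Hypothetical: fetchPolicy decides whether the cache is checked
// before the network is hit.
bestfetch('/api/books/2', { fetchPolicy: 'cache-first' }).then(res => {
  if (res.fromCache) {
    // This response came from the cache; no network request was made.
    console.log('from cache:', res.data);
  } else {
    console.log('from the network:', res.data);
  }
});
```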

Update 1: This approach works fine for requests that deal with a single operation, but it doesn't work when a request can contain multiple operations, like GraphQL queries. I could build a system that supports multiple operations, and it would be a superset of the functionality described above, but that would certainly be a weird addition to a "thin wrapper around fetch."
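To make the problem concrete, here's an illustrative GraphQL request that bundles two independent reads into one HTTP call; a cache keyed on the whole request can't serve or invalidate either one on its own (the endpoint and query are made up for the example):

```js
// One HTTP request, two logically separate operations inside it.
// Request-level caching can only store or reuse them together.
bestfetch('/graphql', {
  method: 'POST',
  body: JSON.stringify({
    query: `
      {
        book(id: "2") { title }
        author(id: "7") { name }
      }
    `
  })
});
```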


Update 2: The above is correct in two regards:

  1. the described system of caching does not support GraphQL
  2. it doesn't feel as much like a thin wrapper around fetch

However, the current request deduplication also isn't sophisticated enough for GraphQL (or, really, any system where one request can contain multiple operations). This library will simply never be well-suited to sophisticated handling of GraphQL (or, say, a JSON API bulk operations extension).

So this library could include caching as well; bulk-operation fetch features should just live in a separate lib.

Update 3:

I'm going to add "basic" caching to this lib, then evaluate the code changes necessary to support an array alternative. If it's too much, I'll make a separate lib.
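A minimal sketch of what request-level "basic" caching could look like internally. The key construction and Map storage here are assumptions for illustration, not the implementation that actually landed:

```js
// Illustrative only: cache whole responses, keyed by the request.
const responseCache = new Map();

function getCacheKey(url, options = {}) {
  // Everything that affects the response goes into the key.
  return JSON.stringify([url, options.method || 'GET', options.body || null]);
}

function cachedFetch(url, options = {}) {
  const key = getCacheKey(url, options);
  if (responseCache.has(key)) {
    // Serve from the cache and flag the result accordingly.
    return Promise.resolve({ ...responseCache.get(key), fromCache: true });
  }
  return fetch(url, options)
    .then(res => res.json().then(data => ({ data, fromCache: false })))
    .then(result => {
      responseCache.set(key, result);
      return result;
    });
}
```

This also shows why GraphQL breaks the model: the key covers the whole request body, so the individual operations inside it can't be cached or invalidated separately.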

Resolved in #28. Will be released as 3.0