stackshareio/graphql-cache

Add means of forcing through request

thebadmonkeydev opened this issue · 1 comment

I'm curious if there's a way we can add a means to force cache misses with a request from the front end. For instance, take an object like a profile with a collection of objects: it's likely someone would want that collection cached for most reads, but would want a real-time result when adding objects to the resource.

It looks like we may be able to support some kind of special context value like `context[:force_cache] = true` and skip the cache altogether when that value is set.
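As a rough sketch of the idea (the helper below and the `force_cache` handling are illustrative assumptions, not the gem's current API), the cached field resolution could consult the query context before reading from the store:

```ruby
# Hypothetical sketch: force a cache miss when the caller sets context[:force_cache] = true.
# Rails.cache.fetch(force: true) skips the read but still writes the fresh value back.
def resolve_with_cache(cache_key, context)
  Rails.cache.fetch(cache_key, force: context[:force_cache]) do
    yield # resolve the field for real
  end
end

# resolve_with_cache("profiles/#{object.id}/social_posts", context) { object.social_posts.to_a }
```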

Transcript of a conversation between @jeromedalbert and myself about this topic:

With the stuff for company profiles, we needed to nuke caching on `social_posts`, but technically only in the case of profile edits...everywhere else should probably be cached (most reads at least).  So I'm pulling it until I can get around to https://github.com/Leanstack/graphql-cache/issues/28

[...]

jerome [5:37 PM]
so you want something like `field :my_field, cache: true, invalidate_when: :modified`? (edited)

michael [5:38 PM]
not sure how possible that is because a mutation would be a totally different context than a query...I was thinking of adding support for something like this:
```ruby
StackshareSchema.execute("{ foo { id } }", context: { force_reload: true })
```
that way it can be triggered off a param or header in the GraphQL controller
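To illustrate how that could be wired up on the Rails side (the header and param names here are assumptions, and variables/operation name handling is omitted for brevity):

```ruby
# app/controllers/graphql_controller.rb (illustrative sketch only)
class GraphqlController < ApplicationController
  def execute
    context = {
      # let the frontend request fresh data via a header or a param
      force_reload: request.headers["X-Force-Reload"].present? || params[:force_reload].present?
    }

    render json: StackshareSchema.execute(params[:query], context: context)
  end
end
```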

jerome [5:38 PM]
ok

michael [5:39 PM]
allow the frontend to tell me when they want fresh data, vs. trying to figure it out on the backend

jerome [5:39 PM]
so that would force reload only the fields present in that query? (edited)
not all the fields of the queried object

michael [5:40 PM]
it would be any fields for that request

jerome [5:40 PM]
ok

michael [5:41 PM]
I'm thinking there are also options for a version-based caching scheme where the key is based on an updated-at field from the parent object
still a young solution for sure
that does assume AR for the most part, but it's not too hard to allow a custom method to produce a hash or something

jerome [5:44 PM]
http://guides.rubyonrails.org/caching_with_rails.html#low-level-caching
> Notice that in this example we used the cache_key method, so the resulting cache key will be something like products/233-20140225082222765838000/competing_price. cache_key generates a string based on the model’s id and updated_at attributes.
that sounds interesting
auto invalidation when updated
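For reference, the example from that section of the guide looks like this (on newer Rails versions the method is `cache_key_with_version`, but the idea is the same):

```ruby
class Product < ApplicationRecord
  def competing_price
    # The key embeds the record's id and updated_at, so updating the product
    # naturally abandons the stale entry instead of requiring explicit invalidation.
    Rails.cache.fetch("#{cache_key}/competing_price", expires_in: 12.hours) do
      Competitor::API.find_price(id)
    end
  end
end
```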

michael [5:45 PM]
that's essentially what I want to emulate/integrate with

jerome [5:45 PM]
`field :my_field, cache: { key: 'your cache key here, maybe same syntax as Rails cache_key, maybe we can allow the value #{cache_key} for Rails apps' }` (edited)

michael [5:46 PM]
I've technically got a key prefix option https://github.com/Leanstack/graphql-cache#usage that could be used like that

jerome [5:47 PM]
cool, I think it would be nice to have it based on updated_at. Not sure how that would work for collections

michael [5:47 PM]
I also have to update SS to the newest version of the gem...none of those options on the field definition exist in the version we have, just `cache: true`

jerome [5:47 PM]
but this way there's no need to send a special request with context “force reload please”

michael [5:47 PM]
yeah that's true
collections being changed can "touch" their parent in AR so that the updated_at is modified when the collection is...but that couples us a little tighter to AR

jerome [5:48 PM]
yes
(but that works :P)
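A minimal sketch of that `touch` approach, assuming a SocialPost model that belongs to a Profile (the model names are just for illustration):

```ruby
class SocialPost < ApplicationRecord
  # Creating, saving, or destroying a post bumps the profile's updated_at,
  # which changes the profile's cache_key and so invalidates any key derived from it.
  belongs_to :profile, touch: true
end
```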

michael [5:48 PM]
but maybe we don't care about that and just assume the user deals with it
there is a snag in that, kind of...the field def is parsed at load, whereas the value we need will only be available on resolution...We could pass a proc:
```ruby
field :my_field, cache: { key: Proc.new { object.cache_key } }
```
and just resolve the proc during resolution
I like something like this too:
```ruby
field :my_field, cache: { key_method: :cache_key }
```

jerome [5:53 PM]
yes

michael [5:53 PM]
and `send` the key method to the object during resolution

jerome [5:53 PM]
yeah, we can make the syntax sexy and terse enough with things like that

michael [5:54 PM]
I'm not a huge fan of procs, at least explicit ones

jerome [5:54 PM]
`field :my_field, cache: { key: -> { object.cache_key } }`  :troll: (edited)

michael [5:54 PM]
looks like javascript :rolling_on_the_floor_laughing:

jerome [5:54 PM]
“What are your thoughts on ES6 REST/spread operator?”
leaves channel
`field :my_field, cache: { key: :cache_key }`
if the value of `key` is a symbol or a string, maybe send it to the object at runtime, otherwise if it's a proc/lambda, execute it (edited)
or maybe allow strings
`field :my_field, cache: { key: "#{cache_key}/competing_price" }` just like Rails
anyways, I like a solution along those lines better, it definitely looks possible (although I have no idea about the internals of graphql-ruby right now)
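A rough sketch of how that dispatch could look at resolution time (hypothetical helper and option names, not the gem's current behavior; the fallback assumes an AR-style `cache_key`):

```ruby
# Turn a field's cache: { key: ... } option into a concrete key for the resolved object.
def resolve_cache_key(key_option, object)
  case key_option
  when Symbol then object.public_send(key_option)      # cache: { key: :cache_key }
  when Proc   then object.instance_exec(&key_option)   # cache: { key: -> { cache_key } }
  when String then key_option                          # literal key string, used as-is
  else object.cache_key                                 # assumed AR-style default
  end
end
```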