facebook/relay

[meta] Support real-time updates via event subscriptions

josephsavona opened this issue · 74 comments

Realtime data in GraphQL is something that we and the community are actively exploring. There are many ways to achieve "realtime" or near-realtime updates: polling, "live" queries, or event-based approaches (more on these tradeoffs on the GraphQL blog). Furthermore, there are a variety of transport mechanisms to choose from depending on the platform: web sockets, MQTT, etc.

Rather than support any one approach directly, we would prefer to allow developers to implement any of these approaches. Therefore, we don't plan to create a RelayMutation-style API for subscriptions. Instead we're working to create a "write" API that will make it easy for developers to tell Relay about new data (along the lines of store.write(query, data)). See #559 for more information.

For now, we recommend checking out @edvinerikson's relay-subscriptions module.

skevy commented

Just fwiw, I've actually started implementing this. I'm only so far into it - but I just wanted to put here that I'm actively working on it.

@skevy I'm not sure why, but I can't assign this to you directly - but thanks for the heads up, looking forward to this!

skevy commented

@josephsavona it's cuz I'm not a collaborator. Silly Github.

Is anyone at FB working on this for OSS release? Or only for internal use

Aha! No one is actively working on the OSS Subscriptions (the API described above) - so you won't conflict :-)

The "subscribe/dispose" API seems awfully imperative and fiddly to control... what about something like this where RelayContainers define subscriptions declaratively:

module.exports = Relay.createContainer(Story, {
  fragments: {
    story: () => Relay.QL`
      fragment on Story {
        text,
        likeCount
      }
    `,
  },
  subscriptions: (props) => [
    new StoryLikeSubscription({storyId: props.storyId})
  ]
});

@dallonf That's a great idea. Ultimately there has to be an imperative API somewhere: in Relay, that's Relay.Store. We do need imperative APIs for opening subscriptions and disposing them, but the API you described would be a great way to provide declarative access to those methods from containers.

Note that the query tree is static and props are unavailable: instead the API should use variables:

  subscriptions: variables => [...]
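
For illustration, a hedged sketch of what such a container might look like with variables instead of props (the subscriptions key and StoryLikeSubscription are proposals from this thread, not an existing Relay API):

module.exports = Relay.createContainer(Story, {
  initialVariables: {
    storyId: null,
  },
  fragments: {
    story: () => Relay.QL`
      fragment on Story {
        text,
        likeCount
      }
    `,
  },
  // Proposed: subscriptions derived from static variables, not props
  subscriptions: variables => [
    new StoryLikeSubscription({storyId: variables.storyId}),
  ],
});
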
skevy commented

@dallonf yah some type of declarative API to wrap it is smart. 100% agree.

@josephsavona Agreed, I would definitely need the subscribe/unsubscribe methods in some cases.

Although now I wonder if the RelayContainer is the right place for this... you'd still have to add this particular subscription to any component that requests story { likeCount }. And it kind of runs the risk of coincidence-driven development, where every component that uses likeCount benefits from just one RelayContainer in the tree that defines a subscription - but remove that component, and suddenly your real-time updates stop working!

I wonder if it would be possible to implement "live queries" on the client, so that whenever the current route contains likeCount, it automatically subscribes to every (opt-in) Subscription that could update likeCount... kind of like a fat query in reverse? This feels like one of those models which would reveal a lot of intractable edge cases as soon as you got in too deep to get back out 😛. It certainly works for the given example of `StoryLikeSubscription`, but the whole point of the Subscription thing seems to be to provide more flexibility than live queries allow. What sort of use cases would break this model?

nodkz commented

@dallonf great idea. But I recommend using a hash for subscriptions, like fragments, in order to be able to have several subscriptions and to get access to their state:

subscriptions: {
   toMasterNode: (variables) => new StoryLikeSubscription({storyId: variables.storyId}),
   toSlaveNode: (variables) => new StoryLikeSubscription2({storyId: variables.storyId}, {disabled: true}),
}

@josephsavona It would be great if we could get subscription status and the ability to manipulate subscriptions in the component:

class Story extends React.Component {
  render() {
    return (
      <div>
         <p>{this.props.story.text}</p>
         <p>{this.props.story.likeCount}</p>
         { !this.relay.subscriptions.toMasterNode.connected ?
             <p>
                   {`Not connected (${this.relay.subscriptions.toMasterNode.state})`}
                   <button onClick={()=>this.relay.subscriptions.toMasterNode.connect()}>Enable</button>
             </p> :
             <p>
                   <button onClick={()=>this.relay.subscriptions.toMasterNode.disconnect()}>Disable</button>
             </p>
         }         
      </div>
    );
  }
}
F21 commented

What's the plan in terms of the network layer for this? I would imagine that for push notifications, we would need websocket or socket.io's fallback method of long-polling or flash.

Will we be able to have real-time notifications on a websocket connection and mutations and queries still happening over http?

@F21 Because there are multiple approaches to pushing data from server to client, we will likely leave the implementation of sendSubscription up to the user. This would allow developers to, for example, use HTTP for queries/mutations and some other method for establishing subscriptions.

This is per the description:

Provide a stub implementation in RelayDefaultNetworkLayer which throws when subscriptions are requested
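
For instance, a user-space network layer could keep the default HTTP behavior for queries and mutations and add a push transport only for subscriptions. A minimal sketch, assuming a socket.io-style client; sendSubscription follows the shape discussed later in this thread, and openSocketSubscription is a hypothetical helper, not a shipped API:

// Queries/mutations keep going over HTTP via the default layer;
// only subscriptions use a push transport of your choice.
class SubscriptionNetworkLayer extends Relay.DefaultNetworkLayer {
  constructor(uri, socket) {
    super(uri);
    this._socket = socket;
  }

  sendSubscription(request) {
    // Hypothetical helper: opens the subscription over the socket and is
    // expected to return something disposable so it can be torn down later.
    return openSocketSubscription(this._socket, request);
  }
}

Relay.injectNetworkLayer(new SubscriptionNetworkLayer('/graphql', socket));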

It would actually be really cool if subscriptions had a fat query like mutations do :)

@pasviegas Good idea! Unfortunately it isn't quite so simple. Subscription queries may execute on the server at any time after the subscription is opened, which means that the "tracked" (active) queries on the client can be different between executions. While we could allow users to define a fat query for subscriptions, it would mean that the data fetched by the subscription would depend on what had been queried when the subscription was first created. In other words, it would create a non-deterministic query that would be difficult to reason about.

Requiring a static subscription query makes it clear exactly what data will be refreshed when a subscription event occurs.
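
For illustration, a static subscription query in the spirit of the earlier StoryLikeSubscription example (the storyLikeSubscribe field is hypothetical):

// The selection set is fixed up front, so every event payload refreshes
// exactly these fields - nothing depends on what else has been queried.
const storyLikeSubscription = Relay.QL`
  subscription {
    storyLikeSubscribe(input: $input) {
      story {
        id
        likeCount
      }
    }
  }
`;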

+1

Once the API is nailed down (perhaps in some pre-release form) I would love to figure out how to put this to use in the Meteor world. We have a client/server pubsub API that uses a custom JSON format, DDP, to communicate LiveQuery updates to clients. There have been lots of scalability issues with LiveQuery, and there are plans to move to a pre-write, webserver-layer, event-based approach to push changes to clients, which I assume is what Relay subscriptions would be all about.

The questions I'm interested in exploring are:

  • using Relay, would we even use our pubsub API anymore?
  • what about the DDP Json protocol?
  • or do we resolve just to the base websockets API Meteor uses?
  • exactly what, if anything, that Meteor already offers would we reuse?

As soon as I find the answers to these questions and have some base Relay subscriptions tools to use, I'd love to start implementing this for Meteor (along with something like Graffiti of course since Meteor uses Mongo). I think Meteor could be a great guinea pig given it's one of the longest-standing, most-used solutions for the whole subscription + reactivity enchilada. I personally don't even know of any other LiveQuery solutions besides RethinkDB, which isn't at the same stage as Meteor, which has been offering this as its bread and butter for approaching 4 years. Our community has a lot of developers that would be willing--rather, eager--to test this out. I also know Meteor Development Group (the company behind the framework) is seriously considering this route as well.

LiveQuery won't work for us anymore. GraphQL/Relay is really looking like the way forward to many in the Meteor community. Let me know what I can do.

@faceyspacey I would love to test your implementation.

@faceyspacey We would love to explore how to use Relay / GraphQL together with Meteor as well. Let me know if you end up going down this route further. @qimingfang FYI.

Anyone I can coordinate with on trying to help out on this?

I'm going to try and take a stab at this. I checked with @skevy and he might start next week so I'll try and communicate anything I do in case it is useful.

I spent this afternoon looking at Relay mutation code. I have a few questions:

  1. Is it desirable to keep a central reference to all subscriptions? There is the RelayMutationQueue which holds all mutations, but from what I can tell this is required for:
  • queueing collisions
  • optimistic updates get re-run multiple times (if I understand the code) so a reference is required to all mutations
  • RelayContainer can check if there are pending mutations on a record

There might be other reasons I'm missing / not understanding.

I'm not sure if subscriptions would require something similar. It could be useful for visibility and maybe some kind of mass dispose. Either way it would be easy to add them to a central map someplace if desired.

  2. For writeRelayUpdatePayload, again, just from a quick glance, it looks like it can be re-used for subscriptions. Does this sound reasonable / expected? The only issue I saw was handleRangeAdd has an invariant on clientMutationID. Could that invariant be removed and the code with RelayMutationTracker only be run when clientMutationID is in the payload?

I haven't looked at the functionality of RelayMutationTracker yet -- todo list for tomorrow -- so this might answer itself.

  3. Finally, I looked at RxJS to try and familiarize myself with the lingo. Here is rough pseudo-code for Relay#subscribe:
  subscribe(subscription, callbacks) {
    // the RelaySubscriptionObserver class does two things:
    // - add an onNext for calling writeRelayUpdatePayload
    // - enforce all the observer rules / laws / etc
    const observer = new RelaySubscriptionObserver(subscription, store /* or wuteva */, callbacks);
    const request = new RelaySubscriptionRequest(subscription, observer);

    // coerce the return value from RelayNetworkLayer#subscribe into a disposable e.g. Thing#dispose()
    const disposable = createDisposable(RelayNetworkLayer.sendSubscription(request));
    observer.setDisposable(disposable);
    return observer.getDisposable();
  }

code for RelayNetworkLayer#sendSubscription would return a function that performs unsubscribe / dispose:

  sendSubscription(request) {
    const id = 1; // placeholder ...

    const handler = response => {
      if (response.id === id) {
        if (response.data) {
          request.onNext(response.data);
        } else if (response.error) {
          request.onError(response.error);
        }
      }
    };

    socket.on(`graphql:subscription:${id}`, handler);

    // subscribe
    socket.emit('graphql:subscribe', {
      id,
      query: request.getQueryString(),
      variables: request.getVariables()
    });

    return () => {
      // unsubscribe
      socket.off(`graphql:subscription:${id}`, handler);
      socket.emit('graphql:unsubscribe', {id});
    };
  }
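
(As an aside, createDisposable in the Relay#subscribe sketch above is hypothetical; a minimal version that coerces either return shape into a {dispose} object might look like this:)

// Hypothetical helper: normalize whatever sendSubscription returns
// (a plain teardown function or a {dispose} object) into a {dispose} object.
function createDisposable(result) {
  if (typeof result === 'function') {
    return {dispose: result};
  }
  if (result && typeof result.dispose === 'function') {
    return result;
  }
  throw new Error(
    'sendSubscription must return a dispose function or a {dispose} object.'
  );
}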

Seem reasonable?

Thanks!

I have an initial implementation of subscriptions and have some questions / a request for feedback from the Relay team if possible.

  1. Relay.Subscription / Relay.Mutation code duplication:

Right now I just duplicated code from Mutation to Subscription. This is mostly ok as it's just a skeleton, but the function _resolveProps seems like logic that should be shared / not duplicated. Do you think this should be handled via extracting _resolveProps or making a base class or something? I know this is like a preference question, I just want to try and match accepted practices in Relay.

  2. clientSubscriptionId

I'm not sure why this is required.

  3. Query Building with MutationConfigs

With mutations the query is built from configs + fat query. With subscriptions it is provided. That said, the configs need to augment the query. For example, given a RANGE_ADD:

subscription {
  addTodoSubscribe(input: $input) {
    todoEdge {
      node { text complete }
    }
  }
}

The todoEdge field (edgeName) in the above query needs __typename and cursor added. They are added during the edge field creation for mutations. My assumption is that subscriptions should modify the provided query to make sure all required fields have been added. That would result in:

subscription {
  addTodoSubscribe(input: $input) {
    clientSubscriptionId
    todoEdge {
      __typename
      cursor
      node { text complete }
    }
  }
}

The logic I'm going with is:

  • clientSubscriptionId is added to everything
  • RANGE_ADD : add __typename to all edgeName fields (cursor is handled by Range.QL)
  • RANGE_DELETE / NODE_DELETE : add deletedIDFieldName to the call.

Is this reasonable?

  4. sanitizeRangeBehaviors

Should this be run against subscriptions as well? If so, would you suggest moving it from RelayMutationQuery into a different namespace as an export?

  5. breaking the PR up

A lot of this can be broken into smaller PRs. I'm happy to break things up!


Thanks. I haven't really done much OSS work so I'm not really sure of the workflow. I feel a bit blind ... just do stuff and submit a PR and see what happens? ;p

sorry for being chatty... but...

here is the commit with the work / comments in it: eyston@4405fa7

here is a stubbed implementation of a network layer: https://gist.github.com/eyston/ce723b38b1756cb5f81e

there are no tests atm, waiting on feedback on if this is sane or not ;p

thanks again~

@eyston Thanks for working on this! My apologies for the delay in our response; we're all back now after the holidays :-)

part 1

  1. Is it desirable to keep a central reference to all subscriptions?

Yes. There doesn't strictly have to be, but having a central reference to all subscriptions would allow for disposing of all subscriptions (e.g. when the Relay context is collected) or for de-duping subscriptions.
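
In user-space terms, such a registry could be as small as a map from a subscription key to its disposable. A minimal sketch (the keying scheme is an assumption):

// Minimal registry sketch: de-dupes by key (e.g. query string + variables)
// and supports disposing of everything at once.
class SubscriptionRegistry {
  constructor() {
    this._subscriptions = new Map();
  }

  add(key, open) {
    // De-dupe: reuse an existing subscription for the same key;
    // open() is expected to start the subscription and return a disposable.
    if (!this._subscriptions.has(key)) {
      this._subscriptions.set(key, open());
    }
    return this._subscriptions.get(key);
  }

  disposeAll() {
    this._subscriptions.forEach(disposable => disposable.dispose());
    this._subscriptions.clear();
  }
}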

  2. For writeRelayUpdatePayload, again, just a quick glance, it looks like it can be re-used for subscriptions. This sounds reasonable / expected?

Yup! That method was intended for handling mutation and subscription payloads. I saw that you have a PR to update the one invariant that blocks this, let's continue discussion about it there.

  3. Finally, I looked at RxJS to try and familiarize myself with the lingo.

The example you gave looks about right - RelayStore.subscribe returns a {dispose}-able (user-facing disposable), as does RelayNetworkLayer#sendSubscription (internal disposable). When the user-facing disposable is called, the framework calls the internal one to tear down the subscription on the server.

part 2

  1. Relay.Subscription / Relay.Mutation code duplication: Right now I just duplicated code from Mutation to Subscription. This is mostly ok as its just a skeleton, but the function _resolveProps seems like logic that should be shared / not duplicated.

I'm not aware of a use-case that would require Subscriptions to have their own fragments (and therefore an equivalent to resolveProps). How about we start without this feature and add it as necessary? This will keep things simple for now.

  2. clientSubscriptionId ... I'm not sure why this is required.

Yeah, seemingly this could be a network-layer concern and doesn't have to be part of the subscription query itself. Let me double check how it's used in our implementation.

  3. Query Building with MutationConfigs

Generally, subscriptions should be able to reuse all the utilities for constructing mutations queries. Let's discuss specific logic in the PR, but what you described sounds right.

  4. sanitizeRangeBehaviors

This exists only for legacy reasons - to warn users that the format has changed - so we can skip calling this for subscriptions.

  5. breaking the PR up

Breaking up the PR might help to land it, but also feel free to send a bunch of commits in one PR so that we can give high-level feedback, including how best to break it up.

I'm not aware of a use-case that would require Subscriptions to have their own fragments

Ah, okay, not sure why I assumed this.

In a small / short example app with subscriptions, I found that fragments were almost useful for deciding whether I needed to update (dispose / create a new sub) the subscription, e.g.

if (!deepEquals(nextSub.props, currentSub.props)) {
  /* dispose current sub / create next sub */
} else {
  /* do nothing, current sub is fine */
}

This ended up not being sufficient though, as I had other requirements for when to subscribe / not subscribe, so I still had to write specific logic around managing their lifecycle, e.g. subscribe to sendMessage only if channel.joined === true.
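
That kind of conditional lifecycle logic tends to live in the component. A rough sketch of the pattern, assuming a subscribe() helper that opens a subscription and returns a {dispose} object (both subscribe and SendMessageSubscription are illustrative, not Relay APIs):

class ChannelMessages extends React.Component {
  componentDidMount() {
    this._updateSubscription();
  }

  componentDidUpdate(prevProps) {
    // Re-evaluate whenever the inputs that drive the subscription change.
    if (
      prevProps.channel.id !== this.props.channel.id ||
      prevProps.channel.joined !== this.props.channel.joined
    ) {
      this._updateSubscription();
    }
  }

  componentWillUnmount() {
    this._disposeSubscription();
  }

  _updateSubscription() {
    this._disposeSubscription();
    // Subscribe to sendMessage only while the channel is joined.
    if (this.props.channel.joined) {
      this._subscription = subscribe(
        new SendMessageSubscription({channelId: this.props.channel.id})
      );
    }
  }

  _disposeSubscription() {
    if (this._subscription) {
      this._subscription.dispose();
      this._subscription = null;
    }
  }

  render() {
    return null; // render the channel's messages here
  }
}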

Generally, subscriptions should be able to reuse all the utilities for constructing mutations queries. Let's discuss specific logic in the PR, but what you described sounds right.

I'll have to look it over again, but I didn't think RelayMutationQuery could be reused since mutations build up a new query by working on the fat query and the mutation configs. With subscriptions the user supplies a query, not a fat query, so no intersection work is done. That said I found I still needed to modify the subscription query slightly due to mutation configs (add __typename to edge for instance).


I'll look over the code again and update as appropriate.

shogs commented

+1

+1

+1

Currently evaluating Relay and this came up as a requirement. So, real-time updates are not something that's implemented yet? Or are there any demos / sample code that do have this implemented somehow?

We aren't actively working on supporting subscriptions in open-source. That said, all the important pieces exist to implement real-time subscriptions if you're willing to put in some work. The simplest approach would be to set up something outside of Relay to listen for updates from the server (e.g. via a websocket), and then tell Relay that the data has changed by using e.g. RelayStore.getStoreData().handleQueryPayload(query, payload). Note that you need a query so that Relay can interpret the data - you can construct one with

Relay.createQuery(Relay.QL`query Foo { ... }`, {var: 'value'})
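
Putting the pieces together, a minimal sketch of that approach, assuming a socket.io-style client, a server that pushes {storyId, likeCount} events, and the Story/likeCount example from earlier in this thread (handleQueryPayload is an internal API, as noted above):

// Whenever the server pushes a like-count change, write it into Relay's store.
socket.on('storyLikeChanged', data => {
  const query = Relay.createQuery(
    Relay.QL`
      query {
        node(id: $storyId) {
          ... on Story {
            id
            likeCount
          }
        }
      }
    `,
    {storyId: data.storyId}
  );
  Relay.Store.getStoreData().handleQueryPayload(query, {
    node: {
      __typename: 'Story',
      id: data.storyId,
      likeCount: data.likeCount,
    },
  });
});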

@josephsavona kind of thinking out loud, but is there a way today for me to give a query and query vars, and have Relay force-fetch the data based on that query and merge it with the local cache?

The above would let me just send IDs in the websocket payload and then Relay can just fetch the required data using the query config. Wondering if this approach is an anti-pattern.

@chirag04 you can:

const node = Relay.QL`
  query {
    node(id: $channelId) {
      ... on Channel {
        joined
      }
    }
  }
`;

const query = Relay.createQuery(node, {channelId});
Relay.Store.primeCache({query}, readyState => {
  // do stuff on readyState if you care
});

Replace primeCache with (I think) forceFetch to ignore the cache and send the query in total.

You can also read from the store directly:

Relay.Store.readQuery(query);

There are also methods for reading a fragment and observing a query / fragment too but I haven't used them. I've used the above primeCache and readQuery in order to implement an onEnter handler in relay-react-router (in this case checking if a user has joined a channel or not before routing to the channel data).

Lots of cool stuff here: https://github.com/facebook/relay/blob/master/src/store/RelayEnvironment.js (exposed on Relay.Store).
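
As a concrete example of that onEnter use, a rough sketch, assuming react-router's (nextState, replace, callback) onEnter signature and a /channels redirect target (both assumptions):

function requireJoinedChannel(nextState, replace, callback) {
  const query = Relay.createQuery(
    Relay.QL`
      query {
        node(id: $channelId) {
          ... on Channel {
            joined
          }
        }
      }
    `,
    {channelId: nextState.params.channelId}
  );

  Relay.Store.primeCache({query}, readyState => {
    if (!readyState.done) {
      return; // primeCache reports intermediate states; wait for completion
    }
    const data = Relay.Store.readQuery(query)[0];
    if (!data || !data.joined) {
      replace('/channels'); // not joined: redirect instead of entering
    }
    callback();
  });
}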

@timhaines I amended my comment - we are experimenting with subscriptions and real-time updates internally (more info), but the core team is not currently working on providing a public API for subscriptions in open-source. This is in large part because it's possible to use pieces of Relay API to implement this in userspace as I outlined.

@josephsavona in which format should the payload be specified for RelayStore.getStoreData().handleQueryPayload(query, payload)? Can you please provide an example, as there is no reference for the above in the docs? Even if I provide the same query that my component has and this.props.rootQ as the payload, the next render cycle nullifies all data.

Example:

export default Relay.createContainer(ChatComponent, {
  initialVariables: {
    userId: null
  },
  fragments: {
    rootQ: () => Relay.QL `
      fragment on User {
        id
        email
        name
        picture
        chats {
          message
        }
      }
    `
  }
})

addRealtimeChat(chatMessage) {
  const query = Relay.createQuery(Relay.QL `
    query {
      user(id: $userId) {
        id
        email
        name
        picture
        chats {
          message
        }
      }
    }
  `, {
    userId: Constants.userId
  });
  const payload = this.props.rootQ;
  payload.chats.push({message: chatMessage});
  Relay.Store.getStoreData().handleQueryPayload(query, payload);
}

Okay got it working now by making payload object manually instead of directly using this.props.rootQ.

addRealtimeChat(chatMessage) {
  const query = Relay.createQuery(Relay.QL `
    query {
      user(id: $userId) {
        id
        email
        name
        picture
        chats {
          message
        }
      }
    }
  `, {
    userId: Constants.userId
  });
  const payload = {
    user: {
      id: this.props.rootQ.id,
      email: this.props.rootQ.email,
      name: this.props.rootQ.name,
      picture: this.props.rootQ.picture,
      chats: this.props.rootQ.chats
    }
  };
  payload.user.chats.push({message: chatMessage});
  Relay.Store.getStoreData().handleQueryPayload(query, payload);
}

I amended my comment - we are experimenting with subscriptions and real-time updates internally (more info), but the core team is not currently working on providing a public API for subscriptions in open-source. This is in large part because it's possible to use pieces of Relay API to implement this in userspace as I outlined.

@josephsavona out of curiosity, why are subscriptions not on the agenda to support as part of Relay's core api? Or is just a matter of priorities?

One of the main reasons why I would adopt REST instead of GraphQL at the moment is since there doesn't seem to be a solution out there that's dealt with consuming GraphQL in a reactive web app (in a consistent way). Maybe it's not that big a deal since it can be done, but it would be great to keep under the umbrella of Relay for consistency and simplicity. It also seems like a very common use case so would add a lot of value.

why are subscriptions not on the agenda to support as part of Relay's core api? Or is just a matter of priorities?

@sampeka We do plan to support subscriptions as part of the API; what I was referring to is that we don't intend to provide a high-level RelayMutation-style API as this would increase the framework size and complexity. Instead, we plan to focus on providing core primitives so that things like real-time subscriptions can be implemented in user space. There are a variety of ways that developers may need to push updates from their backend to the client, and we don't want to restrict this too early by imposing the requirements of a network layer, etc.

As I mentioned, it's possible to implement subscriptions today by calling into the handleUpdatePayload API. Ideally, the community could help inform what the core API described at #559 should look like.

We have realtime working within our Relay application. I will release how we did it within a week; it isn't very fancy and not as powerful as subscriptions, but it's a first step and something that could turn into subscriptions.

In our Relay application (with a Ruby On Rails/GraphQL backend) we simulated subscriptions using Pusher, on both backend and frontend.

@miketamis @caiosba Great to hear!

I'm also implementing this in my app. Is it possible to modify one item in a connection at the moment without having to create a query containing the whole connection?

The individual item is a node, right? You can definitely construct a node query to modify it in that case.


good point @NevilleS! Node Interface to the rescue!

@miketamis did you ever get a chance to release a write up of how you're handling real-time data? 😃

I have a similar example as @shahankit but with a twist.

I'm trying to add a new node to a previously fetched connection (my terminology could be wrong here). A single location can have multiple orders. If a new order is added to a location a websocket payload is received with the new order data.

I want to append the new order to the existing list of orders.

I've tried all kinds of different permutations of the following:

const locationQuery = Relay.createQuery(Relay.QL `
    query {
      node(id: $locationId) {
        ... on Location {
          id
          name
          orders(first: 50) {
            edges {
              node {
                id
                type
                name
              }
            }
          }
        }
      }
    }
  `, {
    locationId: this.props.location.id
  });

  const payload = {
    node: {
      __typename: "Location",
      id: this.props.location.id,
      name: this.props.location.name,
      orders: this.props.location.orders
    }
  };

  var newOrder = {
    id: websocketPayload.id,
    type: websocketPayload.type,
    name: websocketPayload.name
  }

  payload.node.orders.edges.push({ node : newOrder })
  Relay.Store.getStoreData().handleQueryPayload(locationQuery, payload);
  // Relay.Store.forceFetch({locationQuery}, readyState => {})

The above code results in this.props.location.orders becoming undefined

If I run Relay.Store.forceFetch({locationQuery}, readyState => {}) and inspect the outgoing query, it seems like Relay is adding an alias to the orders field in a format similar to _orders2RjKo3

I suspect that Relay is expecting the payload to contain the key _orders2RjKo3 and because it doesn't exist it sets this.props.location.orders = null as a result.

Assuming I am correct, how do I work around this? Is there a way to retrieve the value for the orders key that Relay is expecting in the payload object passed to Relay.Store.getStoreData().handleQueryPayload(locationQuery, payload)?

I'd love to hear if anyone has an alternative way of handling this without using forceFetch to retrieve an entire list of orders.

@alexanderlamb the _orders2RjKo3 you see in there is the serializationKey for this connection. (Hashed based on calls and values I believe).

If you already know your connection's dataId, here's a way you could find that key:

const dataId = yourConnectionId;
const storeData = Relay.Store.getStoreData();
const recordStore = storeData.getRecordStore();
recordStore.getPathToRecord(dataId).node.getSerializationKey();

Actually since you have the query already, you might be able to find the connection and call getSerializationKey() on the node to get it too.

There might be a better way to get it 💭 Let me know if the query works if you use the serializationKey in there!

@xuorig
To confirm, the dataId of my connection would be found at this.props.location.orders.__dataID__, correct? Or am I looking in the wrong place?

const dataId = this.props.location.orders.__dataID__;
console.log(dataId) // returns "client:-3139241862_first(300)"
const storeData = Relay.Store.getStoreData();
const recordStore = storeData.getRecordStore();
const record = recordStore.getPathToRecord(dataId);
console.log(record) // returns null
const key = record.node.getSerializationKey();

@alexanderlamb I just realized the dataID in the store and the dataID in the props are different. I'm a bit confused at this point. If you remove the last part _first(300) you'll get the right record, but this is beginning to be super hacky 😿

getPathToRecord('client:-3139241862')

Edit: Cleaner way of finding the ID.

const storeData = Relay.Store.getStoreData();
const recordStore = storeData.getRecordStore();

// Get the root call id ( example viewer )
const rootID = recordStore.getDataID('viewer');

// Find the connection dataID using the storageKey
const recordID = recordStore.getLinkedRecordID(rootID, 'todos{status:"any"}');

// Get Path and serializationKey
const record = recordStore.getPathToRecord(recordID);
const key = record.node.getSerializationKey();

Here's another way you could do it:

const locationQuery = Relay.createQuery(Relay.QL `
    query {
      node(id: $locationId) {
        ... on Location {
          id
          name
          orders(first: 50) {
            edges {
              node {
                id
                type
                name
              }
            }
          }
        }
      }
    }
  `, {
    locationId: this.props.location.id
  });

// Find the field corresponding to your connection by going through the AST

locationQuery.getChildren()[2].getSerializationKey(); // Find the actual index

@alexanderlamb I've not tried this on connection types but there's one more thing you can try; not sure, but it worked for me.

const locationQuery = Relay.createQuery(Relay.QL `
    query {
      node(id: $locationId) {
        ... on Location {
          id
          name
          orders(first: 50) {
            edges {
              node {
                id
                type
                name
              }
            }
          }
        }
      }
    }
  `, {
    locationId: this.props.location.id
  });

  const orders = {
    edges: this.props.location.orders.edges.slice()
  }

  var newOrder = {
    id: websocketPayload.id,
    type: websocketPayload.type,
    name: websocketPayload.name
  }

  orders.edges.push({ node : newOrder })

  const payload = {
    node: {
      __typename: "Location",
      id: this.props.location.id,
      name: this.props.location.name,
      orders: orders
    }
  };

@alexanderlamb It sounds like you're trying to do a mutation (adding an item to a list). For that, you can use handleUpdatePayload.

Again, remember that handleQueryPayload and handleUpdatePayload are internal APIs. We don't recommend using them for most cases (hence they'll remain undocumented for a while), but if you need real-time subscriptions today and you're willing to do a bit of digging, try out those APIs and let us know how they work for you.

What is the current status of this? Are real-time updates supported yet? If so, where can I find the related docs?

The last half of this talk brings up this topic: http://youtu.be/ViXL0YQnioU

@m64253 Cool, @stream and @defer also seem awesome. Any ETA for any of these to be released?
Is there any interim solution for implementing live updates in the meantime?

Cool, @stream and @defer also seem awesome. Any ETA for any of these to be released?

@papigers These are experimental features that we are still exploring. For an interim approach to real-time subscriptions, see my comment in this thread.

I got real-time working by using the mutation api.
I plan on releasing the code (and example) when I have cleaned it up a little bit.

The current API lets you work with an API similar to mutations.

new AddTodoSubscription({ viewer: this.props.viewer });

class AddTodoSubscription extends RelaySubscriptions.Subscription {
  static fragments = {
    viewer: () => Relay.QL`
    fragment on User {
      id
      totalCount
    }`,
  };
  getSubscription() {
    return Relay.QL`subscription {
      addTodoSubscription {
        clientMutationId
        todoEdge {
          __typename
          node {
            __typename
            id
            text
            complete
          }
        }
        viewer {
          id
          totalCount
        }
      }
    }`;
  }
  getVariables() {
    return {};
  }
  getConfigs() {
    return [{
      type: 'RANGE_ADD',
      parentName: 'viewer',
      parentID: this.props.viewer.id,
      connectionName: 'todos',
      edgeName: 'todoEdge',
      rangeBehaviors: () => 'append',
    }];
  }
}

PoC

taion commented

Is this an official Facebook thing? 😄

@taion Nope, I'm just playing in the wild 😄

Edit
I'm not an FB employee

nodkz commented

@edvinerikson incredible thing!

I planned to implement this after finishing graphql-compose, as middleware on the server side for the schema, and on the client side via react-relay-network-layer as websocket middleware for the next major release.

So I am looking forward to your code release. You'll save me a lot of time with your working solution. You are my idol this week!

@edvinerikson, I'd love to see how you've implemented what you have, even before a release. I'm relatively new to Relay, and I am not familiar with the source yet. I am about to attempt to implement subscriptions in Relay myself. Seeing your modifications would help me see the affected parts of Relay, and allow me to make the same or similar changes to get a temporary solution into my project much, much faster.

Thanks and let me know!

An initial version is available at edvinerikson/relay-subscriptions. Feel free to do whatever you want with it (PRs very welcome 😄). I am happy to answer any questions you have about it in the new repo.

taion commented

Given that #1298 is closed and that relay-subscriptions is a library, is there anything additional to track here?

@taion We get a fair number of questions about this, so I think leaving this open for now makes sense. I've updated the description though, to make it clear that we are not actively pursuing a full subscriptions API within the core.

nodkz commented

@josephsavona if not subscriptions, then I think you are almost ready to release 1.0.0 with new features, better performance, and the new mutation API.

Most of all I am waiting for the new mutation API, because everything else is quite comfortable. For me, Relay is a better store keeper than the Redux/Apollo alternatives.

taion commented

@josephsavona

Looking through #1298 and discussing with @edvinerikson – would you be okay with merging the scaffolding for subscription support in #1298?

I mean specifically https://github.com/facebook/relay/pull/1298/files#diff-320b6df8cf530a681d201c75772401eaR163, https://github.com/facebook/relay/pull/1298/files#diff-3a2c3b1ea174f413b5118b1aac4ecc2eR115, and a skeletal implementation of RelayEnvironment#subscribe that just throws.

This would allow actually implementing subscriptions in user space while still maintaining first-class API support.

Right now, with relay-subscriptions, an additional HoC is required to inject subscription support into components, which feels unnecessary given that there already is a Relay environment and a Relay container.

@taion Is the main reason for adding those to avoid an extra HOC for those components that have subscriptions? If that's the case, this can probably be handled purely in user space. You could, for example, create a RelaySubscriptionContainer.create(Component, subscriptions, spec) function that delegates to Relay.createContainer (so that users only have to write one wrapper function instead of two). The extra HOC should be trivial in practice, given that relatively few containers would have subscriptions.

Please let me know if I'm overlooking something though!
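
A rough sketch of that kind of user-space wrapper, following the create(Component, subscriptions, spec) shape suggested above; withSubscriptions is an assumed user-space HoC that opens the given subscriptions on mount and disposes them on unmount:

const RelaySubscriptionContainer = {
  create(Component, subscriptions, spec) {
    // Wrap the component with subscription lifecycle handling (user space),
    // then delegate the fragments/variables spec to the regular container.
    const Subscribed = withSubscriptions(Component, subscriptions);
    return Relay.createContainer(Subscribed, spec);
  },
};

// Usage: one wrapper call instead of two.
module.exports = RelaySubscriptionContainer.create(
  Story,
  variables => [new StoryLikeSubscription({storyId: variables.storyId})],
  {
    fragments: {
      story: () => Relay.QL`
        fragment on Story {
          text
          likeCount
        }
      `,
    },
  }
);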

taion commented

That works.

One more thing – it looks like babel-relay-plugin has a special carve-out for clientSubscriptionId on subscription payloads.

It seems easier all around to keep track of subscription identity in the network layer. There's no bookkeeping (like with mutation queues) that I'm aware of that requires clientSubscriptionId at the library level.

In fact, #1298 doesn't implement clientSubscriptionId.

Am I missing any benefit that in fact does obtain from using clientSubscriptionId?

I did not include clientSubscriptionId, because I ended up just handling it in the network layer (as josephsavona pointed out could be done earlier in this thread: #541 (comment)). In my full implementation, it was only used in the network layer to correlate requests and responses for Primus (this may not be necessary for other transports that handle this correlation for you, like socket.io).

jg123 commented

I am toying with the idea of transforming our existing mutation payloads into subscription payloads. That way, only one payload would need to be maintained instead of two. I am currently able to take a mutation class and either extract a RelayMutationQuery object or the actual GraphQL string. Neither of these representations gets me to the point where I can transform it into the kind of object that is created when passing a template string into the Relay.QL function, which can be used to create a subscription object via the RelayQuery.Subscription.create method. Any ideas? I can paste code examples if you need more context for what I am attempting.

@jg123 quick note for now: check out http://npmrepo.com/relay-subscriptions or http://npmrepo.com/primus-graphql

I am using primus-graphql in prod for https://codeshare.io

jg123 commented

Thanks @tjmehta. I have been borrowing heavily from the relay-subscriptions project, and primus looks nice and concise. What I really want to do is use my current mutation payloads as subscription payloads, so I only need to maintain the mutations.

If you use fragments you can share that fragment between the mutation and subscription: ${TodoMutation.getFragment('todo')}


jg123 commented

Thanks @edvinerikson. Maybe a mutation doesn't translate to a subscription. The fat query intersection is dynamic based on previously retrieved data and doesn't make sense in the context of a more static subscription payload.

I'm closing this conversation thread.

The great news is that Relay Modern supports both GraphQL subscriptions and "live" queries via an Observable-based network layer!