Question: how can I delay a reply until an event occurs?
Closed this issue · 1 comment
Is there a possibility to delay a stream reply not by a fixed or random amount of time, but instead to send the next stream reply when a message is received in a different handler?
For example, I have a unary endpoint and a streaming endpoint, and the streaming endpoint sends the replies to requests made to the unary endpoint. So any time the unary endpoint receives a message, I want to trigger the next reply in the streaming endpoint.
So instead of this:
grpcMock.register(....)
    .withRequest(...).willReturn(stream(
        response(...).withFixedDelay(1))
        .and(...).withFixedDelay(100))
the stream reply could be delayed until a future is resolved:

CompletableFuture<Void> future = new CompletableFuture<>();
grpcMock.register(....)
    .withRequest(...).willReturn(stream(
        response(...).withFixedDelay(1))
        .and(...).delayUntil(future))
And in my unary mock I would then resolve the future when I get a request (I'm also not sure how best to do that).
Anyway, it's just an idea. Is that something that would be easy to add to GrpcMock, or is there already a way to achieve what I want?
Hi, at the moment there's no such API, and at this point I see it as a rather niche use case to implement natively. I think you should be able to achieve this using the willProxyTo methods:
withRequest(...).willProxyTo((request, responseObserver) -> {
    future.thenRun(() -> {
        responseObserver.onNext(response);
        responseObserver.onCompleted();
    });
})
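For completeness, the CompletableFuture hand-off itself can be exercised in plain Java, outside of any gRPC plumbing. This is a minimal sketch with a hypothetical `trigger` future and a log standing in for the stream observer: the reply callback is registered first and only runs once the future is completed from the "unary" side.

```java
import java.util.concurrent.CompletableFuture;

public class DelayUntilDemo {
    public static void main(String[] args) {
        CompletableFuture<Void> trigger = new CompletableFuture<>();
        StringBuilder log = new StringBuilder();

        // Streaming side: queue the reply to run only once the future resolves.
        trigger.thenRun(() -> log.append("stream reply sent"));

        // Unary side: completing the future releases the pending stream reply.
        trigger.complete(null);

        System.out.println(log); // prints "stream reply sent"
    }
}
```

Since the callback is registered before completion, `complete(null)` runs it synchronously on the completing thread, which is exactly the "next reply is triggered by the unary request" behavior described above.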
And adapt the logic inside to whatever ordering and/or number of messages you want to send before calling onCompleted().
E.g. if you're submitting the message to multiple clients and each client has its own future that it marks as completed on receiving the request:
withRequest(...).willProxyTo((request, responseObserver) -> {
    client1Future.thenRun(() -> responseObserver.onNext(client1Response));
    client2Future.thenRun(() -> responseObserver.onNext(client2Response));
    ...
    CompletableFuture.allOf(client1Future, client2Future, ...)
        .thenRun(() -> responseObserver.onCompleted());
})
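The fan-out above can be checked in plain Java as well. In this sketch the client futures and the `sent` list are placeholders for the gRPC machinery; `allOf` guarantees the completion callback only fires after every per-client future has resolved.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class MultiClientDemo {
    public static void main(String[] args) {
        List<String> sent = new ArrayList<>();
        CompletableFuture<Void> client1Future = new CompletableFuture<>();
        CompletableFuture<Void> client2Future = new CompletableFuture<>();

        // Each client's reply is released by its own future.
        client1Future.thenRun(() -> sent.add("client1Response"));
        client2Future.thenRun(() -> sent.add("client2Response"));

        // The stream completes only after every client has checked in.
        CompletableFuture.allOf(client1Future, client2Future)
            .thenRun(() -> sent.add("onCompleted"));

        client1Future.complete(null);
        client2Future.complete(null);

        System.out.println(String.join(",", sent));
        // prints "client1Response,client2Response,onCompleted"
    }
}
```

Note that passing the same future twice to `allOf` (as in the original snippet's `client1Future, client1Future`) would complete the stream before client 2 had received anything, so each client's future must appear exactly once.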