netty/netty-incubator-codec-http3

h2load requests are often not fully processed

DevPhilB opened this issue · 6 comments

At first I thought that the h2load client does not send a HEADERS frame,
but the h2load developer and the verbose output confirm that the frames are sent and that no response is received from the server.
The logging does not display any errors, and the Http3WebServiceHandler does not receive a HEADERS frame either.

HttpWebServer HTTP/3 code part:

// Configure SSL
QuicSslContext quicSslContext = QuicSslContextBuilder
        .forServer(certificate.key(), null, certificate.cert())
        .applicationProtocols(Http3.supportedApplicationProtocols()).build();
// Configure codec
ChannelHandler codec = Http3.newQuicServerCodecBuilder()
        .sslContext(quicSslContext)
        .maxIdleTimeout(5000, TimeUnit.MILLISECONDS)
        .initialMaxData(10000000)
        .initialMaxStreamDataBidirectionalLocal(1000000)
        .initialMaxStreamDataBidirectionalRemote(1000000)
        .initialMaxStreamsBidirectional(100)
        .maxRecvUdpPayloadSize(10000000)
        .maxSendUdpPayloadSize(10000000)
        .tokenHandler(InsecureQuicTokenHandler.INSTANCE)
        .handler(new ChannelInitializer<QuicChannel>() {
            @Override
            protected void initChannel(QuicChannel quicChannel) {
                // Called for each connection
                Http3ServerPushStreamManager pushStreamManager =
                        new Http3ServerPushStreamManager(quicChannel);
                quicChannel.pipeline().addLast(new Http3ServerConnectionHandler(
                        new ChannelInitializer<QuicStreamChannel>() {
                            // Called for each request-stream
                            @Override
                            protected void initChannel(QuicStreamChannel streamChannel) {
                                streamChannel.pipeline().addLast(
                                        new Http3WebServiceHandler(
                                                gatewayHost,
                                                persistencePort,
                                                authPort,
                                                imagePort,
                                                recommenderPort
                                        )
                                );
                                streamChannel.pipeline().addLast(new LoggingHandler(LogLevel.INFO));
                            }
                        },
                        pushStreamManager.controlStreamListener(),
                        null,
                        null,
                        false
                ));
                quicChannel.pipeline().addLast(new LoggingHandler(LogLevel.INFO));
            }
        }).build();
// Configure the server
try {
    Bootstrap bootstrap = new Bootstrap();
    Channel channel;
    String status = httpVersion + " web service is available on https://";
    if (gatewayHost.isEmpty()) {
        channel = bootstrap.group(bossGroup)
                .channel(NioDatagramChannel.class)
                .handler(codec)
                .bind(new InetSocketAddress(DEFAULT_WEB_PORT)).sync().channel();
        status += "localhost:" + DEFAULT_WEB_PORT + WEB_ENDPOINT;
    } else {
        channel = bootstrap.group(bossGroup)
                .channel(NioDatagramChannel.class)
                .handler(codec)
                .bind(new InetSocketAddress(webPort)).sync().channel();
        status += "web:" + webPort + WEB_ENDPOINT;
    }
    LOG.info(status);
    channel.closeFuture().sync();
} finally {
    bossGroup.shutdownGracefully();
}
HttpWebServer log:

13:32:40.916 [main] INFO  web.rest.server.HttpWebServer - HTTP/3 web service is available on https://web:4433/api/web
13:33:10.811 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2] REGISTERED
13:33:10.826 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] ACTIVE
13:33:10.826 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] USER_EVENT: io.netty.incubator.codec.quic.QuicStreamLimitChangedEvent@38c894c9
13:33:10.830 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}]
13:33:10.913 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] REGISTERED
13:33:10.913 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] ACTIVE
13:33:10.915 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownEvent@7f905320
13:33:10.917 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownReadComplete@15c9a6c5
13:33:10.919 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0xa1e9772f, QuicStreamAddress{streamId=2}]
13:33:10.922 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] READ COMPLETE
13:33:10.923 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0x133adc02, QuicStreamAddress{streamId=6}]
13:33:10.924 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0x076d49cc, QuicStreamAddress{streamId=10}]
13:33:10.925 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ COMPLETE
13:33:15.936 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} ! R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] INACTIVE
13:33:15.936 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (8e64a9a94ada3c0b18ef8ca0e420ee86a9d07269)[id: 0xfe518ab2, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} ! R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] UNREGISTERED
13:33:15.936 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] INACTIVE
13:33:15.936 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] UNREGISTERED
If you need more details, I'm happy to provide them.

@DevPhilB thanks for the report.

You have added the LoggingHandler after your Http3WebServiceHandler in the pipeline; that is why it does not print much.

It seems like the stream input is getting shut down as soon as it becomes active:

13:33:10.913 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] ACTIVE
13:33:10.915 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownEvent@7f905320
13:33:10.917 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x8cc4ce67, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownReadComplete@15c9a6c5

Please add the LoggingHandler to the QuicChannel as well as to the stream channel, before any other handler, and share the logs.
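
For reference, a rough sketch of that ordering, based on the initializers you posted (push-stream manager and settings arguments omitted for brevity; this is only meant to show where the LoggingHandler instances go):

ChannelHandler codec = Http3.newQuicServerCodecBuilder()
        // ... same SSL and transport configuration as above ...
        .handler(new ChannelInitializer<QuicChannel>() {
            @Override
            protected void initChannel(QuicChannel quicChannel) {
                // Log connection-level events before anything else sees them
                quicChannel.pipeline().addLast(new LoggingHandler(LogLevel.INFO));
                quicChannel.pipeline().addLast(new Http3ServerConnectionHandler(
                        new ChannelInitializer<QuicStreamChannel>() {
                            @Override
                            protected void initChannel(QuicStreamChannel streamChannel) {
                                // Log stream-level events before the request handler
                                streamChannel.pipeline().addLast(new LoggingHandler(LogLevel.INFO));
                                streamChannel.pipeline().addLast(new Http3WebServiceHandler(
                                        gatewayHost, persistencePort, authPort, imagePort, recommenderPort));
                            }
                        }));
            }
        }).build();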

Hi @NiteshKant, here you go:

10:53:23.281 [main] INFO  web.rest.server.HttpWebServer - HTTP/3 web service is available on https://web:4433/api/web
10:53:29.316 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef] REGISTERED
10:53:29.327 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] ACTIVE
10:53:29.338 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] USER_EVENT: io.netty.incubator.codec.quic.QuicStreamLimitChangedEvent@69427a7f
10:53:29.338 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0x53568f47, QuicStreamAddress{streamId=10}]
10:53:29.343 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0x390852d4, QuicStreamAddress{streamId=0}]
10:53:29.454 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] REGISTERED
10:53:29.454 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] ACTIVE
10:53:29.454 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] READ: 57B
         +-------------------------------------------------+
         |  0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f |
+--------+-------------------------------------------------+----------------+
|00000000| 01 37 00 00 51 8b 60 75 99 8f 05 8d 83 55 21 7c |.7..Q.`u.....U!||
|00000010| ff d7 50 8e 0b e2 5c 2e 3c b8 5d 79 77 9c b8 10 |..P...\.<.]yw...|
|00000020| 01 9f d1 5f 50 93 9c 54 1c 72 29 54 d3 a5 35 89 |..._P..T.r)T..5.|
|00000030| 80 ae d3 6b 81 6b f8 38 ff                      |...k.k.8.       |
+--------+-------------------------------------------------+----------------+
10:53:29.455 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] READ COMPLETE
10:53:29.455 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownEvent@7f080552
10:53:29.455 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownReadComplete@354a3cee
10:53:29.457 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0xc09b9d01, QuicStreamAddress{streamId=6}]
10:53:29.458 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ: [id: 0x7d9b6e6e, QuicStreamAddress{streamId=2}]
10:53:29.461 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} - R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] READ COMPLETE
10:53:34.471 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} ! R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] INACTIVE
10:53:34.471 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - (b2225819446b78217914db994b7a25140e6fb1dd)[id: 0x5cd2b4ef, L:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=20 cap=20]} ! R:QuicConnectionAddress{connId=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]}] UNREGISTERED
10:53:34.471 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] INACTIVE
10:53:34.471 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] UNREGISTERED

Apologies for the delayed response; I have been busy.

It's hard to say for sure what is going wrong, but my guess is that the peer is closing the stream with id 0, which closes the HTTP/3 connection:

10:53:29.454 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] READ: 57B
         +-------------------------------------------------+
         |  0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f |
+--------+-------------------------------------------------+----------------+
|00000000| 01 37 00 00 51 8b 60 75 99 8f 05 8d 83 55 21 7c |.7..Q.`u.....U!||
|00000010| ff d7 50 8e 0b e2 5c 2e 3c b8 5d 79 77 9c b8 10 |..P...\.<.]yw...|
|00000020| 01 9f d1 5f 50 93 9c 54 1c 72 29 54 d3 a5 35 89 |..._P..T.r)T..5.|
|00000030| 80 ae d3 6b 81 6b f8 38 ff                      |...k.k.8.       |
+--------+-------------------------------------------------+----------------+
10:53:29.455 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] READ COMPLETE
10:53:29.455 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownEvent@7f080552
10:53:29.455 [nioEventLoopGroup-2-1] INFO  io.netty.handler.logging.LoggingHandler - [id: 0x390852d4, QuicStreamAddress{streamId=0}] USER_EVENT: io.netty.channel.socket.ChannelInputShutdownReadComplete@354a3cee

From the above, the input shutdown events are most likely generated here, indicating that a FIN was received on that stream.
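
If you want to confirm that on your side, here is a minimal sketch (my own illustration, not part of the codec) of a handler that logs the input-shutdown event; add it early in the stream pipeline:

// Illustrative only: reports when the remote peer half-closes the request stream.
streamChannel.pipeline().addLast(new ChannelInboundHandlerAdapter() {
    @Override
    public void userEventTriggered(ChannelHandlerContext ctx, Object evt) {
        if (evt instanceof ChannelInputShutdownEvent) {
            // A STREAM frame with the FIN bit set was processed for this stream.
            System.out.println("Input shutdown on " + ctx.channel());
        }
        ctx.fireUserEventTriggered(evt);
    }
});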

Don't worry, it's great that you support open source projects in your free time 👍

I expected such an answer, because the log doesn't say much more than the previous one.

Let's assume the peer closes the entire HTTP/3 connection by closing stream 0.
Why doesn't the Http3RequestStreamInboundHandler get the 57-byte frame?
And shouldn't all requests fail in this case?

Don't worry, it's great that you support open source projects in your free time 👍

It's actually my day job 😜 ... I work on netty full time 🎉

Let's assume the peer closes the entire HTTP/3 connection by closing stream 0.

I am not saying that the peer closes the H3 connection; I am saying that we received a QUIC STREAM frame with the FIN bit set, which triggers the close sequence within netty.

Why doesn't the Http3RequestStreamInboundHandler get the 57-byte frame?

Most likely because netty processed the close before it read the data from the request stream. The control stream is not expected to be closed at any point during the lifetime of the connection, so if that is indeed what happened, all bets are off and unread data can be dropped once the QUIC connection is closed.

And shouldn't all requests fail in this case?

Not sure what your expectation is here; since the connection is closed, netty wouldn't read anything more from it. If you add a handler to the QUIC channel that intercepts channelInactive(), you will know when the connection is closed. You can also add a control-stream listener and see whether that stream is getting closed.
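
For example, a minimal sketch (an illustration based on the setup above, not a prescribed usage) that could go into your initChannel(QuicChannel):

// Illustrative only: observe when the QUIC connection goes inactive.
quicChannel.pipeline().addLast(new ChannelInboundHandlerAdapter() {
    @Override
    public void channelInactive(ChannelHandlerContext ctx) {
        System.out.println("QUIC connection closed: " + ctx.channel());
        ctx.fireChannelInactive();
    }
});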

Please reopen if this is still an issue with the latest release.