confluentinc/kafka-rest

Timeout error with a very large data payload

Purus opened this issue · 1 comments

Purus commented

I am using Apache HttpClient v4 to perform POST requests on /v3/clusters/<clusterId>/topics/<topic>/records.

For smaller payloads, everything works as expected, but when I perform a POST request with around 90 MB of data, I get the exception below in the kafka-rest logs.

Strangely, I get a 200 response code even though the response body is {"error_code":500,"message":"Internal Server Error"}.

Is this something that can be configured in Kafka REST?

Caused by: java.util.concurrent.TimeoutException: Idle timeout expired: 30000/30000 ms
        at org.eclipse.jetty.io.IdleTimeout.checkIdleTimeout(IdleTimeout.java:171)
        at org.eclipse.jetty.io.IdleTimeout.idleCheck(IdleTimeout.java:113)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        ... 1 more
        Suppressed: java.lang.Throwable: HttpInput failure
                at org.eclipse.jetty.server.HttpInput.failed(HttpInput.java:895)
                at org.eclipse.jetty.server.HttpConnection$BlockingReadCallback.failed(HttpConnection.java:665)
                at org.eclipse.jetty.io.FillInterest.onFail(FillInterest.java:140)
                at org.eclipse.jetty.io.AbstractEndPoint.onIdleExpired(AbstractEndPoint.java:407)

Kafka itself is already configured to handle this data volume; our Kafka producer works with the same large payloads. The issue is only with kafka-rest.
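For reference, accepting ~90 MB records usually requires raising the standard Kafka size limits on the broker and producer as well. A sketch of the relevant settings (the values here are illustrative, not taken from our deployment):

```properties
# Broker (server.properties) - illustrative values for ~90 MB records
message.max.bytes=100000000
replica.fetch.max.bytes=100000000

# Producer
max.request.size=100000000
```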

Below is the client code for the POST request:

RequestConfig requestConfig = RequestConfig.custom()
        .setConnectTimeout(60000)
        .setStaleConnectionCheckEnabled(false)
        .build();

try (CloseableHttpClient client = HttpClients.custom()
        .setDefaultRequestConfig(requestConfig)
        .setDefaultConnectionConfig(ConnectionConfig.custom().setBufferSize(900000).build())
        .setConnectionTimeToLive(1, TimeUnit.MINUTES)
        .setKeepAliveStrategy((response, context) -> 600000)
        .build()) {

    HttpPost httpPost = new HttpPost(URL);

    StringEntity entity = new StringEntity(JsonUtils.toJson(record));
    httpPost.setEntity(entity);
    httpPost.setHeader("Accept", "application/json");
    httpPost.setHeader("Content-type", "application/json");

    return client.execute(httpPost);
}
Purus commented

The settings below solved it.

          - name: KAFKA_REST_CLIENT_ZK_SESSION_TIMEOUT_MS
            value: "1000000"      
          - name: KAFKA_REST_IDLE_TIMEOUT_MS
            value: "1000000"      
          - name: KAFKA_REST_METRICS_SAMPLE_WINDOW_MS
            value: "1000000"
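For non-container deployments, the same values should be settable in kafka-rest.properties, assuming the standard Confluent env-var mapping (strip the KAFKA_REST_ prefix, lowercase, and turn underscores into dots):

```properties
# Assumed kafka-rest.properties equivalents of the env vars above
client.zk.session.timeout.ms=1000000
# idle.timeout.ms is the Jetty idle timeout behind the "Idle timeout expired: 30000/30000 ms" exception
idle.timeout.ms=1000000
metrics.sample.window.ms=1000000
```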