[ https://issues.apache.org/jira/browse/KAFKA-8789?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16905652#comment-16905652 ]
Raman Gupta commented on KAFKA-8789:
------------------------------------

And the same behavior for the regular console consumer:

{code:java}
confluent-5.3.0 $ time bin/kafka-console-consumer <...> --from-beginning --max-messages 1 --timeout-ms 15000
[2019-08-12 16:57:04,777] ERROR Error processing message, terminating consumer process:  (kafka.tools.ConsoleConsumer$)
org.apache.kafka.common.errors.TimeoutException
Processed a total of 0 messages
1.97user 0.23system 0:31.48elapsed 7%CPU (0avgtext+0avgdata 150260maxresident)k
0inputs+0outputs (0major+34637minor)pagefaults 0swaps
{code}

> kafka-console-consumer needs bigger timeout-ms setting in order to work
> -----------------------------------------------------------------------
>
>                 Key: KAFKA-8789
>                 URL: https://issues.apache.org/jira/browse/KAFKA-8789
>             Project: Kafka
>          Issue Type: Bug
>          Components: tools
>    Affects Versions: 2.3.0
>            Reporter: Raman Gupta
>            Priority: Major
>
> I have a topic with about 20,000 events in it. When I run the following
> tools command using Kafka 2.
>
> bin/kafka-avro-console-consumer \
>   --bootstrap-server $KAFKA --property schema.registry.url=$SCHEMAREGISTRY \
>   --topic $TOPICPREFIX-user-clickstream-events-ui-v2 \
>   --from-beginning --max-messages 100 \
>   --isolation-level read_committed --skip-message-on-error \
>   --timeout-ms 15000
>
> I get 100 messages as expected.
>
> However, when running the exact same command using Kafka 2.3.0, I get
> org.apache.kafka.common.errors.TimeoutException, and 0 messages processed.
>
> The version of Kafka on the server is 2.3.0.
>
> NOTE: I am using the Confluent distribution of Kafka for the client-side
> tools, specifically Confluent 5.0.3 and Confluent 5.3.0. I can certainly try
> to replicate with a vanilla Kafka if necessary.

--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
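For readers unfamiliar with what `--timeout-ms` controls: the console consumer aborts if a poll window passes with no records, which is why a slow first fetch (e.g. while group/offset metadata is being set up) produces `TimeoutException` with "Processed a total of 0 messages" even though data exists. The following is a simplified Python model of that timeout logic, not the actual Scala implementation in `kafka.tools.ConsoleConsumer`; the function and variable names are illustrative only:

```python
class TimeoutException(Exception):
    """Stand-in for org.apache.kafka.common.errors.TimeoutException."""


def consume(poll, max_messages, timeout_ms):
    """Rough model of the console consumer loop: keep polling until
    max_messages records arrive, but abort with TimeoutException as soon
    as one poll window (timeout_ms) returns nothing."""
    received = []
    while len(received) < max_messages:
        batch = poll(timeout_ms)  # returns a (possibly empty) list of records
        if not batch:
            raise TimeoutException("no records received within %d ms" % timeout_ms)
        received.extend(batch)
    return received[:max_messages]


# Mimic the reported failure: the first poll window yields nothing
# (e.g. the fetch is still warming up), so the consumer gives up.
polls = iter([[], ["event-1"]])
try:
    consume(lambda t: next(polls), max_messages=1, timeout_ms=15000)
except TimeoutException:
    print("Processed a total of 0 messages")
```

Under this model, a regression that makes the first fetch slower than `timeout_ms` explains why the same command works on one client version and fails on another: raising `--timeout-ms` papers over it, but the underlying fetch latency is the real variable.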