Specifically, I am interested in whether you have any thread running
'consumerPollLoop()' [1]. There should always be one if the worker is
assigned any of the partitions. It is also possible that the KafkaClient
itself hasn't recovered from the group coordinator error (though that is
unlikely).

[1] https://github.com/apache/beam/blob/master/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaUnboundedReader.java#L570
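For reference, checking for that thread from the worker VM might look roughly like the sketch below. This is a hedged sketch, not verified Dataflow procedure: the container name "java-streaming" comes from the advice earlier in the thread, and the sample thread dump is illustrative only so that the grep step can actually be run.

```shell
# On the worker VM you would capture a thread dump from the Java worker
# container (named "java-streaming" per the earlier reply), e.g.:
#   docker ps | grep java-streaming        # find the container id
#   docker exec <container-id> jstack 1 > /tmp/threads.txt
# Then filter the dump for the KafkaIO poll thread. The dump written below
# is a fabricated illustrative sample, so this snippet runs anywhere:
cat > /tmp/threads.txt <<'EOF'
"KafkaConsumerPoll-0" daemon prio=5 java.lang.Thread.State: RUNNABLE
        at org.apache.beam.sdk.io.kafka.KafkaUnboundedReader.consumerPollLoop(KafkaUnboundedReader.java:570)
EOF
# Zero matches in a real dump would suggest the reader's poll thread is gone:
grep -c "consumerPollLoop" /tmp/threads.txt
```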

On Tue, Sep 11, 2018 at 12:31 PM Raghu Angadi <rang...@google.com> wrote:

> Hi Eduardo,
>
> In case of any error, the pipeline should keep trying to fetch. I don't
> know about this particular error. Do you see any others after it in the
> log?
> A couple of things you could try if the logs are not useful:
>  - log in to one of the VMs and get a stack trace of the Java worker (look
> for a container called java-streaming)
>  - file a support bug or Stack Overflow question with the job id so that
> the Dataflow oncall can take a look.
>
> Raghu.
>
>
> On Tue, Sep 11, 2018 at 12:10 PM Eduardo Soldera <
> eduardo.sold...@arquivei.com.br> wrote:
>
>> Hi,
>> We have an Apache Beam pipeline running in Google Dataflow using KafkaIO.
>> Suddenly the pipeline stopped fetching Kafka messages entirely, while our
>> workers from other pipelines continued to receive Kafka messages.
>>
>> At the moment it stopped we got these messages:
>>
>> I  [Consumer clientId=consumer-1, groupId=genericPipe] Error sending fetch 
>> request (sessionId=1396189203, epoch=2431598) to node 3: 
>> org.apache.kafka.common.errors.DisconnectException.
>> I  [Consumer clientId=consumer-1, groupId=genericPipe] Group coordinator 
>> 10.0.52.70:9093 (id: 2147483646 rack: null) is unavailable or invalid, will 
>> attempt rediscovery
>> I  [Consumer clientId=consumer-1, groupId=genericPipe] Discovered group 
>> coordinator 10.0.52.70:9093 (id: 2147483646 rack: null)
>>
>> And then the pipeline stopped reading the messages.
>>
>> This is the KafkaIO setup we have:
>>
>> KafkaIO.read[String,String]()
>>   .withBootstrapServers(server)
>>   .withTopic(topic)
>>   .withKeyDeserializer(classOf[StringDeserializer])
>>   .withValueDeserializer(classOf[StringDeserializer])
>>   .updateConsumerProperties(properties)
>>   .commitOffsetsInFinalize()
>>   .withoutMetadata()
>>
>>  Any help will be much appreciated.
>>
>> Best regards,
>> --
>> Eduardo Soldera Garcia
>> Data Engineer
>> (16) 3509-5555 | www.arquivei.com.br
>>
>
