Thanks for reporting back.
Piotrek
> On 20 Feb 2020, at 21:29, John Smith wrote:
I got this response on Stack:
https://stackoverflow.com/questions/60326869/what-does-kafka-consumer-too-many-open-files-mean/60327741#60327741
On Thu, 20 Feb 2020 at 13:58, John Smith wrote:
Ok, I have 9 jobs running over 3 nodes. Most jobs are set to a parallelism of 1,
worst case 2. So let's assume the maximum parallelism would be 18.
I will try increasing the ulimit and hopefully we won't see it again...
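A quick sketch of how the current limit and the TaskManager's actual descriptor usage could be checked before raising anything (the process name `TaskManagerRunner` and the systemd unit name `flink-taskmanager.service` below are assumptions, not something stated in this thread):

```shell
# Show the soft limit for open files in the current shell
ulimit -n

# Count file descriptors currently held by a running TaskManager, if any
# ("TaskManagerRunner" is the assumed JVM main-class name to match on)
TM_PID=$(pgrep -f TaskManagerRunner | head -n1)
if [ -n "$TM_PID" ]; then
  ls /proc/"$TM_PID"/fd | wc -l
else
  echo "no TaskManager process found"
fi

# For a systemd-managed Flink service, a persistent raise would go in the
# unit file (assumed name flink-taskmanager.service):
#   [Service]
#   LimitNOFILE=65536
# then: systemctl daemon-reload && systemctl restart flink-taskmanager
```

Note that raising the limit only in an interactive shell does not affect an already-running service; the limit has to be applied to the process that actually runs the TaskManager.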
On Thu, 20 Feb 2020 at 04:56, Piotr Nowojski wrote:
But it could be a Kafka client issue on the Flink side (as the stack trace
suggests). You can just try to increase the limit of open files for Flink, or
try to identify what is opening all of those files and limit it somehow; if
it's indeed the Kafka client, maybe it can be configured to use fewer file
descriptors.
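The suggestion above, identifying which process is holding all the descriptors, can be sketched with a plain `/proc` walk (shown here instead of `lsof`, so it works even where `lsof` is not installed; descriptor counts for other users' processes require root):

```shell
# Rank processes by open-descriptor count, highest first
for d in /proc/[0-9]*; do
  pid=${d#/proc/}
  n=$(ls "$d"/fd 2>/dev/null | wc -l)
  # Print count, PID, and the first 60 chars of the command line
  [ "$n" -gt 0 ] && echo "$n $pid $(tr '\0' ' ' < "$d"/cmdline 2>/dev/null | cut -c1-60)"
done | sort -rn | head -n 10
```

If the TaskManager JVM tops the list, inspecting its `/proc/<pid>/fd` entries (many are symlinks to sockets or files) would show whether the descriptors are Kafka broker connections or something else.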
I think so too. But I was wondering whether this was the consumer or the actual
Kafka broker. This error was displayed on the Flink task node where the task
was running, and the brokers looked fine at the time.
I have about a dozen topics, all single-partition except one which has 18, so
I really doubt the partition count is the cause.
Hey, sorry, but I know very little about the KafkaConsumer. I hope that someone
else might know more.
However, did you try to google this issue? It doesn't sound like a
Flink-specific problem, but like a general Kafka issue. Also, the solution
might be as simple as bumping the limit of open files.
Hi Piotr, any thoughts on this?
On Wed., Feb. 12, 2020, 3:29 a.m. Kostas Kloudas,
wrote:
Hi John,
As you suggested, I would also lean towards increasing the number of
allowed open handles, but
for recommendation on best practices, I am cc'ing Piotr who may be
more familiar with the Kafka consumer.
Cheers,
Kostas
On Tue, Feb 11, 2020 at 9:43 PM John Smith wrote:
Just wondering, is this on the client side in the Flink job? I rebooted the
task and the job deployed correctly on another node.
Is there a specific ulimit that we should set for Flink task nodes?
org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at