RE: timeout error while connecting to Kafka

2019-08-26 Thread Eyal Pe'er
Hi,
Brief update.
I tried to run the same code, but this time against another Kafka cluster that I
have, where the version is 0.11.
The code runs fine, without the timeout exception.

In conclusion, it seems the problem occurs only when consuming events from
Kafka 0.9. Currently, I have no idea how to solve it.
If anyone has been able to consume events from Kafka 0.9, please let me know.

Best regards
Eyal Peer / Data Platform Developer
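For reference, the broker-version-specific connectors ship as separate artifacts, so if the cluster really is 0.9, the matching dependency would presumably be the 0.9 connector rather than the 0.11 one. A sketch, with the artifact name assumed from the Flink 1.5.x naming scheme:

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.9_2.11</artifactId>
    <version>1.5.0</version>
</dependency>
```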


RE: timeout error while connecting to Kafka

2019-08-25 Thread Eyal Pe'er
Nope, I submitted it through the Flink JobManager itself by running flink run -c 
  sandbox.jar

Best regards
Eyal Peer / Data Platform Developer


Re: timeout error while connecting to Kafka

2019-08-25 Thread miki haiat
I'm trying to understand: did you submit your jar through the Flink web UI,
and then get the timeout error?


RE: timeout error while connecting to Kafka

2019-08-25 Thread Eyal Pe'er
What do you mean by “remote cluster”?
I tried to run the dockerized Flink distribution
(https://ci.apache.org/projects/flink/flink-docs-stable/ops/deployment/docker.html)
on a remote machine and to submit a job that is supposed to communicate with
Kafka, but I still cannot access the topic.


Best regards
Eyal Peer / Data Platform Developer


Re: timeout error while connecting to Kafka

2019-08-25 Thread miki haiat
Did you try to submit it to a remote cluster?



RE: timeout error while connecting to Kafka

2019-08-25 Thread Eyal Pe'er
The replication factor is 1; that is the case for most of my topics.
Is it a problem to consume events from non-replicated topics?

Best regards
Eyal Peer / Data Platform Developer


Re: timeout error while connecting to Kafka

2019-08-25 Thread Yitzchak Lieberman
What is the topic replication factor, and how many Kafka brokers do you have?
I was facing the same exception when one of my brokers was down and the
topic had no replica (replication_factor=1).


RE: timeout error while connecting to Kafka

2019-08-25 Thread Eyal Pe'er
BTW, the exception that I see in the log is: ERROR 
org.apache.flink.runtime.rest.handler.job.JobDetailsHandler   - Exception 
occurred in REST handler…
Best regards
Eyal Peer / Data Platform Developer

From: Eyal Pe'er 
Sent: Sunday, August 25, 2019 2:20 PM
To: miki haiat 
Cc: user@flink.apache.org
Subject: RE: timeout error while connecting to Kafka

Hi,
I removed that dependency, but it still fails.
The reason I used Flink 1.5.0 is that I followed a training which used it
(https://www.baeldung.com/kafka-flink-data-pipeline).
If needed, I can change it.

I’m not sure, but maybe in order to consume events from Kafka 0.9 I need to
connect to ZooKeeper instead of the bootstrap servers?
I know that in Spark Streaming we consume via ZooKeeper ("zookeeper.connect").
I saw that in the Apache Flink Kafka connector, zookeeper.connect is only required for
Kafka 0.8, but maybe I still need to use it?
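As far as I understand, for Kafka 0.9 and later the consumer reads metadata through the broker bootstrap list, and zookeeper.connect is only read by the 0.8 consumer. A minimal sketch of the properties this implies (broker addresses and group id are placeholders):

```java
import java.util.Properties;

public class ConsumerProps {

    // Builds the minimal property set a 0.9+ Flink Kafka consumer needs:
    // a broker bootstrap list and a consumer group id. No zookeeper.connect.
    public static Properties kafka09Props(String bootstrapServers, String groupId) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", bootstrapServers);
        props.setProperty("group.id", groupId);
        return props;
    }

    public static void main(String[] args) {
        Properties props = kafka09Props("broker1:9092,broker2:9092", "flinkPOC");
        System.out.println(props.getProperty("bootstrap.servers"));
    }
}
```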
Best regards
Eyal Peer / Data Platform Developer

From: miki haiat
Sent: Thursday, August 22, 2019 2:29 PM
To: Eyal Pe'er
Cc: user@flink.apache.org
Subject: Re: timeout error while connecting to Kafka

Can you try to remove this from your pom file?

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.7.0</version>
</dependency>

Is there any reason why you are using Flink 1.5 and not the latest release?


Best,

Miki

On Thu, Aug 22, 2019 at 2:19 PM Eyal Pe'er wrote:
Hi Miki,
First, I would like to thank you for the fast response.
I rechecked Kafka and it is up and running fine.
I’m still getting the same error (Timeout expired while fetching topic
metadata).
Maybe my Flink version is wrong (the Kafka version is 0.9)?


<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-core</artifactId>
    <version>1.5.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
    <version>1.5.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>1.5.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>1.5.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.10</artifactId>
    <version>1.1.4</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.7.0</version>
</dependency>


Best regards
Eyal Peer / Data Platform Developer
[cid:image001.png@01D55B55.21C20990]

From: miki haiat mailto:miko5...@gmail.com>>
Sent: Thursday, August 22, 2019 11:03 AM
To: Eyal Pe'er mailto:eyal.p...@startapp.com>>
Cc: user@flink.apache.org<mailto:user@flink.apache.org>
Subject: Re: timeout error while connecting to Kafka

Can you double check that the kafka instance is up ?
The code looks fine.


Best,

Miki

On Thu, Aug 22, 2019 at 10:45 AM Eyal Pe'er 
mailto:eyal.p...@startapp.com>> wrote:
Hi,
I'm trying to consume events using Apache Flink.
The code is very basic, trying to connect the topic split words by space and 
print it to the console. Kafka version is 0.9.
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.util.Collector;
import java.util.Properties;

public class KafkaStreaming {

public static void main(String[] args) throws Exception {
final StreamExecutionEnvironment env = 
StreamExecutionEnvironment.getExecutionEnvironment();

Properties props = new Properties();
props.setProperty("bootstrap.servers", "kafka servers:9092...");
props.setProperty("group.id<http://group.id>", "flinkPOC");
FlinkKafkaConsumer09 consumer = new FlinkKafkaConsumer09<>("topic", 
new SimpleStringSchema(), props);

DataStream dataStream = env.addSource(consumer);

DataStream wordDataStream = dataStream.flatMap(new Splitter());
wordDataStream.print();
env.execute("Word Split");

}

public static class Splitter implements FlatMapFunction {

public void flatMap(String sentence, Collector out) throws 
Exception {

for (String word : sentence.split(" ")) {
out.collect(word);
}
}

}
}

The app does not print anything to the screen (although I produced events to 
Kafka).
I tried to skip the Splitter FlatMap function, but still nothing happens. SSL 
or any kind of authentication is not required from Kafka.
This is the error that I found in the logs:
2019-08-20 14:36:17,654 INFO  
org.apache.flink.runtime.executiongraph.ExecutionGraph- Source: Custom 
Source -> Flat Map -> Si

RE: timeout error while connecting to Kafka

2019-08-25 Thread Eyal Pe'er
Hi,
I removed that dependency, but it still fails.
The reason why I used Flink 1.5.0 is that I followed a training course which used it (https://www.baeldung.com/kafka-flink-data-pipeline).
If needed, I can change it.

I'm not sure, but maybe in order to consume events from Kafka 0.9 I need to connect to ZooKeeper instead of the bootstrap servers?
I know that in Spark Streaming we consume via ZooKeeper ("zookeeper.connect").
I saw that in the Apache Flink Kafka connector, zookeeper.connect is only required for Kafka 0.8, but maybe I still need to use it?
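For what it's worth, the Flink documentation the message cites says the 0.9+ consumers only read the broker list, so a minimal configuration sketch (broker host names here are placeholders, not the real cluster) would omit zookeeper.connect entirely:

```java
import java.util.Properties;

// Minimal sketch of the properties FlinkKafkaConsumer09 needs.
// zookeeper.connect is read only by the legacy 0.8 consumer
// (FlinkKafkaConsumer08); it is not required for 0.9.
public class ConsumerProps {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker1:9092,broker2:9092"); // placeholder brokers
        props.setProperty("group.id", "flinkPOC");
        // No zookeeper.connect entry here.
        props.forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```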
Best regards
Eyal Peer / Data Platform Developer

From: miki haiat 
Sent: Thursday, August 22, 2019 2:29 PM
To: Eyal Pe'er 
Cc: user@flink.apache.org
Subject: Re: timeout error while connecting to Kafka

Can you try to remove this from your pom file .
 
org.apache.flink
flink-connector-kafka_2.11
1.7.0



Is their any reason why you are using flink 1.5 and not latest release.


Best,

Miki

On Thu, Aug 22, 2019 at 2:19 PM Eyal Pe'er 
mailto:eyal.p...@startapp.com>> wrote:
Hi Miki,
First, I would like to thank you for the fast response.
I recheck Kafka and it is up and running fine.
I’m still getting the same error (Timeout expired while fetching topic 
metadata).
Maybe my Flink version is wrong (Kafka version is 0.9)?


org.apache.flink
flink-core
1.5.0


org.apache.flink
flink-connector-kafka-0.11_2.11
1.5.0


org.apache.flink
flink-streaming-java_2.11
1.5.0


org.apache.flink
flink-java
1.5.0


org.apache.flink
flink-clients_2.10
1.1.4


org.apache.flink
flink-connector-kafka_2.11
1.7.0



Best regards
Eyal Peer / Data Platform Developer
[cid:image001.png@01D55B4F.C03EE8F0]

From: miki haiat mailto:miko5...@gmail.com>>
Sent: Thursday, August 22, 2019 11:03 AM
To: Eyal Pe'er mailto:eyal.p...@startapp.com>>
Cc: user@flink.apache.org<mailto:user@flink.apache.org>
Subject: Re: timeout error while connecting to Kafka

Can you double check that the kafka instance is up ?
The code looks fine.


Best,

Miki

On Thu, Aug 22, 2019 at 10:45 AM Eyal Pe'er 
mailto:eyal.p...@startapp.com>> wrote:
Hi,
I'm trying to consume events using Apache Flink.
The code is very basic, trying to connect the topic split words by space and 
print it to the console. Kafka version is 0.9.
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.util.Collector;
import java.util.Properties;

public class KafkaStreaming {

public static void main(String[] args) throws Exception {
final StreamExecutionEnvironment env = 
StreamExecutionEnvironment.getExecutionEnvironment();

Properties props = new Properties();
props.setProperty("bootstrap.servers", "kafka servers:9092...");
props.setProperty("group.id<http://group.id>", "flinkPOC");
FlinkKafkaConsumer09 consumer = new FlinkKafkaConsumer09<>("topic", 
new SimpleStringSchema(), props);

DataStream dataStream = env.addSource(consumer);

DataStream wordDataStream = dataStream.flatMap(new Splitter());
wordDataStream.print();
env.execute("Word Split");

}

public static class Splitter implements FlatMapFunction {

public void flatMap(String sentence, Collector out) throws 
Exception {

for (String word : sentence.split(" ")) {
out.collect(word);
}
}

}
}

The app does not print anything to the screen (although I produced events to 
Kafka).
I tried to skip the Splitter FlatMap function, but still nothing happens. SSL 
or any kind of authentication is not required from Kafka.
This is the error that I found in the logs:
2019-08-20 14:36:17,654 INFO  
org.apache.flink.runtime.executiongraph.ExecutionGraph- Source: Custom 
Source -> Flat Map -> Sink: Print to Std. Out (1/1) 
(02258a2cafab83afbc0f5650c088da2b) switched from RUNNING to FAILED.
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching 
topic metadata

The Kafka’s topic has only one partition, so the topic metadata supposed to be 
very basic.
I ran Kafka and the Flink locally in order to eliminate network related issues, 
but the issue persists. So my assumption is that I’m doing something wrong…
Did you encounter such issue? Does someone have different code for consuming 
Kafka events ?

Best regards
Eyal Peer / Data Platform Developer
[cid:image001.png@01D55B4F.C03EE8F0]



Re: timeout error while connecting to Kafka

2019-08-22 Thread miki haiat
Can you try to remove this from your pom file:

 

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.7.0</version>
</dependency>





Is there any reason why you are using Flink 1.5 and not the latest release?



Best,


Miki



RE: timeout error while connecting to Kafka

2019-08-22 Thread Eyal Pe'er
Hi Miki,
First, I would like to thank you for the fast response.
I rechecked Kafka and it is up and running fine.
I'm still getting the same error (Timeout expired while fetching topic metadata).
Maybe my Flink version is wrong (the Kafka version is 0.9)?


<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-core</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.11</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.10</artifactId>
        <version>1.1.4</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_2.11</artifactId>
        <version>1.7.0</version>
    </dependency>
</dependencies>


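For comparison, a version-aligned dependency set (a sketch only: every Flink artifact pinned to 1.5.0 and the connector matched to the 0.9 broker, assuming the `flink-connector-kafka-0.9_2.11` artifact that Flink publishes for that line) would look like:

```xml
<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.11</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.9_2.11</artifactId>
        <version>1.5.0</version>
    </dependency>
</dependencies>
```

Mixing 1.1.4, 1.5.0, and 1.7.0 artifacts, and different Scala suffixes (_2.10 vs _2.11), on one classpath is itself a common source of runtime failures.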
Best regards
Eyal Peer / Data Platform Developer




Re: timeout error while connecting to Kafka

2019-08-22 Thread Qi Kang
The code itself is fine. Turning the app’s log level to DEBUG will give you 
more information.

BTW, please make sure that the addresses of Kafka brokers are properly resolved.
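That resolution check can be done without Flink or Kafka at all; a small sketch (host and port here are placeholders for the real bootstrap servers):

```java
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.Socket;

// Checks that a broker hostname resolves and that its port accepts a TCP
// connection. Both are preconditions for fetching topic metadata.
public class BrokerCheck {
    public static void main(String[] args) throws Exception {
        String host = args.length > 0 ? args[0] : "localhost"; // placeholder broker host
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 9092;
        System.out.println("resolved " + host + " -> "
                + InetAddress.getByName(host).getHostAddress());
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 3000); // 3s timeout
            System.out.println("TCP connect to " + host + ":" + port + " OK");
        } catch (Exception e) {
            System.out.println("TCP connect to " + host + ":" + port
                    + " failed: " + e.getMessage());
        }
    }
}
```

If resolution fails here, the Flink job's metadata fetch will time out in exactly the way the log shows.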





Re: timeout error while connecting to Kafka

2019-08-22 Thread miki haiat
Can you double check that the Kafka instance is up?
The code looks fine.


Best,

Miki



timeout error while connecting to Kafka

2019-08-22 Thread Eyal Pe'er
Hi,
I'm trying to consume events using Apache Flink.
The code is very basic: connect to the topic, split words by space, and print them to the console. The Kafka version is 0.9.
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.util.Collector;
import java.util.Properties;

public class KafkaStreaming {

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "kafka servers:9092...");
        props.setProperty("group.id", "flinkPOC");
        FlinkKafkaConsumer09<String> consumer =
                new FlinkKafkaConsumer09<>("topic", new SimpleStringSchema(), props);

        DataStream<String> dataStream = env.addSource(consumer);

        DataStream<String> wordDataStream = dataStream.flatMap(new Splitter());
        wordDataStream.print();
        env.execute("Word Split");
    }

    public static class Splitter implements FlatMapFunction<String, String> {

        public void flatMap(String sentence, Collector<String> out) throws Exception {
            for (String word : sentence.split(" ")) {
                out.collect(word);
            }
        }
    }
}
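As a sanity check, the Splitter's flatMap logic can be exercised on its own, without Flink (a standalone sketch with a sample sentence):

```java
import java.util.ArrayList;
import java.util.List;

// Same split-and-collect logic as the Splitter above, run on one sentence.
public class SplitterDemo {
    public static void main(String[] args) {
        List<String> out = new ArrayList<>();
        for (String word : "hello flink kafka".split(" ")) {
            out.add(word);
        }
        System.out.println(out); // prints [hello, flink, kafka]
    }
}
```

Since this part is trivially correct, the silence of the job points at the source (the Kafka connection), not the transformation.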

The app does not print anything to the screen (although I produced events to Kafka).
I tried to skip the Splitter FlatMap function, but still nothing happens. SSL or any other kind of authentication is not required by Kafka.
This is the error that I found in the logs:
2019-08-20 14:36:17,654 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source -> Flat Map -> Sink: Print to Std. Out (1/1) (02258a2cafab83afbc0f5650c088da2b) switched from RUNNING to FAILED.
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata

The Kafka topic has only one partition, so the topic metadata is supposed to be very basic.
I ran Kafka and Flink locally in order to eliminate network-related issues, but the issue persists. So my assumption is that I'm doing something wrong...
Did you encounter such an issue? Does someone have different code for consuming Kafka events?

Best regards
Eyal Peer / Data Platform Developer