Hi everyone,
I have a question that is not really related to Flink, but maybe one of you
can help me understand what I am doing wrong.
I have a Flink job that processes events generated by a Java application. The
output of the Flink job is emitted to Kafka; the Java application runs a Kaf
I have to produce custom objects into Kafka and read them with Flink. Any
tuning advice for using Kryo, such as class registration or something like
that? Any examples?
Thanks
--
View this message in context:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Kafka-and-Flink
I am trying to integrate Kafka and Flink.
My pom file is as follows, where ${flink.version} is 0.10.2:

  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version>${flink.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>${flink.version}</version>
  </dependency>
  com.fasterxml.
My custom object is used across the whole job, so it will be part of
checkpoints. Can you point me to some references with examples?
Attaching jfr
flight_recording_10228245112.jfr
No, this is only necessary if you want to register a custom serializer itself
[1]. Also, in case you are wondering about registerKryoType() - this is only
needed as a performance optimisation.
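The two registration hooks mentioned here can be sketched as a configuration fragment; `MyEvent` is a hypothetical stand-in for your own custom object, and the commented-out line shows where a custom serializer class would go:

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KryoRegistrationSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Optional performance optimisation: a registered class is encoded
        // as a small integer tag instead of its full class name.
        env.getConfig().registerKryoType(MyEvent.class);

        // Only necessary if you bring your own Kryo serializer for the type:
        // env.getConfig().registerTypeWithKryoSerializer(
        //         MyEvent.class, MyEventSerializer.class);
    }

    // Hypothetical event type standing in for the custom object.
    public static class MyEvent {
        public long id;
        public String payload;
    }
}
```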
What exactly is your problem? What are you trying to solve?
(I can't read JFR files here, and from what
to a tuple just for the de/serialization process?
According to the JFR analysis, Kryo methods are hit a lot.
-Original Message-
From: Nico Kruber [mailto:n...@data-artisans.com]
Sent: 20 de junho de 2017 16:04
To: user@flink.apache.org
Cc: Nuno Rafael Goncalves
Subject: Re: Kafka and Flink integration
I can only repeat what Gordon wrote on Friday: "It's usually always recommended
to register your classes with Kryo [using registerKryoType()], to avoid the
somewhat inefficient classname writing.
Also, depending on the case, to decrease serialization
Can I have a POJO that is a composition of other POJOs?
My custom object has many dependencies, and in order to refactor it I would
also have to change another 5 classes.
Thanks, I'll try to refactor into POJOs.
Greg: Can you clarify the last part? Should it be: the concrete type cannot be
known?
-------- Original message --------
From: Greg Hogan
Date: 6/21/17 3:10 AM (GMT-08:00)
To: nragon
Cc: user@flink.apache.org
Subject: Re: Kafka and Flink integration
The recommendation has been to avoid Kryo where possible.
General data exchange: Avro or Thrift.
Flink internal data exchange: POJO (or Tuple, which are slightly faster
though l
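As a plain-Java sketch of what a Flink-friendly nested POJO looks like (`Customer` and `Address` are hypothetical names; the rule is a public no-argument constructor plus public fields, or getters/setters, at every nesting level):

```java
// Hypothetical nested POJO: Flink's POJO serializer can analyze Customer
// field by field, including the nested Address, because both classes have
// a public no-arg constructor and public fields.
public class Customer {
    public String name;
    public Address address;       // nested POJO, analyzed the same way

    public Customer() {}          // required by the POJO rules

    public static class Address { // nested type, same rules apply
        public String city;
        public int zip;

        public Address() {}
    }

    public static void main(String[] args) {
        Customer c = new Customer();
        c.name = "Alice";
        c.address = new Address();
        c.address.city = "Lisbon";
        c.address.zip = 1000;
        System.out.println(c.name + " lives in " + c.address.city);
        // prints "Alice lives in Lisbon"
    }
}
```

So composition itself is fine; what matters is that every class reachable from the root type follows the same POJO conventions.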
So, serialization between producer ap
Kafka consumer will use Avro, Thrift or Kryo, right? From there, the remaining
pipeline can just use standard POJO serialization, which would be better?
Hi Pankaj,
I suspect you are trying to start Flink on a cluster with Flink 0.10.1
installed?
On Sat, Feb 27, 2016 at 9:20 AM, Pankaj Kumar wrote:
> I am trying to integrate kafka and flink.
> my pom file is where {flink.version} is 0.10.2
Yes Robert,
I was trying to start Flink on a 0.10.1 cluster.
But after changing the Flink version to 0.10.1, I am still getting the same
error.
On Sat, Feb 27, 2016 at 2:47 PM, Robert Metzger wrote:
> Hi Pankaj,
>
> I suspect you are trying to start Flink on a cluster with Flink 0.10.1
> installed?
Hi!
A "NoSuchMethodError" is always a sign of a version mixup. Please make sure
both versions (cluster and client) are exactly the same.
Stephan
On Sat, Feb 27, 2016 at 11:05 AM, Pankaj Kumar wrote:
> Yes Robert ,
> i was trying to start Flink on cluster 0.10.1.
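One way to keep client and cluster on exactly the same version is to define it once as a Maven property and reference it from every Flink dependency; a sketch using the artifact IDs from the pom quoted earlier:

```xml
<properties>
  <flink.version>0.10.1</flink.version>
</properties>

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients</artifactId>
  <version>${flink.version}</version>
</dependency>
```

Bumping the single property then updates all Flink artifacts together, which avoids the mixed-version NoSuchMethodError described below.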
Yes, versioning was the issue. The job is working fine on Flink 0.10.2.
On Mon, Feb 29, 2016 at 3:15 PM, Stephan Ewen wrote:
> Hi!
>
> A "NoSuchMethodError" is always a sign of a version mixup. Please make
> sure both versions (cluster and client) are exactly the same.
>
> Stephan
Good to hear. Thanks for letting us know!
On Mon, Feb 29, 2016 at 8:14 PM, Pankaj Kumar wrote:
> yes versioning was issue . Job is working fine on flink 0.10.2.