Hi,
We're trying to implement a module to help autoscale our pipeline, which
is built with Flink on YARN. According to the documentation, the suggested
procedure seems to be:
1. cancel the job with a savepoint
2. start a new job with an increased number of YARN TaskManagers and higher parallelism.
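For concreteness, the two steps above might look like the following with the `flink` CLI (the savepoint path, job ID, container count, and parallelism are placeholders, and exact flags vary by Flink version):

```shell
# Step 1: trigger a savepoint and cancel the running job
flink cancel -s hdfs:///flink/savepoints <jobId>

# Step 2: resubmit from the savepoint with more YARN TaskManagers (-yn)
# and higher parallelism (-p)
flink run -m yarn-cluster -yn 8 -p 16 \
    -s hdfs:///flink/savepoints/savepoint-XXXX \
    my-job.jar
```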
However, step 2 always gave errors
Correction: I have the row’s RowTypeInfo at runtime before the job starts. I
don’t have RowTypeInfo at compile time.
On Oct 16, 2017, at 4:15 PM, Joshua Griffith <jgriff...@campuslabs.com> wrote:
Hello,
I have a case class that wraps a Flink Row and I’d like to use fields from that
Row in a delta iteration join condition. I only have the row’s fields after the
job starts. I can construct RowTypeInfo for the Row but I’m not sure how to add
that to Flink’s generated type information for t
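In case it helps frame the question: constructing a RowTypeInfo at runtime and attaching it to an operator with `returns()` might look roughly like this. This is only a sketch against the Flink 1.3 DataSet API; the field names and types are hypothetical placeholders.

```java
import java.util.Arrays;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.types.Row;

public class RuntimeRowTypes {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Field types and names only become known at runtime (hypothetical values here).
        TypeInformation<?>[] fieldTypes = {
            BasicTypeInfo.LONG_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO
        };
        RowTypeInfo rowType = new RowTypeInfo(fieldTypes, new String[] {"id", "name"});

        // Supply the TypeInformation explicitly, since Flink cannot infer Row field types.
        DataSet<Row> rows = env.fromCollection(
            Arrays.asList(Row.of(1L, "a"), Row.of(2L, "b")), rowType);

        // returns() overrides the TypeInformation Flink would otherwise try to extract.
        DataSet<Row> mapped = rows.map(r -> r).returns(rowType);

        mapped.print();
    }
}
```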
Hi all,
I'd like to walk you through my program's flow.
I have the following operators:
kafka-source -> timestamp-extractor -> map -> keyBy -> window -> apply ->
LEARN -> SELECT -> process -> cassandra-sink
The LEARN and SELECT operators belong to an external library supported by
Flink. LEARN is a very he
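As a sketch, that topology in the DataStream API might look like the following. Everything here is a placeholder (topic name, timestamp parsing, key, window size), and LEARN/SELECT stand in for the external library's operators, which are represented only as comments:

```java
// Sketch only: assumes the Flink 1.3 DataStream API.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

env.addSource(new FlinkKafkaConsumer010<>("events", new SimpleStringSchema(), kafkaProps))
   .assignTimestampsAndWatermarks(new AscendingTimestampExtractor<String>() {
       @Override
       public long extractAscendingTimestamp(String element) {
           return parseTimestamp(element); // hypothetical helper
       }
   })
   .map(element -> parse(element))       // hypothetical parser
   .keyBy(record -> record.getKey())     // hypothetical key
   .timeWindow(Time.minutes(1))
   .apply(new MyWindowFunction());       // hypothetical WindowFunction
   // -> LEARN -> SELECT (external library) -> process -> cassandra-sink
```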
Hi all,
I am new to Apache Flink and I followed
http://training.data-artisans.com/devEnvSetup.html
to set up Flink and run the sample code demo.
I am facing an issue while running the WordCount.java code in the Eclipse IDE.
I am getting "java.lang.NoClassDefFoundError:
org/jboss/netty/logging/InternalL
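For what it's worth, classes under `org.jboss.netty.*` come from Netty 3.x, so one thing to check is whether that jar made it onto the Eclipse classpath. If it is genuinely missing, a dependency along these lines would provide it (the version here is only an example):

```xml
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty</artifactId>
    <version>3.10.6.Final</version>
</dependency>
```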
I am attempting to run Flink 1.3.2 in HA mode with ZooKeeper.
When I run start-cluster.sh, the JobManager is not started, even though
the TaskManager is. When I delved into this, I saw that the command:
ssh -n $FLINK_SSH_OPTS $master -- "nohup /bin/bash -l
\"${FLINK_BIN_DIR}/jobm
Hi,
Regarding metrics, please check the recent mailing-list thread "Writing an
Integration test for flink-metrics". You can either use the JMXReporter or
write a custom reporter for this purpose.
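For example, enabling the JMXReporter is just a few lines in flink-conf.yaml (the port range below is an arbitrary choice):

```yaml
metrics.reporters: jmx
metrics.reporter.jmx.class: org.apache.flink.metrics.jmx.JMXReporter
metrics.reporter.jmx.port: 8789-8799
```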
Piotrek
> On 13 Oct 2017, at 20:57, Ken Krugler wrote:
>
> Hi Piotr,
>
> Thanks for responding,