If anybody is interested, I've implemented a StreamingFileSink with dynamic
paths:
https://github.com/sidfeiner/DynamicPathFileSink
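For context, "dynamic paths" with Flink's StreamingFileSink is typically achieved with a custom BucketAssigner whose getBucketId(element, context) derives the output subdirectory from each record. A minimal sketch of that derivation logic, with hypothetical field names not taken from the linked repo:

```java
// Sketch of the path-derivation logic a custom BucketAssigner's
// getBucketId(element, context) might return. The event-type and
// date fields are hypothetical placeholders.
public class DynamicBucketPath {
    // Derive a bucket (subdirectory) per event type, partitioned by date.
    public static String bucketId(String eventType, String date) {
        return eventType + "/dt=" + date;
    }

    public static void main(String[] args) {
        System.out.println(bucketId("clicks", "2020-11-03")); // clicks/dt=2020-11-03
    }
}
```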
Sidney Feiner / Data Platform Developer
M: +972.528197720 / Skype: sidney.feiner.startapp
From: Rafi Aroch
be much appreciated :)
log4j2 twice. Once without using the java dynamic
options and the second time saying it required setting the java dynamic
version, so I'm a bit confused here.
I appreciate the support btw
Hi,
We're using Apache Flink 1.9.2 and we've started logging everything as JSON
with log4j (the standard log4j1 that comes with Flink). When I say JSON logging, I
just mean that I've formatted it according to:
log4j.appender.console.layout.ConversionPattern={"level": "%p", "ts":
"%d{ISO8601}",
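The pattern above is cut off in the archive; a full pattern along those lines might look like the following (the continuation keys here are illustrative, not the original line):

```properties
# Hypothetical completion of a JSON-style log4j1 PatternLayout;
# the remaining keys after "ts" are illustrative.
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern={"level": "%p", "ts": "%d{ISO8601}", "logger": "%c", "message": "%m"}%n
```

One caveat with this approach: PatternLayout does not JSON-escape %m, so a log message containing quotes or newlines will produce invalid JSON.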
these weird and unstable % numbers in the expected increase,
even though I'm not using a KeyedWindow anymore?
From: Arvid Heise
Sent: Tuesday, November 3,
in performance appear only for higher
parallelism?
From: Arvid Heise
Sent: Tuesday, November 3, 2020 12:09 PM
To: Yangze Guo
Cc: Sidney Feiner ; user
Hey, I just ran a simple consumer that does nothing but consume every event
(without aggregating), and every slot handles above 3K events per second; with
parallelism set to 15, it successfully handles 45K events per second.
cause this dramatic decrease in performance?
Extra info:
* Flink version 1.9.2
* Flink High Availability mode
* 3 task managers, 66 slots total
Execution plan:
[execution plan image]
Any help would be much appreciated
Thanks!
What am I supposed to put in the apply/process function for the sink to be
invoked on a List of items?
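The usual answer is a window apply/process function that gathers the window's elements into a List and emits that List as a single record, which the sink then receives once per window. The core of that collection step, sketched as plain Java with the Flink boilerplate omitted:

```java
import java.util.ArrayList;
import java.util.List;

public class WindowBatcher {
    // What the body of a WindowFunction.apply(...) would do: drain the
    // window's Iterable into one List and emit it as a single element,
    // so the downstream sink is invoked once per window.
    public static <T> List<T> toBatch(Iterable<T> windowElements) {
        List<T> batch = new ArrayList<>();
        for (T element : windowElements) {
            batch.add(element);
        }
        return batch;
    }
}
```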
From: tison
Sent: Sunday, March 22, 2020
in the pipeline?
Because if I have multiple sinks and only one of them needs a
Window, the second solution might be problematic.
Thanks :)
t.id" on my consumer to a random UUID, making sure
I don't have any duplicates but that didn't help either.
Any idea what could be causing this?
Thanks
Ok, I configured the PrometheusReporter's ports to be a range and now every
TaskManager has its own port where I can see its metrics. Thank you very much!
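For reference, the port range is set in flink-conf.yaml; with Flink 1.9 the relevant keys look roughly like this (the range values are an example):

```yaml
# PrometheusReporter with a port range: each TaskManager on a host
# binds the first free port in the range, so JobManager and
# TaskManagers don't collide.
metrics.reporter.prom.class: org.apache.flink.metrics.prometheus.PrometheusReporter
metrics.reporter.prom.port: 9250-9260
```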
are configured but not being
reported in high-availability mode, yet are reported when I run the job locally
through IntelliJ?
Thanks
I've posted this on StackOverflow as well -
here<https://stackoverflow.com/questions/59376693/different-jobname-per-job-when-reporting-flink-metrics-to-pushgateway>
:)
to filter out and then simply not
handle them?
Which means I must filter them in the task itself and I have no way of
filtering them directly from the data source?
Hey,
I have a question about using metrics based on filtered data.
Basically, I have handlers for many types of events I get from my data source
(in my case, Kafka), and every handler has its own filter function.
That given handler also has a Counter, incrementing every time it filters out
an
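The filter-plus-counter pattern described here is commonly built as a RichFilterFunction that increments a Counter whenever it rejects an element (in Flink the counter would be registered in open() via getRuntimeContext().getMetricGroup().counter(...)). The counting logic itself, sketched without the Flink metric classes:

```java
import java.util.function.Predicate;

// Sketch of a filter that counts rejected elements; the long field
// stands in for the Flink Counter a RichFilterFunction would register.
public class CountingFilter<T> {
    private final Predicate<T> keep;
    private long filteredOut = 0;

    public CountingFilter(Predicate<T> keep) {
        this.keep = keep;
    }

    // Mirrors RichFilterFunction.filter: pass the element through,
    // incrementing the counter when it is filtered out.
    public boolean filter(T value) {
        boolean pass = keep.test(value);
        if (!pass) {
            filteredOut++;
        }
        return pass;
    }

    public long getFilteredOut() {
        return filteredOut;
    }
}
```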