Thank you for your quick response. That worked and compiled, but another error
came up at runtime:
java.lang.ClassCastException: MyEventType cannot be cast to
scala.collection.IterableLike
The error is at the line
val startEvent = pattern.get("first").get.head
of my function.
Hello, I have created a simple pattern with FlinkCEP 1.3, as well as a simple
pattern select function. My function is as follows:
def myFunction(pattern: Map[String, Iterable[MyEventType]]): MyEventType = {
  val startEvent = pattern.get("first").get.head
  val endEvent =
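The ClassCastException above comes from the FlinkCEP 1.3 API change: the select function now receives a map whose values are *collections* of matched events per pattern name, not single events, so each value must be unwrapped rather than cast. A minimal self-contained sketch of that unwrapping (no Flink dependency; MyEvent and the pattern name "first" are stand-ins for the user's types):

```java
import java.util.*;

public class SelectSketch {
    // Stand-in for the user's MyEventType.
    static class MyEvent {
        final String name;
        MyEvent(String name) { this.name = name; }
    }

    // Mirrors a FlinkCEP 1.3-style select function: the match map holds a
    // collection of events per pattern name, so the value is unwrapped with
    // get(0) (head in Scala) instead of being cast to a single event.
    static MyEvent selectFirst(Map<String, List<MyEvent>> match) {
        return match.get("first").get(0);
    }

    public static void main(String[] args) {
        Map<String, List<MyEvent>> match = new HashMap<>();
        match.put("first", Collections.singletonList(new MyEvent("Hello")));
        System.out.println(selectFirst(match).name); // prints "Hello"
    }
}
```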
Hello everyone,
I am testing some patterns with FlinkCEP and I want to measure latency and
throughput when using one or more processing cores. How can I do that?
What I have done so far:
Latency: each time an event arrives I store the system time
(System.currentTimeMillis). When Flink calls the
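The measurement idea described here (stamp each event with System.currentTimeMillis on arrival, then subtract at emission) can be sketched without any Flink dependency; the TimedEvent class and field names below are illustrative, not Flink API:

```java
public class LatencySketch {
    // Event carrying its ingestion timestamp, as described above.
    static class TimedEvent {
        final String payload;
        final long ingestMillis;
        TimedEvent(String payload, long ingestMillis) {
            this.payload = payload;
            this.ingestMillis = ingestMillis;
        }
    }

    // At emission time, latency is "now" minus the stored ingestion time.
    static long latencyMillis(TimedEvent e, long nowMillis) {
        return nowMillis - e.ingestMillis;
    }

    public static void main(String[] args) throws InterruptedException {
        TimedEvent e = new TimedEvent("Hello|5", System.currentTimeMillis());
        Thread.sleep(20); // stand-in for time spent inside the pipeline
        long latency = latencyMillis(e, System.currentTimeMillis());
        System.out.println(latency >= 10); // prints "true"
        // Throughput over a run: processedCount / (elapsedMillis / 1000.0)
    }
}
```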
I have prepared a small dummy dataset (data.txt) as follows:
Hello|5
Hi|15
WordsWithoutMeaning|25
AnotherWord|34
HelloWorld|46
HelloPlanet|67
HelloFlinkUsers|89
HelloProgrammers|98
DummyPhrase|105
AnotherDummy|123
And below is the code:
import org.apache.flink.api.java.io.TextInputFormat
import
operators remain idle).
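The data.txt lines above are "word|number" pairs; since the reading code is cut off in the archive, here is a minimal stand-alone sketch of parsing one such line by splitting on the pipe character (the WordEvent class and field names are assumptions, not from the original code):

```java
public class ParseSketch {
    // Holds one parsed "word|number" record from data.txt.
    static class WordEvent {
        final String word;
        final int value;
        WordEvent(String word, int value) {
            this.word = word;
            this.value = value;
        }
    }

    // Split one line of data.txt on "|"; the pipe must be escaped
    // because String.split takes a regular expression.
    static WordEvent parse(String line) {
        String[] parts = line.split("\\|");
        return new WordEvent(parts[0], Integer.parseInt(parts[1]));
    }

    public static void main(String[] args) {
        WordEvent e = parse("Hello|5");
        System.out.println(e.word + " " + e.value); // prints "Hello 5"
    }
}
```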
Thanx,
Sonex
--
View this message in context:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Windows-emit-results-at-the-end-of-the-stream-tp12337p12403.html
Sent from the Apache Flink User Mailing List archive. mailing list archive at
Nabble.com.
Thank you for your response, Yassine.
I forgot to mention that I use the Scala API. In Scala the equivalent code
is:
val inputFormat = new TextInputFormat(new Path("file/to/read.txt"))
env.readFile(inputFormat, "file/to/read.txt",
  FileProcessingMode.PROCESS_CONTINUOUSLY, 1L)
Am I correct?
But
Thanx for your response.
When using time windows, doesn't Flink know the load per window? I have
observed this behavior with windows as well.
I am using a simple streaming job where I use keyBy on the stream to process
events per key. The keys may vary in number (from a few to thousands). I have
noticed a behavior of Flink and I need clarification on it. When we use
keyBy on the stream, Flink assigns keys to parallel operators so each
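For context on this keyBy behavior: Flink routes each key to a subtask by hashing, so with more keys than subtasks several keys share a subtask, and with fewer keys than subtasks some subtasks receive nothing and stay idle. A simplified sketch of that routing (plain hashCode modulo parallelism; Flink's real scheme murmur-hashes keys into key groups, so this is only an illustration):

```java
import java.util.*;

public class KeyBySketch {
    // Simplified stand-in for keyBy routing: key -> subtask index.
    static int subtaskFor(String key, int parallelism) {
        return Math.floorMod(key.hashCode(), parallelism);
    }

    public static void main(String[] args) {
        int parallelism = 4;
        // With only two distinct keys, at most two of the four subtasks
        // can receive data; the remaining subtasks stay idle.
        Set<Integer> used = new HashSet<>();
        for (String key : Arrays.asList("a", "b", "a", "b")) {
            used.add(subtaskFor(key, parallelism));
        }
        System.out.println(used.size() <= 2); // prints "true"
    }
}
```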
I solved it with the state you were talking about.
The solution would look like this (similar to what you wrote):
stream.keyBy(...).timeWindow(...)
    .apply(new WindowFunction() {
        public void apply(K key, W window, Iterable elements, Collector out) {
            out.collect(new Tuple3<>(key,
Hi, and thank you for your response.
Could you give me a simple example? How can I put the variable into
a state and then access that state in the next apply function?
I am new to flink.
Thank you.
Yes, you are correct. A window will be created for each key/group, and then
you can apply a function or aggregate elements per key.
Hi Till,
when you say parallel windows, what do you mean? Do you mean the use of
timeWindowAll, which has all the elements of a window in a single task?