Thanks to all. I tried the Scala Eclipse IDE together with
'change-scala-version.sh', but in vain.
So I switched over to IntelliJ, and things work fine there. I am new to
IntelliJ, so I will try using it.
Once again thanks for helping me out.
Regards
Ram
Thanks for the replies and help. Stephan, the Maven shortcut worked like a
charm :).
- As for the 50ms window duration, when I was running the WindowWordCount
example with a duration of 5ms, I encountered this error stack trace:
Exception in thread "main" java.lang.IllegalArgumentException: Windo
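For context, a windowed word count along these lines reproduces the setup
(a hedged sketch in Scala, not necessarily identical to the shipped example;
host and port are placeholders):

    import java.util.concurrent.TimeUnit
    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.api.windowing.time.Time

    object WindowWordCount {
      def main(args: Array[String]): Unit = {
        val env = StreamExecutionEnvironment.getExecutionEnvironment
        val counts = env.socketTextStream("localhost", 9999)
          .flatMap { _.toLowerCase.split("\\W+").filter(_.nonEmpty) }
          .map { (_, 1) }
          .keyBy(0)
          // 50ms worked here; the 5ms duration mentioned above was the one
          // rejected with the IllegalArgumentException from the stack trace
          .timeWindow(Time.of(50, TimeUnit.MILLISECONDS))
          .sum(1)
        counts.print()
        env.execute("WindowWordCount")
      }
    }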
Greg Hogan created FLINK-3219:
Summary: Implement DataSet.count using a single operator
Key: FLINK-3219
URL: https://issues.apache.org/jira/browse/FLINK-3219
Project: Flink
Issue Type: Improvement
Greg Hogan created FLINK-3218:
Summary: Merging Hadoop configurations overrides user parameters
Key: FLINK-3218
URL: https://issues.apache.org/jira/browse/FLINK-3218
Project: Flink
Issue Type: Bug
We haven't defined the StreamSQL syntax yet (and I think it will take some
time until we are at that point), so we are quite flexible with both features.
Let's keep this opportunity in mind and coordinate before making
decisions about CEP or StreamSQL.
Fabian
2016-01-11 17:29 GMT+01:00 Till
First of all, it's a great design document. Looking forward to having stream
SQL in the foreseeable future :-)
I think it is a good idea to consolidate stream SQL and CEP in the long
run. CEP's additional features compared to SQL boil down to pattern
detection. Once we have this, it should be only a
Till Rohrmann created FLINK-3217:
Summary: Implement NFA for pattern detection
Key: FLINK-3217
URL: https://issues.apache.org/jira/browse/FLINK-3217
Project: Flink
Issue Type: Sub-task
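For reference, a minimal NFA over an event stream could look roughly like the
sketch below (Scala; all names and types are invented for illustration and say
nothing about the eventual implementation):

    // A transition fires when its predicate matches the current event.
    final case class Transition[T](predicate: T => Boolean, target: String)
    final case class State[T](name: String, accepting: Boolean,
                              transitions: Seq[Transition[T]])

    final class NFA[T](states: Map[String, State[T]], initial: String) {
      // Track the set of active states while consuming events one by one.
      def accepts(events: Seq[T]): Boolean = {
        val reached = events.foldLeft(Set(initial)) { (current, event) =>
          current.flatMap { name =>
            states(name).transitions.collect {
              case Transition(pred, target) if pred(event) => target
            }
          }
        }
        reached.exists(states(_).accepting)
      }
    }

This toy version assumes strictly contiguous events; a real pattern matcher
would also need non-contiguous matches, timeouts, and state pruning.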
Thanks for the valuable feedback.
@Stephan, you're totally right that the CEP DSLs and SQLs strongly resemble
each other. It's probably mainly a question of syntax, i.e. how a pattern
definition can be exposed in stream SQL. For that we should take a closer
look at Oracle's Pattern Matching extension on
Till Rohrmann created FLINK-3216:
Summary: Define pattern specification
Key: FLINK-3216
URL: https://issues.apache.org/jira/browse/FLINK-3216
Project: Flink
Issue Type: Sub-task
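To make "pattern specification" concrete, here is a purely hypothetical DSL
sketch (Scala; every name below is invented and predates any actual API
decision):

    // A pattern is an ordered list of named stages, each guarded by a predicate.
    final case class Pattern[T](stages: List[(String, T => Boolean)]) {
      def followedBy(name: String)(pred: T => Boolean): Pattern[T] =
        Pattern(stages :+ (name -> pred))
    }
    object Pattern {
      def begin[T](name: String)(pred: T => Boolean): Pattern[T] =
        Pattern(List(name -> pred))
    }

    // Example: a temperature spike followed by a pressure drop.
    final case class Event(kind: String, value: Double)
    val alert = Pattern.begin[Event]("spike")(e => e.kind == "temp" && e.value > 100)
      .followedBy("drop")(e => e.kind == "pressure" && e.value < 10)

A specification like this could later be compiled into the NFA from FLINK-3217.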
Till Rohrmann created FLINK-3215:
Summary: Add CEP library to Flink
Key: FLINK-3215
URL: https://issues.apache.org/jira/browse/FLINK-3215
Project: Flink
Issue Type: New Feature
Yes, I know that right now that's the way to compile Flink without running
tests, but from my perspective (as a developer) that's a workaround. If I don't
want to compile the test classes, I should be able to skip that step (also
because compiling the tests in Flink takes quite a long time).
Of course it's not a big deal, it's j
Hi Flavio!
You can always do "mvn -DskipTests clean package". That compiles tests, but
does not execute them.
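For reference, the two flags differ: `-DskipTests` compiles the test classes
but skips running them, while `-Dmaven.test.skip=true` skips compiling them as
well:

    mvn clean package -DskipTests            # build tests, do not run them
    mvn clean package -Dmaven.test.skip=true # neither compile nor run tests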
Stephan
On Mon, Jan 11, 2016 at 11:34 AM, Flavio Pompermaier
wrote:
> I hope it's not too late to suggest adding
> https://issues.apache.org/jira/browse/FLINK-1827 to the restructuring as well
Hi Ram,
If you want to build Flink with Scala 2.10, just check out the Flink repository from
GitHub or download the source code from the homepage, run `mvn clean install
-DskipTests`, and import the projects into your IDE. If you want to build Flink with
Scala 2.11, you have to run `tools/change-scala-version.sh 2.11` first.
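Putting the steps together for a Scala 2.11 build (assuming a plain git
checkout; the clone URL is the usual GitHub mirror):

    git clone https://github.com/apache/flink.git
    cd flink
    tools/change-scala-version.sh 2.11
    mvn clean install -DskipTests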
I hope it's not too late to suggest adding
https://issues.apache.org/jira/browse/FLINK-1827 to the restructuring as well,
so that it becomes possible to compile Flink without also compiling the tests
(-Dmaven.test.skip=true) and save a lot of time...
It should be fairly easy to fix that.
Best,
Flavio
On Wed, Jan
Thank you very much for the reply.
I tried different ways, and when I tried setting the Scala version in the root
pom.xml to 2.11 (with the Scala version property set to 2.11.6 and the binary
version set to 2.11), I got the following error:
[INFO]
[ERROR] Failed to execute goal on
Thanks for the feedback!
We will start the SQL effort with putting the existing (batch) Table API on
top of Apache Calcite.
From there we continue to add streaming support for the Table API before we
put a StreamSQL interface on top.
Consolidating the efforts with the CEP library sounds like a good idea.