Re: InvalidTypesException - Input mismatch: Basic type 'Integer' expected but was 'Long'

2016-01-18 Thread Till Rohrmann
Hi Biplob, which version of Flink are you using? With version 1.0-SNAPSHOT, I cannot reproduce your problem. Cheers, Till On Sun, Jan 17, 2016 at 4:56 PM, Biplob Biswas wrote: > Hi, > > I am getting the following exception when I am using the map function > > Exception in thread "main" >> or

Compile fails with scala 2.11.4

2016-01-18 Thread Ritesh Kumar Singh
[ERROR] /home/flink/flink-runtime/src/test/scala/org/apache/flink/runtime/jobmanager/JobManagerITCase.scala:703: *error: can't expand macros compiled by previous versions of Scala* [ERROR] assert(cachedGraph2.isArchived) [ERROR] ^ [ERROR] one error fo

Re: Compile fails with scala 2.11.4

2016-01-18 Thread Chiwan Park
Hi Ritesh, This problem seems to have been reported already [1]. The Flink community is investigating this issue. If you don’t need Scala 2.11, I suggest using Scala 2.10 until the issue is solved. [1]: http://mail-archives.apache.org/mod_mbox/flink-user/201601.mbox/%3CCAB6CeiZ_2snN-piXzd3gHnyQePu_PA0Ro7qXUF8

Re: Security in Flink

2016-01-18 Thread Maximilian Michels
Hi Welly, There is no fixed timeline yet but we plan to make progress in terms of authentication and encryption after the 1.0.0 release. Cheers, Max On Wed, Jan 13, 2016 at 8:34 AM, Welly Tambunan wrote: > Hi Stephan, > > Thanks a lot for the explanation. > > Is there any timeline on when this

Re: Compile fails with scala 2.11.4

2016-01-18 Thread Robert Metzger
How did you start the Flink compilation for Scala 2.11? On Mon, Jan 18, 2016 at 11:41 AM, Chiwan Park wrote: > Hi Ritesh, > > This problem seems already reported [1]. Flink community is investigating > this issue. I think that if you don’t need Scala 2.11, use Scala 2.10 until > the issue is solved

Re: Flink v0.10.2

2016-01-18 Thread Maximilian Michels
Hi Welly, hi Nick, I would be in for a Flink 0.10.2 release. The number of fixes is reasonable and we could get it out quickly. Cheers, Max On Thu, Jan 14, 2016 at 11:17 PM, Nick Dimiduk wrote: > I would also find a 0.10.2 release useful :) > > On Wed, Jan 13, 2016 at 1:24 AM, Welly Tambunan

Re: Redeployements and state

2016-01-18 Thread Maximilian Michels
The documentation layout changed in master. The new URL: https://ci.apache.org/projects/flink/flink-docs-master/apis/streaming/savepoints.html On Thu, Jan 14, 2016 at 2:21 PM, Niels Basjes wrote: > Yes, that is exactly the type of solution I was looking for. > > I'll dive into this. > Thanks

Re: Accessing configuration in RichFunction

2016-01-18 Thread Maximilian Michels
Hi Christian, For your implementation, would it suffice to pass a Configuration with your RichFilterFunction? You said the global job parameters are not passed on to your user function? Can you confirm this is a bug? Cheers, Max On Wed, Jan 13, 2016 at 10:59 AM, Christian Kreutzfeldt wrote: > H

Re: Accessing configuration in RichFunction

2016-01-18 Thread Christian Kreutzfeldt
Hi Max, maybe I explained it a bit ambiguously ;-) I have a stream-based application which contains a RichFilterFunction implementation. The parent class provides a lifecycle method, open(Configuration), which receives a Configuration object as input. I would like to use this call to pass options i

Flink plus Elastic Search plus Kibana

2016-01-18 Thread HungChang
Hi, Recently I read this post about Flink + Elasticsearch + Kibana. https://www.elastic.co/blog/building-real-time-dashboard-applications-with-apache-flink-elasticsearch-and-kibana Can I ask why Elasticsearch version 1.7.3 was selected? What would be the potential issues with the newer versions?

Re: Accessing configuration in RichFunction

2016-01-18 Thread Robert Metzger
Hi Christian, I think the DataStream API does not allow you to pass any parameter to the open(Configuration) method. That method is only used in the DataSet (Batch) API, and its use is discouraged. A much better option to pass a Configuration into your function is as follows: Configuration mapC
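Robert's suggested snippet is cut off above. As a rough sketch of the constructor-based pattern he describes (plain Java here so it runs without Flink on the classpath; the class and parameter names are made up for illustration, and a real Flink function would extend RichFilterFunction):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the constructor-parameter pattern (hypothetical names). The
// parameter becomes a field of the function object and travels with it via
// Java serialization, so no open(Configuration) call is needed.
public class ThresholdFilter implements java.io.Serializable {
    private final int threshold;

    public ThresholdFilter(int threshold) {
        this.threshold = threshold;
    }

    public boolean filter(int value) {
        return value > threshold;
    }

    public static void main(String[] args) {
        ThresholdFilter f = new ThresholdFilter(10);
        // In a real job this object would be passed to stream.filter(f).
        List<Integer> kept = Arrays.asList(5, 15, 25).stream()
                .filter(f::filter)
                .collect(Collectors.toList());
        System.out.println(kept);
    }
}
```

Since the function object is serialized and shipped to the workers, any constructor argument stored in a serializable field arrives there automatically.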

Re: Flink plus Elastic Search plus Kibana

2016-01-18 Thread Maximilian Michels
Hi Sendoh, At the time the article was created, Elasticsearch 2.0 was only in the making and by the time of publishing it had just been released. That's why we used version 1.7.3. There is currently no 2.X version of the Flink adapter but that will change very soon. There is an issue and a pending

Re: Flink plus Elastic Search plus Kibana

2016-01-18 Thread HungChang
Found the answer here http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Flink-Elasticsearch-connector-support-for-elasticsearch-2-0-td3910.html -- View this message in context: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Flink-plus-Elastic-Search-plus-

Re: InvalidTypesException - Input mismatch: Basic type 'Integer' expected but was 'Long'

2016-01-18 Thread Biplob Biswas
Hi Till, I am using Flink 0.10.1 and, if I am not wrong, it corresponds to the 1.0-SNAPSHOT you mentioned. [image: Inline image 1] If I am wrong, please suggest what I should do to fix it. Thanks & Regards Biplob Biswas On Mon, Jan 18, 2016 at 11:23 AM, Till Rohrmann wrote: > Hi Biplob, > > which v

Re: InvalidTypesException - Input mismatch: Basic type 'Integer' expected but was 'Long'

2016-01-18 Thread Till Rohrmann
Hi Biplob, no, versions 0.10.1 and 1.0-SNAPSHOT are different. Could you bump your Flink version to the latter and check whether you can still reproduce your problem? Cheers, Till On Mon, Jan 18, 2016 at 2:24 PM, Biplob Biswas wrote: > Hi Till, > > I am using Flink 0.10.1 and if I am not wrong it cor

Re: Accessing configuration in RichFunction

2016-01-18 Thread Christian Kreutzfeldt
Hi Robert, using the constructor is actually the chosen approach. Using the existing lifecycle method was an idea to integrate it more with the existing framework design ;-) Best Christian 2016-01-18 13:38 GMT+01:00 Robert Metzger : > Hi Christian, > > I think the DataStream API does not allow y

Unexpected behavior with Scala App trait.

2016-01-18 Thread Andrea Sella
Hi, I was implementing the TF-IDF example from flink-training when I hit an NPE during the deployment of my job. Source code: https://github.com/alkagin/flink-tfidf-example I used version 0.10.1 and started in local mode. During the deployment of the TFIDFNPE job, which extends App, Flink throws

Re: Compile fails with scala 2.11.4

2016-01-18 Thread Ritesh Kumar Singh
Thanks for the replies. @Chiwan, I am switching back to Scala 2.10.4 for the time being. I was using Scala 2.11.4 as this is the version I've compiled Spark with. But anyway, I can wait for the bug to be resolved. @Robert, the commands were as follows: $ tools/change-scala-version.sh 2.11 $ mvn cl

Flink Stream: collect in an array all records within a window

2016-01-18 Thread Saiph Kappa
Hi, After performing a windowAll() on a DataStream[String], is there any method to collect and return an array with all the Strings within a window (similar to .collect in Spark)? I basically want to ship all strings in a window to a remote server through a socket, and want to use the same socket con

Cancel Job

2016-01-18 Thread Don Frascuchon
Hi, When a streaming job is manually canceled, what happens to the messages still in process? Does Flink's engine wait for tasks to finish processing in-flight messages (like Apache Storm does)? If not, is there a safe way to stop streaming jobs? Thanks in advance! Best regards

Re: Flink Stream: collect in an array all records within a window

2016-01-18 Thread Matthias J. Sax
Hi Saiph, you can use an AllWindowFunction via .apply(...) to get a collect-like method: From: https://ci.apache.org/projects/flink/flink-docs-release-0.10/apis/streaming_guide.html > // applying an AllWindowFunction on non-keyed window stream > allWindowedStream.apply (new AllWindowFunction, > Integ
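The quoted documentation snippet is truncated above. As a hedged sketch of what the body of such an AllWindowFunction typically does (plain Java with no Flink dependency; the Iterable stands in for the window contents Flink hands to apply()):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: the core of an AllWindowFunction body -- drain the Iterable of
// window elements into a local collection, then emit or ship it as one unit
// (e.g. over a socket, as asked in the thread).
public class WindowCollectDemo {
    static List<String> collectWindow(Iterable<String> windowElements) {
        List<String> out = new ArrayList<>();
        for (String s : windowElements) {
            out.add(s);
        }
        return out;
    }

    public static void main(String[] args) {
        // Stands in for one window's records.
        Iterable<String> window = Arrays.asList("a", "b", "c");
        System.out.println(String.join(",", collectWindow(window)));
    }
}
```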

Re: Cancel Job

2016-01-18 Thread Matthias J. Sax
Hi, currently, messages in flight will be dropped if a streaming job gets canceled. There is already WIP to add a STOP signal which allows for a clean shutdown of a streaming job. This should get merged soon and will be available in Flink 1.0. You can follow the JIRA and PR here: https://issues.a

Re: Cancel Job

2016-01-18 Thread Don Frascuchon
Thanks, Matthias! On Mon, Jan 18, 2016 at 20:51, Matthias J. Sax () wrote: > Hi, > > currently, messages in flight will be dropped if a streaming job gets > canceled. > > There is already WIP to add a STOP signal which allows for a clean > shutdown of a streaming job. This should get merge

Re: Flink Stream: collect in an array all records within a window

2016-01-18 Thread Saiph Kappa
Hi Matthias, Thanks for your response. The method .writeToSocket seems to be what I was looking for. Can you tell me what kind of serialization schema I should use, assuming my socket server receives strings? I have something like this in Scala: val server = new ServerSocket(); while (true) {
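The thread does not show an answer to the schema question here. As a sketch of the kind of string-to-bytes conversion such a socket sink needs (plain Java with the JDK only; Flink 0.10 ships a SimpleStringSchema for this purpose, but this stand-in class and the trailing-newline delimiter are our own assumptions for a line-oriented server):

```java
import java.nio.charset.StandardCharsets;

// Hypothetical stand-in for a string serialization schema. The newline suffix
// is an assumption so a server reading line by line (e.g. via readLine) can
// split records; Flink's actual SimpleStringSchema may behave differently.
public class NewlineStringSchema {
    public byte[] serialize(String element) {
        return (element + "\n").getBytes(StandardCharsets.UTF_8);
    }

    public String deserialize(byte[] bytes) {
        String s = new String(bytes, StandardCharsets.UTF_8);
        return s.endsWith("\n") ? s.substring(0, s.length() - 1) : s;
    }

    public static void main(String[] args) {
        NewlineStringSchema schema = new NewlineStringSchema();
        // Round-trip demonstration of the byte conversion.
        System.out.println(schema.deserialize(schema.serialize("hello")));
    }
}
```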

Re: Unexpected behavior with Scala App trait.

2016-01-18 Thread Chiwan Park
Hi Andrea, I’m not an expert in Scala, but it seems to be a closure-cleaning problem. The Scala App trait extends the DelayedInit trait to initialize the object, but Flink's serialization stack doesn’t handle this special initialization. (This is just my opinion, not verified.) To run TFIDFNPE safely, you need to j

Results of testing Flink quickstart against 0.10-SNAPSHOT and 1.0-SNAPSHOT (re. Dependency on non-existent org.scalamacros:quasiquotes_2.11:)

2016-01-18 Thread Prez Cannady
Sent this to d...@flink.apache.org, but that might not be the appropriate forum for it. Finally got a chance to sit down and run through a few tests. Up front, I have been able to resolve my issue sufficiently to move forward, but it seems there’s an issue with the cu

Re: Results of testing Flink quickstart against 0.10-SNAPSHOT and 1.0-SNAPSHOT (re. Dependency on non-existent org.scalamacros:quasiquotes_2.11:)

2016-01-18 Thread Prez Cannady
One correction: line 3 of the 1.0-SNAPSHOT source build should read “checked out master branch (snapshot version 1.0-SNAPSHOT).” Prez Cannady p: 617 500 3378 e: revp...@opencorrelate.org GH: https://github.com/opencorrelate

Re: Compile fails with scala 2.11.4

2016-01-18 Thread Prez Cannady
Assuming you haven’t already migrated back to 2.10, you might try this: $ git checkout release-0.10 $ tools/change-scala-version.sh 2.11 $ mvn clean install -DskipTests=true -Dmaven.javadoc.skip=true -Dscala.version=2.11.4 -Dscala.binary.version=2.11 Then try building your project. Building under