RE: Spark and Kafka integration

2017-01-12 Thread Phadnis, Varun
Cool! Thanks for your inputs Jacek and Mark!

From: Mark Hamstra [mailto:m...@clearstorydata.com]
Sent: 13 January 2017 12:59
To: Phadnis, Varun
Cc: user@spark.apache.org
Subject: Re: Spark and Kafka integration

See "API compatibility" in http://spark.apache.org/versioning-policy.html

Re: Spark and Kafka integration

2017-01-12 Thread Mark Hamstra
See "API compatibility" in http://spark.apache.org/versioning-policy.html

While code that is annotated as Experimental is still a good-faith effort to provide a stable and useful API, the fact is that we're not yet confident enough that we've got the public API in exactly the form that we want to

Re: Spark and Kafka integration

2017-01-12 Thread Jacek Laskowski
Hi Phadnis,

I found this in http://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html:

> This version of the integration is marked as experimental, so the API is
> potentially subject to change.

Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apach

Spark and Kafka integration

2017-01-12 Thread Phadnis, Varun
Hello,

We are using Spark 2.0 with Kafka 0.10. As I understand, much of the API packaged in the following dependency we are targeting is marked as "@Experimental":

org.apache.spark : spark-streaming-kafka-0-10_2.11 : 2.0.0

What are the implications of this being marked as experimental? A
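The flattened coordinates in the message correspond to a Maven dependency block along these lines (a sketch reconstructed from the coordinates quoted above, nothing else assumed):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
  <version>2.0.0</version>
</dependency>
```

Note the `_2.11` suffix on the artifactId: it encodes the Scala binary version the artifact was built against.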

Re: Spark and Kafka direct approach problem

2016-05-04 Thread Mich Talebzadeh
This works with Spark 1.6.1, using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_77), Kafka version 0.9.0.1 using scala-library-2.11.7.jar.

Dr Mich Talebzadeh

LinkedIn * https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Re: Spark and Kafka direct approach problem

2016-05-04 Thread Shixiong(Ryan) Zhu
It's because the Scala version of Spark and the Scala version of Kafka don't match. Please check them.

On Wed, May 4, 2016 at 6:17 AM, أنس الليثي wrote:
> NoSuchMethodError usually appears because of a difference in the library
> versions.
>
> Check the version of the libraries you downloaded, t

Re: Spark and Kafka direct approach problem

2016-05-04 Thread أنس الليثي
NoSuchMethodError usually appears because of a difference in the library versions. Check the versions of the libraries you downloaded, the version of Spark, and the version of Kafka.

On 4 May 2016 at 16:18, Luca Ferrari wrote:
> Hi,
>
> I'm new to Apache Spark and I'm trying to run the Spark Streami
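The version check described above usually comes down to keeping one Scala binary suffix across every artifact on the classpath. A build-file sketch, assuming sbt; the version numbers are illustrative, not taken from the thread:

```scala
// build.sbt: keep a single Scala binary version everywhere.
// The %% operator appends the Scala suffix (_2.11 here) automatically,
// so Spark and the Kafka integration cannot drift apart.
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming"       % "1.6.1",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"
)
```

Mixing, say, a `_2.10` Spark build with a `_2.11` scala-library on the same classpath is a classic source of NoSuchMethodError at runtime.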

Spark and Kafka direct approach problem

2016-05-04 Thread Luca Ferrari
Hi,

I'm new to Apache Spark and I'm trying to run the Spark Streaming + Kafka Integration Direct Approach example (JavaDirectKafkaWordCount.java). I've downloaded all the libraries, but when I try to run it I get this error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.Arr

RE: Spark and Kafka Integration

2015-12-07 Thread Singh, Abhijeet
For Q2: the order of the logs within each partition is guaranteed, but there is no such thing as a global order across partitions.

From: Prashant Bhardwaj [mailto:prashant2006s...@gmail.com]
Sent: Monday, December 07, 2015 5:46 PM
To: user@spark.apache.org
Subject: Spark and Kafka Integration

Hi Some
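The per-partition guarantee can be illustrated with a plain-Java sketch of keyed partitioning. This is a simplified stand-in for Kafka's partitioner (which actually hashes keys with murmur2); the class name and record values here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

public class PartitionOrder {
    // Simplified keyed partitioner: records with the same key always land
    // in the same partition, so their relative order is preserved there.
    static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode() % numPartitions);
    }

    // Route (key, value) records into per-partition lists, in send order.
    static List<List<String>> route(String[][] records, int numPartitions) {
        List<List<String>> partitions = new ArrayList<>();
        for (int i = 0; i < numPartitions; i++) partitions.add(new ArrayList<>());
        for (String[] r : records) {
            partitions.get(partitionFor(r[0], numPartitions)).add(r[1]);
        }
        return partitions;
    }

    public static void main(String[] args) {
        String[][] records = {
            {"user-1", "login"}, {"user-2", "login"},
            {"user-1", "click"}, {"user-1", "logout"}
        };
        List<List<String>> partitions = route(records, 3);
        // All of user-1's events sit in one partition, in the order sent;
        // nothing relates user-1's events to user-2's across partitions.
        System.out.println(partitions.get(partitionFor("user-1", 3)));
    }
}
```

Consumers reading a single partition therefore see its records in append order, which is exactly the guarantee the answer above describes.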

Spark and Kafka Integration

2015-12-07 Thread Prashant Bhardwaj
Hi,

Some background: we have a Kafka cluster with ~45 topics. Some of the topics contain logs in JSON format and some in PSV (pipe-separated value) format. Now I want to consume these logs using Spark Streaming and store them in Parquet format in HDFS. My questions are:

1. Can we create a InputDStre

Re: why spark and kafka always crash

2015-09-14 Thread Akhil Das
Can you be more precise?

Thanks
Best Regards

On Tue, Sep 15, 2015 at 11:28 AM, Joanne Contact wrote:
> How to prevent it?

why spark and kafka always crash

2015-09-14 Thread Joanne Contact
How to prevent it?

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org

Re: Spark and Kafka

2014-11-06 Thread Eduardo Costa Alfaia
This is my window:

reduceByKeyAndWindow(
    new Function2<Integer, Integer, Integer>() {
      @Override
      public Integer call(Integer i1, Integer i2) {
        return i1 + i2;
      }
    },
    new Function2<Integer, Integer, Integer>() {
      public Integer call(Integer i1, Integer i2) {
        return i1 - i2;
      }
    },
    new Duration(60
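The two anonymous functions above follow Spark Streaming's invertible-reduce pattern: as the window slides, the new total is the old total plus the counts entering the window, minus (via the inverse function) the counts leaving it. A minimal plain-Java sketch of that arithmetic, independent of Spark; the class and method names are hypothetical:

```java
public class WindowedCount {
    // Forward reduce, as in the first Function2 above.
    static int add(int i1, int i2) { return i1 + i2; }

    // Inverse reduce, as in the second Function2 above.
    static int subtract(int i1, int i2) { return i1 - i2; }

    // Slide the window one batch: remove what left, add what entered.
    static int slide(int previousWindowSum, int entering, int leaving) {
        return add(subtract(previousWindowSum, leaving), entering);
    }

    public static void main(String[] args) {
        // Per-batch counts for one word; window = 2 batches, slide = 1 batch.
        int[] batches = {4, 7, 2, 9};
        int window = add(batches[0], batches[1]);        // 4 + 7 = 11
        window = slide(window, batches[2], batches[0]);  // 11 - 4 + 2 = 9
        window = slide(window, batches[3], batches[1]);  // 9 - 7 + 9 = 11
        System.out.println(window);                      // prints 11
    }
}
```

The point of supplying the inverse function is efficiency: Spark updates the running total incrementally instead of re-reducing every batch still inside the window.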

Spark and Kafka

2014-11-06 Thread Eduardo Costa Alfaia
Hi guys,

I am doing some tests with Spark Streaming and Kafka, but I have seen something strange. I have modified JavaKafkaWordCount to use reduceByKeyAndWindow and to print on the screen the accumulated counts of the words. In the beginning Spark works very well: in each interaction the nu