Re: Spark and KafkaUtils

2016-03-15 Thread Vinti Maheshwari
…assembly currently. I need to check how to use sbt assembly. Regards, ~Vinti. On Wed, Feb 24, 2016 at 11:10 AM, Cody Koeninger <c...@koeninger.org> wrote: …
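The sbt-assembly setup being discussed can be sketched as follows (a sketch; the plugin version shown is illustrative, not taken from the thread). The plugin is registered in `project/plugins.sbt`, after which `sbt assembly` produces a single fat jar that can be passed to `spark-submit`:

```scala
// project/plugins.sbt -- plugin version is an assumption; use a current release
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
```

Then `sbt assembly` bundles the application and its non-"provided" dependencies into one jar, so individual dependency jars no longer have to be listed on the spark-submit line.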

Re: Spark and KafkaUtils

2016-02-24 Thread Cody Koeninger
…line, which is a pain. On Wed, Feb 24, 2016 at 12:49 PM, Vinti Maheshwari <vinti.u...@gmail.com> wrote: Hi Cody, I tried with the build file y…

Re: Spark and KafkaUtils

2016-02-24 Thread Vinti Maheshwari
…with your code. Otherwise you'd have to specify each separate jar in your spark-submit line, which is a pain. On Wed, Feb 24, 2016 at 12:49 PM, Vinti Maheshwari <vinti.u...@gmail.com> wrote: …

Re: Spark and KafkaUtils

2016-02-24 Thread Vinti Maheshwari
…specify each separate jar in your spark-submit line, which is a pain. On Wed, Feb 24, 2016 at 12:49 PM, Vinti Maheshwari <vinti.u...@gmail.com> wrote: Hi Cody, I tried with the…

Re: Spark and KafkaUtils

2016-02-24 Thread Cody Koeninger
…I tried with the build file you provided, but it's not working for me; getting the same error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$. I am n…

Re: Spark and KafkaUtils

2016-02-24 Thread Vinti Maheshwari
…you provided, but it's not working for me; getting the same error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$. I am not getting this error while building (sbt package). I am getting…

Re: Spark and KafkaUtils

2016-02-24 Thread Cody Koeninger
…wrote: Hi Cody, I tried with the build file you provided, but it's not working for me; getting the same error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$. I am not getting this error whi…

Re: Spark and KafkaUtils

2016-02-24 Thread Vinti Maheshwari
Hi Cody, I tried with the build file you provided, but it's not working for me; getting the same error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$. I am not getting this error while building (sbt package). I am getting this error…

Re: Spark and KafkaUtils

2016-02-24 Thread Cody Koeninger
…java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$. build.sbt: libraryDependencies += "org.apache.hbase" % "hbase" % "0.92.1"; libraryDependencies += "org.apache.hadoop" % "hadoop-core" % "1…

Spark and KafkaUtils

2016-02-24 Thread Vinti Maheshwari
Hello, I have tried multiple different settings in build.sbt, but it seems like nothing is working. Can anyone suggest the right syntax/way to include Kafka with Spark? Error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
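The usual cause of this NoClassDefFoundError is that `spark-streaming-kafka` is declared "provided" (or missing entirely), so `sbt package` succeeds but the class is absent at runtime. A minimal build.sbt sketch, assuming Spark 1.6.x on Scala 2.10 (versions are assumptions; match them to your cluster):

```scala
// build.sbt -- versions are illustrative; align with your Spark deployment
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.0" % "provided",
  // NOT "provided": the cluster does not ship this jar, so it must be
  // bundled into the assembly or KafkaUtils$ is missing at runtime
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
)
```

Core and streaming stay "provided" because the cluster supplies them; only the Kafka integration jar needs to travel with the application.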

Spark Streaming KafkaUtils Issue

2014-10-10 Thread Abraham Jacob
Hi Folks, I am seeing some strange behavior when using the Spark Kafka connector in Spark streaming. I have a Kafka topic which has 8 partitions. I have a Kafka producer that pumps some messages into this topic. On the consumer side I have a Spark streaming application that has 8 executors…

Re: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Sean McNamara
Would you mind sharing the code leading to your createStream? Are you also setting group.id? Thanks, Sean. On Oct 10, 2014, at 4:31 PM, Abraham Jacob <abe.jac...@gmail.com> wrote: Hi Folks, I am seeing some strange behavior when using the Spark Kafka connector in Spark streaming. I…

Re: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Abraham Jacob
Sure... I do set the group.id for all the consumers to be the same. Here is the code: SparkConf sparkConf = new SparkConf().setMaster("yarn-cluster").setAppName("Streaming WordCount"); sparkConf.set("spark.shuffle.manager", "SORT"); sparkConf.set("spark.streaming.unpersist", "true");
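The consumer configuration being discussed can be sketched as a plain parameter map (a sketch only; the ZooKeeper address, group name, and topic in the comments are hypothetical, not taken from the thread). With the receiver-based `KafkaUtils.createStream` API, all receivers that share one `group.id` divide the topic's partitions among themselves:

```java
import java.util.HashMap;
import java.util.Map;

public class KafkaParamsSketch {
    public static void main(String[] args) {
        // Hypothetical values; replace with your cluster's addresses.
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("zookeeper.connect", "zkhost:2181");
        kafkaParams.put("group.id", "wordcount-group");       // same for all 8 receivers
        kafkaParams.put("auto.offset.reset", "smallest");     // start from earliest offset
        kafkaParams.put("auto.commit.interval.ms", "60000");  // high-level consumer default

        // Each of the 8 receivers would be created with this same map, e.g.:
        //   KafkaUtils.createStream(jssc, String.class, String.class,
        //       StringDecoder.class, StringDecoder.class, kafkaParams,
        //       topicMap, StorageLevel.MEMORY_AND_DISK_SER());
        System.out.println(kafkaParams.get("group.id"));
    }
}
```

Because the group id is shared, Kafka's high-level consumer assigns each of the topic's 8 partitions to a different receiver rather than delivering every message 8 times.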

Re: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Sean McNamara
How long do you let the consumers run for? Is it less than 60 seconds by chance? auto.commit.interval.ms defaults to 60000 (60 seconds). If so, that may explain why you are seeing that behavior. Cheers, Sean. On Oct 10, 2014, at 4:47 PM, Abraham Jacob…
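The timing effect Sean describes can be sketched with plain arithmetic (a sketch; no Kafka involved, and the 45-second runtime is an invented example). Offsets are auto-committed only once per interval, so a consumer stopped before the first interval elapses never commits and re-reads the same messages on restart:

```java
public class CommitIntervalSketch {
    public static void main(String[] args) {
        long commitIntervalMs = 60_000; // auto.commit.interval.ms default
        long runtimeMs = 45_000;        // consumer stopped before the first commit fired

        // How many auto-commits completed during the run
        long commits = runtimeMs / commitIntervalMs;
        System.out.println("commits fired: " + commits);

        // With zero commits, the stored offset never advances, so a restarted
        // consumer in the same group re-reads everything it already processed.
        boolean willReread = (commits == 0);
        System.out.println("will re-read on restart: " + willReread);
    }
}
```

Running past the first interval (or committing offsets manually on shutdown) avoids the apparent duplication.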

RE: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Shao, Saisai
…https://issues.apache.org/jira/browse/SPARK-2492). Thanks, Jerry. From: Abraham Jacob [abe.jac...@gmail.com] Sent: Saturday, October 11, 2014 6:57 AM To: Sean McNamara Cc: user@spark.apache.org Subject: Re: Spark Streaming KafkaUtils Issue. Probably this is the issue: http://www.michael-noll.com/blog/2014/10/01…

Re: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Abraham Jacob
…Jerry. From: Abraham Jacob [abe.jac...@gmail.com] Sent: Saturday, October 11, 2014 6:57 AM To: Sean McNamara Cc: user@spark.apache.org Subject: Re: Spark Streaming KafkaUtils Issue. Probably this is the issue: http://www.michael-noll.com/blog/2014/10/01/kafka-spark…

Re: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Sean McNamara
…it (https://issues.apache.org/jira/browse/SPARK-2492). Thanks, Jerry. From: Abraham Jacob [abe.jac...@gmail.com] Sent: Saturday, October 11, 2014 6:57 AM To: Sean McNamara Cc: user@spark.apache.org Subject: Re: Spark Streaming KafkaUtils Issue…

RE: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Shao, Saisai
…https://issues.apache.org/jira/browse/SPARK-2492). Thanks, Jerry. From: Abraham Jacob [abe.jac...@gmail.com] Sent: Saturday, October 11, 2014 6:57 AM To: Sean McNamara Cc: user@spark.apache.org Subject: Re: Spark Streaming KafkaUtils Issue. Probably this is the issue…

Re: Spark Streaming KafkaUtils Issue

2014-10-10 Thread Abraham Jacob
…and recompile Spark. Thanks, Jerry. From: Abraham Jacob [abe.jac...@gmail.com] Sent: Saturday, October 11, 2014 8:49 AM To: Shao, Saisai Cc: user@spark.apache.org; Sean McNamara Subject: Re: Spark Streaming KafkaUtils Issue. Thanks Jerry, So, from what I can…